CN117707670A - Data processing method and device for application program, storage medium and electronic device - Google Patents
Info
- Publication number: CN117707670A
- Application number: CN202311699969.8A
- Authority
- CN
- China
- Prior art keywords
- target
- image data
- initial
- interaction event
- application program
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/46—Multiprogramming arrangements
- G06F9/48—Program initiating; Program switching, e.g. by interrupt
- G06F9/4806—Task transfer initiation or dispatching
- G06F9/4843—Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses a data processing method and device for an application program, a storage medium and an electronic device, applied to a target component of the application program, wherein the target component is composed of multiple processes. The method comprises the following steps: detecting a target interaction event by using a first process of the multiple processes, wherein the target interaction event is converted from an initial interaction event acting on a graphical user interface; responding to the target interaction event by using a second process of the multiple processes, and performing an image rendering operation on initial image data of the application program to obtain target image data; and sending, by the second process, the target image data to the graphical user interface through the first process, so that a target base class updates the initial display content of the application program on the graphical user interface to target display content corresponding to the target image data. The method solves the technical problem that data processing of an application program cannot be performed effectively.
Description
Technical Field
The present disclosure relates to the field of data processing, and in particular, to a data processing method and apparatus for an application program, a storage medium, and an electronic apparatus.
Background
Currently, the cross-platform application development framework Qt provides both a traditional interface development technology (QWidget) and a mainstream interface development technology (Qt Quick): Qt QWidget is a framework for creating traditional user interfaces, while Qt Quick is the mainstream development framework. Since Qt QWidget and Qt Quick each have advantages, compatibility between the two deserves attention.
In the related art, a Quick interface can be embedded in a QWidget layout by creating a Qt QWidget application and using the adapter classes provided by Qt. However, this approach only introduces Quick technology into part of the graphical user interface (UI) development; it is not essentially a Qt Quick project, and it limits the flexibility of Qt Quick in UI layout, so the technical problem remains that data processing of an application program cannot be performed effectively.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
At least some embodiments of the present disclosure provide a data processing method, apparatus, storage medium and electronic device for an application program, so as to at least solve a technical problem that data processing of the application program cannot be effectively performed.
According to one embodiment of the present disclosure, a data processing method for an application program is provided, applied to a target component of the application program, where the target component is composed of multiple processes. The method may include: detecting a target interaction event by using a first process of the multiple processes, wherein the target interaction event is converted from an initial interaction event acting on a graphical user interface; responding to the target interaction event by using a second process of the multiple processes, and performing an image rendering operation on initial image data of the application program to obtain target image data; and sending, by the second process, the target image data to the graphical user interface through the first process, so as to update the initial display content of the application program on the graphical user interface to target display content corresponding to the target image data by using a target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated from the initial image data.
According to one embodiment of the present disclosure, another data processing method for an application program is further provided, in which a graphical user interface is provided through a terminal device, the graphical user interface being created based on a target base class. The method may include: acquiring an initial interaction event acting on the graphical user interface; converting the initial interaction event into a target interaction event matching a target component of the application program, wherein the target component is composed of multiple processes, the target interaction event is detected by a first process of the multiple processes, and an image rendering operation is performed on initial image data of the application program by a second process of the multiple processes to obtain target image data; and updating the initial display content of the application program on the graphical user interface to target display content corresponding to the target image data by using the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated from the initial image data.
According to one embodiment of the present disclosure, there is further provided a data processing apparatus for an application program, applied to a target component of the application program, the target component being composed of multiple processes. The apparatus may include: a detection unit for detecting a target interaction event by using a first process of the multiple processes, wherein the target interaction event is obtained by converting an initial interaction event acting on a graphical user interface; a rendering unit for responding to the target interaction event by using a second process of the multiple processes and performing an image rendering operation on initial image data of the application program to obtain target image data; and a sending unit for sending, by the second process, the target image data to the graphical user interface through the first process, so as to update the initial display content of the application program on the graphical user interface to target display content corresponding to the target image data by using a target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated from the initial image data.
According to one embodiment of the disclosure, there is further provided a data processing apparatus for another application program, where a graphical user interface is provided by a terminal device, the graphical user interface being created based on a target base class. The apparatus may include: an acquisition unit for acquiring an initial interaction event acting on the graphical user interface; the conversion unit is used for converting the initial interaction event into a target interaction event matched with a target component of the application program, wherein the target component is formed by multiple processes, the target interaction event is detected by a first process in the multiple processes, and an image rendering operation is performed on the initial image data of the application program by a second process in the multiple processes to obtain target image data; and the updating unit is used for updating the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data.
According to one embodiment of the present disclosure, there is also provided a computer-readable storage medium. The computer-readable storage medium stores a computer program, wherein the computer program is arranged to execute, when run, the data processing method of the application program of any one of the above.
According to one embodiment of the present disclosure, there is also provided an electronic device. The electronic device may comprise a memory in which a computer program is stored and a processor arranged to run the computer program to perform the data processing method of the application program of any of the above.
In at least some embodiments of the present disclosure, a target interaction event is detected using a first process of a plurality of processes, wherein the target interaction event is converted from an initial interaction event acting on a graphical user interface; responding to the target interaction event by utilizing a second process in the multiple processes, and executing image rendering operation on the initial image data of the application program to obtain target image data; and sending the target image data to the graphical user interface through the first process by utilizing the second process so as to update the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data. That is, the present disclosure detects a target interaction event by using a first process in multiple processes, after detecting the target interaction event, performs an image rendering operation on initial image data of an application program by using a second process in multiple processes to obtain target image data, and after obtaining the target image data, updates initial display content of the application program on a graphical user interface to target display content corresponding to the target image data by using a target base class, thereby realizing a technical effect of effectively performing data processing of the application program, and solving a technical problem that data processing of the application program cannot be effectively performed.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure and do not constitute an undue limitation on the disclosure. In the drawings:
fig. 1 is a block diagram of a hardware structure of a mobile terminal of a data processing method of an application program according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of a method of data processing for an application according to one embodiment of the present disclosure;
FIG. 3 is a flow chart of a method of data processing for another application according to one embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a layout of a Qt Quick in a Qt QWIdget application according to one embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a Qt Quick application layout according to one embodiment of the present disclosure;
FIG. 6 is a flow chart of a simulated browser feature according to one embodiment of the present disclosure;
FIG. 7 is a schematic diagram of relationships between processes and modules according to one embodiment of the present disclosure;
FIG. 8 is a schematic diagram of cross-process communication and interface invocation in accordance with one embodiment of the present disclosure;
FIG. 9 is a block diagram of a data processing apparatus of an application according to one embodiment of the present disclosure;
FIG. 10 is a block diagram of a data processing apparatus of another application according to one embodiment of the present disclosure;
fig. 11 is a schematic diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
In order that those skilled in the art will better understand the present disclosure, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present disclosure, not all of them. All other embodiments obtained by one of ordinary skill in the art without inventive effort, based on the embodiments in this disclosure, shall fall within the scope of the present disclosure.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the disclosure described herein may be capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In one possible implementation, in the field of data processing, considering the technical accumulation of QWidget and the development efficiency and interface effects brought by Qt Quick, if both Qt QWidget and Qt Quick technologies need to be supported, a Quick interface can be conveniently embedded in a QWidget application through an adapter class (such as QQuickWidget or QQuickView).
However, after careful practice and study, the inventors found that if the above method is adopted to achieve compatibility with both Qt QWidget and Qt Quick, the result is still essentially a Qt QWidget application that only uses Qt Quick technology for local UI requirements, and Qt QWidget and Qt Quick are only unidirectionally compatible in terms of UI layout. Therefore, for a Qt QWidget application, the flexibility of Qt Quick in UI layout is limited, so there is a technical problem that data processing of the application cannot be performed effectively.
Based on the foregoing, the embodiments of the present disclosure provide a data processing method for an application, which is a componentization scheme for the Chromium Embedded Framework (CEF) based on Qt Quick technology, that is, a target component (CEF3) implemented on top of a target base class (QuickItem). By means of the plug-in mechanism of the QuickItem, the component's behavior is kept highly consistent with that of a generic Quick control, so that the various capabilities provided by Qt Quick can be utilized to the fullest.
The related art introduces Quick technology into only part of the UI development, which causes the technical problem that data processing of the application program cannot be performed effectively. In contrast, the embodiments of the present disclosure integrate the functionality of CEF3 into a service module through a custom visual control based on the QuickItem, without the service module having to pay attention to any technical details of CEF3, thereby achieving the technical effect of effectively performing data processing of an application program and solving the above problem.
According to one embodiment of the present disclosure, there is provided an embodiment of a data processing method of an application program, it should be noted that the steps illustrated in the flowcharts of the drawings may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order different from that herein.
The method embodiments may be performed in a mobile terminal, a computer terminal, or similar computing device. Taking the example of running on a mobile terminal, the mobile terminal can be a terminal device such as a smart phone (such as an Android mobile phone, an iOS mobile phone, etc.), a tablet computer, a palm computer, a mobile internet device (Mobile Internet Devices, abbreviated as MID), a PAD, a game console, etc. Fig. 1 is a block diagram of a hardware configuration of a mobile terminal of a data processing method of an application program according to an embodiment of the present disclosure. As shown in fig. 1, a mobile terminal may include one or more (only one is shown in fig. 1) processors 102 (the processors 102 may include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processor (GPU), a Digital Signal Processing (DSP) chip, a Microprocessor (MCU), a programmable logic device (FPGA), a neural Network Processor (NPU), a Tensor Processor (TPU), an Artificial Intelligence (AI) type processor, etc.) and a memory 104 for storing data. Optionally, the mobile terminal may further include a transmission device 106, an input-output device 108, and a display device 110 for communication functions. It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and not limiting of the structure of the mobile terminal described above. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
The memory 104 may be used to store computer programs, such as software programs and modules of application software, such as computer programs corresponding to the data processing methods of the application programs in the embodiments of the present disclosure, and the processor 102 executes the computer programs stored in the memory 104, thereby performing various functional applications and data processing, that is, implementing the data processing methods of the application programs described above. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remotely located relative to the processor 102, which may be connected to the mobile terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is configured to communicate with the internet wirelessly.
The input in the input output device 108 may come from a plurality of human interface devices (Human Interface Device, abbreviated as HIDs). For example: keyboard and mouse, gamepad, other special game controllers (e.g., steering wheel, fishing pole, dance mat, remote control, etc.). Part of the ergonomic interface device may provide output functions in addition to input functions, such as: force feedback and vibration of the gamepad, audio output of the controller, etc.
The display device 110 may be, for example, a head-up display (HUD), a touch-screen liquid crystal display (LCD), or a touch display (also referred to as a "touch screen"). The liquid crystal display may enable a user to interact with a user interface of the mobile terminal. In some embodiments, the mobile terminal has a graphical user interface (GUI), and the user may interact with the GUI through finger contacts and/or gestures on the touch-sensitive surface. The human-computer interaction functions optionally include interactions such as creating web pages, drawing, word processing, producing electronic documents, games, video conferencing, instant messaging, sending and receiving e-mail, call interfaces, playing digital video, playing digital music, and/or web browsing; executable instructions for performing these human-computer interaction functions are configured/stored in a computer program product or readable storage medium executable by one or more processors.
In one possible implementation manner, the embodiment of the present disclosure provides a data processing method of an application program, applied to a target component of the application program, where the target component is formed by multiple processes, and fig. 2 is a flowchart of a data processing method of an application program according to one embodiment of the present disclosure, as shown in fig. 2, where the method may include the following steps:
in step S202, a target interaction event is detected by using a first process of the multiple processes, wherein the target interaction event is converted from an initial interaction event acting on the graphical user interface.
In the technical solution provided in step S202 of the present disclosure, the target interaction event may be detected by using the first process of the multiple processes. The multiple processes may be processes that run simultaneously in the application program and are independent of each other, and may include at least a browser process (Browser Process), a rendering process (Render Process), and other processes. The first process may be the process to which the browser belongs, i.e., the browser process, which shares the same process with the application program. The target interaction event may be converted from an initial interaction event acting on a graphical user interface. A graphical user interface (UI) may be an interface that enables a user to interact with an application through graphical elements and controls. The initial interaction event may be a Qt Quick event acting on the GUI, for example an input event or a mouse event; this is merely illustrative, and the type of the initial interaction event is not specifically limited.
In this embodiment, an initial interaction event acting on the graphical user interface may be obtained. After the initial interactivity event is acquired, the acquired initial interactivity event may be converted into a target interactivity event. After converting the initial interactivity event to the target interactivity event, the target interactivity event may be detected using a first process of the multiple processes.
Optionally, the embodiment obtains an initial interaction event, namely a Qt Quick event, acting on the UI interface. After the Qt Quick event is acquired, the Qt Quick event can be converted to obtain a target interaction event, and further the target interaction event can be detected by utilizing a browser process in multiple processes.
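The conversion-then-detection flow described above can be sketched in plain C++. The event structures, field names, and the queue standing in for the browser process below are all hypothetical simplifications for illustration — the patent does not publish the actual Qt Quick or CEF3 type definitions:

```cpp
#include <queue>

// Hypothetical stand-ins for the Qt Quick and CEF3 event types.
struct QuickMouseEvent { int x; int y; int button; };   // initial interaction event
struct CefMouseEvent   { int x; int y; int modifiers; }; // target interaction event

// Conversion step: the initial Qt Quick event is translated into the
// component-specific (CEF3-style) format before the first process sees it.
CefMouseEvent toTargetEvent(const QuickMouseEvent& e) {
    CefMouseEvent t;
    t.x = e.x;
    t.y = e.y;
    t.modifiers = e.button;  // simplified mapping of button state to modifiers
    return t;
}

// "First process" stand-in: a queue from which the browser process
// detects converted (target) interaction events.
std::queue<CefMouseEvent> browserProcessQueue;

void dispatchToFirstProcess(const QuickMouseEvent& e) {
    browserProcessQueue.push(toTargetEvent(e));
}
```

In the real scheme the browser process runs separately from the UI, so the hand-off would cross a process boundary rather than a simple in-process queue.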
Step S204, responding to the target interaction event by utilizing a second process in the multiple processes, and executing image rendering operation on the initial image data of the application program to obtain target image data.
In the technical solution provided in step S204 of the present disclosure, after the target interaction event is detected by using the first process in the multiple processes, the image rendering operation may be performed on the initial image data of the application program by using the second process in the multiple processes to obtain the target image data in response to the detected target interaction event. Wherein the second process may be a rendering process in a multi-process. The application may be a Qt Quick application. The initial image data may be a picture frame before the application redraws. The target image data may be a picture frame after the application redrawn.
It should be noted that, the initial image data is rendered by the second process. Assuming that the image data corresponding to the current time point is selected as the initial image data, the initial image data may be considered as rendered according to a historical interaction event (an event before the current time point), for example, a uniform resource locator (Uniform Resource Locator, abbreviated as URL) address, a browse page content, and the like are loaded.
In this embodiment, after detecting the target interaction event using the browser process in the multiprocess, an image rendering operation may be performed on the initial image data of the Qt Quick application using the rendering process in the multiprocess in response to the detected target interaction event to obtain target image data.
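The rendering step can be sketched minimally as below, with the frame representation and the "rendering" itself reduced to a trivial per-pixel transform. All names are assumptions for illustration; in the real scheme the CEF3 render process rasterizes page content off-screen in response to the event:

```cpp
#include <vector>
#include <cstdint>

// Hypothetical frame representation: the patent only says the render process
// turns "initial image data" (the frame before redraw) into "target image data".
using Frame = std::vector<std::uint8_t>;

// "Second process" stand-in: responds to a target interaction event by
// re-rendering the frame. Here eventDelta abstracts whatever change the
// event causes on screen.
Frame renderTargetFrame(const Frame& initial, std::uint8_t eventDelta) {
    Frame target = initial;  // start from the pre-redraw (initial) frame
    for (auto& px : target) px = static_cast<std::uint8_t>(px + eventDelta);
    return target;
}
```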
Step S206, sending, by the second process, the target image data to the graphical user interface through the first process, so as to update the initial display content of the application program on the graphical user interface to the target display content corresponding to the target image data by using the target base class.
In the technical solution provided in step S206 of the present disclosure, after the second process is utilized to respond to the target interaction event, the image rendering operation is performed on the initial image data of the application program, that is, the initial image data is updated as required, so as to obtain the target image data, the second process may be utilized to send the target image data to the graphical user interface through the first process, so that the initial display content of the application program on the graphical user interface is updated to the target display content corresponding to the target image data by utilizing the target base class. The target base class (QuickItem) may be a base class of a visual control, may be used for creating a graphical user interface, may provide capabilities such as interface rendering, interaction, event processing, and the like, and the QuickItem may also provide an initial loading page, that is, an initial screen displayed before the second process rendering data is obtained. The initial display content may be generated from the initial image data. The target display content may be generated from target image data.
In this embodiment, after the target image data is obtained, the rendering process may be used to send the target image data to the UI interface through the browser process, so as to update the initial display content of the Qt Quick application on the UI interface by using the target base class QuickItem to the target display content corresponding to the target image data.
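The hand-off in step S206 can be sketched as follows. All type and member names are hypothetical stand-ins: the render process hands the target frame to the browser process, which asks the QuickItem-style base class to replace the displayed content:

```cpp
#include <vector>
#include <cstdint>

using Frame = std::vector<std::uint8_t>;

// Stand-in for the target base class (QuickItem): holds the display content
// and swaps the initial content for the target content.
struct TargetBaseClass {
    Frame displayed;  // current display content on the GUI
    void updateDisplay(const Frame& f) { displayed = f; }
};

// First (browser) process: owns the GUI-facing item and forwards frames to it.
struct FirstProcess {
    TargetBaseClass* item;
    void forwardToGui(const Frame& f) { item->updateDisplay(f); }
};

// Second (render) process: produces target frames and sends them
// to the GUI through the first process.
struct SecondProcess {
    FirstProcess* browser;
    void sendTargetFrame(const Frame& f) { browser->forwardToGui(f); }
};
```

In the actual architecture these would be separate OS processes communicating over IPC; the in-process pointers here only illustrate the direction of the data flow.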
Through steps S202 to S206 described above, the present disclosure detects a target interaction event using a first process of a plurality of processes, wherein the target interaction event is converted from an initial interaction event acting on a graphical user interface; responding to the target interaction event by utilizing a second process in the multiple processes, and executing image rendering operation on the initial image data of the application program to obtain target image data; and sending the target image data to the graphical user interface through the first process by utilizing the second process so as to update the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data. That is, the present disclosure detects a target interaction event by using a first process in multiple processes, after detecting the target interaction event, performs an image rendering operation on initial image data of an application program by using a second process in multiple processes to obtain target image data, and after obtaining the target image data, updates initial display content of the application program on a graphical user interface to target display content corresponding to the target image data by using a target base class, thereby realizing a technical effect of effectively performing data processing of the application program, and solving a technical problem that data processing of the application program cannot be effectively performed.
The above-described methods of embodiments of the present disclosure are further described below.
As an optional embodiment, step S202, detecting the target interaction event with the first process of the multiple processes includes: detecting a target format of the converted initial interaction event by using a first process; and determining the converted initial interaction event as a target interaction event according to the target format which is the format matched with the target component.
In this embodiment, the target format of the converted initial interaction event may be detected using a first process of the multiple processes. When the first process detects that the target format of the initial interaction event is the format matched with the target component, the converted initial interaction event can be determined to be the target interaction event in response to the target format being the format matched with the target component. The target component (CEF3) may be the next-generation, multi-process Chromium Embedded Framework, which is based on the application programming interface (Application Programming Interface, abbreviated as API) of the browser application. The target format may be the format required by the target component CEF3 to correctly handle the interaction event. The target interaction event may be referred to as a CEF3 event.
Optionally, this embodiment may perform a specific format conversion on the Qt Quick event upon detecting that the user triggers the initial interaction event, i.e. upon detecting the Qt Quick event. After format conversion is performed on the Qt Quick event, a browser process can be utilized to detect the target format of the Qt Quick event after format conversion. When the target format of the Qt Quick event after format conversion is matched with the format of the target component CEF3, the Qt Quick event after format conversion can be determined as a target interaction event, namely a CEF3 event.
It should be noted that this embodiment may capture the Qt Quick event triggered by the user, and after capturing the Qt Quick event, may parse the Qt Quick event parameters. Because Qt Quick events are of different types, the parsed Qt Quick event parameters vary accordingly; for example, for a mouse movement event (mouseMoveEvent), the parsed event parameters may be position parameters, button states, and the like. After parsing the Qt Quick event parameters, the CEF3 event can be executed and the parsed Qt Quick event parameters passed in, thereby implementing the specific format conversion operation on the Qt Quick event.
As an optional embodiment, step S204, performing an image rendering operation on the initial image data of the application program to obtain target image data by using a second process of the multiple processes to respond to the target interaction event, includes: determining a data range to be updated in the initial image data by utilizing a second process to respond to the target interaction event; and executing image rendering operation on the image data in the data range in the initial image data by using the second process to obtain target image data.
In this embodiment, after the target interaction event is detected using a first process of the multiple processes, a range of data to be updated in the initial image data may be determined using a second process of the multiple processes in response to the detected target interaction event. After determining the data range to be updated in the initial image data, image rendering operation may be performed on the image data within the data range in the initial image data by using the second process, to obtain target image data. The data range may be an update range of single frame data, and the data range may include at least full-frame update and partial update.
The response may include a visual effect (screen) and a non-visual effect (state). Since the processing of state data belongs to the basic operations of the Webview, this embodiment focuses on the processing of the picture (picture frame).
It should be noted that, in order to prevent multiple threads from accessing and modifying data simultaneously, and to ensure the integrity and consistency of the data, a locking operation needs to be performed when a single frame of data is updated in full or in part. After performing the locking operation, the full-frame update may update the frame buffer using the create-image-copy method (QImage::copy()). The partial update may update the designated area of the frame buffer using a drawing function (QPainter::drawImage()). After the single frame of data is updated, a lock-release operation can be performed, allowing other threads to access the data, avoiding thread blocking and performance loss. Here, QImage::copy() can be used to create a copy of the current image, returning a new object that contains the same pixel data as the original image, and QPainter::drawImage() can be used to draw an image at a specified location.
Alternatively, after detecting the CEF3 event by using the browser process, this embodiment may determine, by using the rendering process in response to the detected CEF3 event, whether the data range to be updated in the initial image data is a full-frame update or a partial update. When the data range to be updated is determined to be a full-frame update, an image rendering operation can be performed on the full frame of the initial image data by using the rendering process to obtain the target image data. When the data range to be updated is determined to be a partial update, the image rendering operation can be performed on the image data within the partial region of the initial image data by using the rendering process to obtain the target image data.
As an optional embodiment, step S204, performing an image rendering operation on the initial image data of the application program to obtain target image data by using a second process of the multiple processes to respond to the target interaction event, includes: and responding to the target interaction event by utilizing the second process, and executing image rendering operation in the off-screen rendering mode on the initial image data to obtain target image data.
In this embodiment, after the target interaction event is detected by using the first process in the multiprocess, an image rendering operation in the off-screen rendering mode may be performed on the initial image data in response to the detected target interaction event by using the second process in the multiprocess, to obtain the target image data. Here, the Off-Screen Rendering (abbreviated as OSR) mode may be a process of performing graphics rendering off screen, which may be used to create complex graphic effects or to process a large amount of graphic data, so as to mitigate the impact on the rendering performance of the main screen.
Alternatively, the embodiment may perform an image rendering operation in OSR mode on the initial image data to obtain the target image data using a rendering process in response to the detected CEF3 event after detecting the CEF3 event using the browser process.
The following further describes the image rendering operation performed on the initial image data in the off-screen rendering mode by using the second process to respond to the target interaction event to obtain the target image data.
As an alternative embodiment, performing, with the second process in response to the target interaction event, an image rendering operation in an off-screen rendering mode on the initial image data to obtain target image data, including: responding to the target interaction event by using a second process, and drawing a target viewport in an off-screen rendering mode; and executing image rendering operation on the initial image data in the target viewport by utilizing the second process to obtain target image data.
In this embodiment, after the target interaction event is detected using a first process of the multiple processes, the target viewport may be rendered in an off-screen rendering mode using a second process of the multiple processes in response to the detected target interaction event. After the target viewport is drawn in the off-screen rendering mode, image rendering operations may be performed on the initial image data in the target viewport using a second process to obtain target image data. The target viewport may be a visual frame of a current browser control (Webview), that is, a window drawn in an off-screen rendering mode, and may also be referred to as a windowless window.
It should be noted that the target component CEF3 provides three window rendering modes: sub-window, independent window, and windowless. The independent window does not meet the requirements of nested layout. The sub-window needs to adapt to the size of the current component viewport in real time, and lacks flexibility and fluency. The windowless mode draws the frame in OSR mode, which can provide the original frame of the page but requires additional processing of browser events. In view of rendering efficiency and flexibility, this embodiment adopts the OSR rendering mode.
Alternatively, after detecting the CEF3 event by using the browser process, this embodiment may draw the target viewport, that is, the windowless frame, in OSR mode by using the rendering process in response to the detected CEF3 event. That is, the componentization scheme proposed in this embodiment draws the browser screen in an embedded form. After drawing the windowless frame in the OSR mode, an image rendering operation may be performed on the initial image data in the windowless frame by using the rendering process to obtain the target image data.
As an alternative embodiment, the method further comprises: invoking a visual target instance in the rendering framework by using a first process; determining a target object corresponding to the visual target instance in the first process; and calling a target interface corresponding to the target object in the rendering frame, and transmitting the target interaction event to the second process.
In this embodiment, a first process may be utilized to invoke a visualization target instance in the rendering framework. After invoking the visualization target instance in the rendering framework, a target object corresponding to the visualization target instance in the first process may be determined. After determining the target object in the first process, a target interface in the rendering framework corresponding to the target object may be invoked to transmit the target interaction event to the second process. The visualization target instance may be a visual target component instance based on the Qt Quick technology (QuickCefView), and may also be referred to as a Webview instance. The target object may be a browser object in the first process; the two browser objects corresponding to a WebView window are located in the first process and the second process respectively, which is a capability provided by the rendering framework, so that cross-process business processing can be performed. The target interface may be an interface class provided by the rendering framework, and may include at least CefBrowserHost, CefBrowser, and CefFrame.
It should be noted that CefBrowserHost represents the browser process to which the browser object belongs, and this interface can only be called in the browser process; CefBrowserHost can be used for related operations on a browser window, such as focusing, window scaling, window hiding, keyboard event processing, mouse event processing, and the like. CefBrowser represents a browser object and can provide browser-related functions such as forward, backward, reload, and canceling page loading; a WebView window has two browser objects, which belong to the browser process and the rendering process respectively, but in the rendering process this interface can only be called in the main thread. CefFrame represents a visual window and can provide some operations and information of the current page, such as loading the page and acquiring the page address; CefFrame belongs to a browser object, and its calling rule is the same as that of CefBrowser.
It should be noted that a Webview instance is generated by calling the QuickCefView component, and a corresponding CefBrowser instance is created for each Webview instance. When the above interfaces are divided by function, two other interface classes, CefBrowserHost and CefFrame, are maintained in the CefBrowser, and these two interface classes together with the CefBrowser are responsible for the business-layer operations of the Webview, such as QuickItem event forwarding and user operations. The object, i.e., instance, in this embodiment is an instantiation of the class interface.
Optionally, the target interaction event of the first process in this embodiment is implemented by a basic function provided by the QuickItem module, and the componentization scheme of this embodiment may forward the event of the QuickItem to the second process according to the interface specification of the rendering framework.
As an alternative embodiment, the method further comprises: and determining the process corresponding to the visualization target instance in the multiple processes as a second process.
In this embodiment, each visualization target instance has a rendering process corresponding to it, and the rendering process corresponding to the visualization target instance in the multiple processes may be determined as the second process.
It should be noted that each visualization target instance, that is, each Webview instance, and its corresponding rendering process perform bidirectional communication through inter-process communication (Inter-Process Communication, abbreviated as IPC).
It should be noted that, in the Qt Quick application, there are a plurality of Webview windows provided by the target component CEF 3. Each Webview window has one instance of a client object (CefClient) and one CefBrowser. The componentization scheme of the embodiment can change the corresponding relation between the Webview instance and the rendering process through parameter configuration, and the target component supports that a plurality of Webview instances multiplex a single rendering process.
As an alternative embodiment, the method further comprises: and determining the process to which the application program belongs in the multiple processes as a first process.
In this embodiment, there is a unique browser process in the application of the target component, where the browser process is the process to which the current application belongs, and the browser process to which the application belongs in multiple processes may be determined as the first process.
In an embodiment of the disclosure, detecting a target interaction event by using a first process of multiple processes, wherein the target interaction event is converted from an initial interaction event acting on a graphical user interface; responding to the target interaction event by utilizing a second process in the multiple processes, and executing image rendering operation on the initial image data of the application program to obtain target image data; and sending the target image data to the graphical user interface through the first process by utilizing the second process so as to update the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data. That is, the present disclosure detects a target interaction event by using a first process in multiple processes, after detecting the target interaction event, performs an image rendering operation on initial image data of an application program by using a second process in multiple processes to obtain target image data, and after obtaining the target image data, updates initial display content of the application program on a graphical user interface to target display content corresponding to the target image data by using a target base class, thereby realizing a technical effect of effectively performing data processing of the application program, and solving a technical problem that data processing of the application program cannot be effectively performed.
In a possible implementation manner, the embodiment of the present disclosure further provides a data processing method of another application program from a side of a terminal device, and the terminal device provides a graphical user interface, where the graphical user interface is created based on a target base class, and fig. 3 is a flowchart of the data processing method of another application program according to one embodiment of the present disclosure, as shown in fig. 3, and the method may include the following steps:
step S302, an initial interaction event acting on the graphical user interface is acquired.
In the technical solution provided in step S302 of the present disclosure, an initial interaction event acting on a graphical user interface may be obtained by detecting an initial interaction event triggered by a user.
In this embodiment, after detecting the initial interaction event triggered by the user, that is, after detecting the Qt Quick event, the Qt Quick event acting on the UI interface may be acquired.
Step S304, converting the initial interaction event into a target interaction event matched with a target component of the application program, wherein the target component is composed of multiple processes, the target interaction event is detected by a first process in the multiple processes, and an image rendering operation is performed on the initial image data of the application program by a second process in the multiple processes, so as to obtain target image data.
In the technical solution provided in step S304 of the present disclosure, after the initial interaction event acting on the graphical user interface is acquired, the acquired initial interaction event may be converted into a target interaction event matched with a target component of the application program. Wherein the target component may be composed of multiple processes. The target interaction event may be detected by a first process of the multiprocess and responded to by a second process of the multiprocess, performing an image rendering operation on the initial image data of the application to obtain target image data.
In this embodiment, after the Qt Quick event is obtained, the format of the Qt Quick event may be converted to a target format, which is a format that matches the target component CEF3 of the Qt Quick application. After converting the format of the Qt Quick event to the target format, conversion of the Qt Quick event to a target interaction event, CEF3 event, that matches the target component CEF3 of the Qt Quick application may be accomplished.
Step S306, updating the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing a target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data.
In the technical solution provided in step S306 of the present disclosure, after converting the initial interaction event into a target interaction event matched with a target component of the application program, the initial display content of the application program on the graphical user interface may be updated to a target display content corresponding to the target image data by using the target base class. Wherein the target base class may be used to create a graphical user interface. The initial display content may be generated from the initial image data. The target display content may be generated from target image data.
In this embodiment, after converting the Qt Quick event into a CEF3 event that matches the target component CEF3 of the Qt Quick application, the initial display content of the Qt Quick application on the UI interface may be updated to the target display content corresponding to the target image data using the target base class QuickItem.
Through steps S302 to S306 described above, the present disclosure acquires an initial interaction event acting on a graphical user interface; converting the initial interaction event into a target interaction event matched with a target component of the application program, wherein the target component is formed by multiple processes, the target interaction event is detected by a first process in the multiple processes, and image rendering operation is performed on the initial image data of the application program by a second process in the multiple processes to obtain target image data; and updating the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data. That is, the present disclosure converts the obtained initial interaction event acting on the gui into a target interaction event matched with the target component of the application program, so as to obtain target image data, and further uses the target base class to update the initial display content of the application program on the gui into the target display content corresponding to the target image data, thereby realizing the technical effect of effectively performing data processing on the application program, and solving the technical problem that the data processing on the application program cannot be effectively performed.
The above-described methods of embodiments of the present disclosure are further described below.
As an optional embodiment, step S304, converting the initial interaction event into a target interaction event matched with a target component of the application program, includes: and converting the initial format of the initial interaction event into a format matched with the target component to obtain the target interaction event.
In this embodiment, after the initial interaction event acting on the graphical user interface is obtained, the initial format of the initial interaction event may be converted to a format that matches the target component of the application to obtain the target interaction event. The initial format may be a format before converting the initial interaction event.
Optionally, after acquiring the Qt Quick event, the embodiment may convert the initial format of the Qt Quick event into a target format that matches the target component CEF3 of the Qt Quick application to ensure that the target component CEF3 can process the event correctly.
As an alternative embodiment, the method further comprises: responding to an interface updating operation acted on a graphical user interface, and sending a target request, wherein the target request is used for requesting a target component to acquire target image data; and acquiring target image data returned by the target component through the first process in response to the target request.
In this embodiment, the target request is sent in response to an interface update operation acting on the graphical user interface. After sending the target request, target image data returned by the target component through the first process in response to the target request may be obtained. The interface updating operation may be an interface refreshing operation or an interface redrawing operation, which is only illustrated herein, and the form of the interface updating operation is not specifically limited. The target request may be a request to acquire target image data, and may be used to request acquisition of target image data from a target component.
Optionally, the embodiment transmits a target request to acquire target image data to the target component CEF3 in response to an interface refresh operation or an interface redraw operation acting on the UI interface. After sending the target request to the target component CEF3, the target image data returned by the target component CEF3 through the browser process in response to the target request may be acquired. That is, the embodiment transmits a request to acquire a picture frame to the target component CEF3 in response to an interface refresh operation or an interface redraw operation acting on the UI interface. After sending the request for acquiring the picture frame to the target component CEF3, the target component CEF3 may acquire the picture frame returned by the browser process in response to the request for acquiring the picture frame. Based further on the acquired picture frames, scene Graph (SG) nodes of the UI interface may be updated.
Note that, scene Graph is a framework for graphics rendering and layout, and all elements in a Scene may be represented as graphic nodes, and the structure of the entire Scene may be described by the relationship between the nodes. The Scene Graph of the Qt Quick supports two rendering modes, namely a multithreading mode and a basic mode. According to different running environments, the Qt Quick can autonomously select a proper rendering mode, and has no sense on upper-layer business. In this embodiment, the Scene Graph rendering is referred to specifically as the multiline Cheng Xuanran mode unless otherwise specified.
Optionally, this embodiment obtains the target image data returned through the browser process in response to an interface refresh operation or an interface redraw operation acting on the UI interface, and the process may include updating the frame buffer, constructing the texture data, and updating the SG node data. Updating the frame buffer corresponds to sending a request for acquiring the picture frame to the target component CEF3 and acquiring the picture frame returned through the browser process; the frame buffer may be updated by performing a locking operation, then constructing the update using a copy made with the image data processing class (QImage), and finally performing a release operation. Since the frame buffer corresponds to the data to be drawn by the current UI, the texture data generated from the frame buffer can update the UI interface; that is, the graphics processor (Graphics Processing Unit, abbreviated as GPU) renders from texture data, so this embodiment needs to construct texture data, and a texture can be created using the create-texture function (QQuickWindow::createTextureFromImage()). Based on the acquired picture frame, the SG node data of the UI interface can be updated by setting the texture node through a texture interface (QSGTexture).
In an embodiment of the present disclosure, an initial interaction event acting on a graphical user interface is obtained; converting the initial interaction event into a target interaction event matched with a target component of the application program, wherein the target component is formed by multiple processes, the target interaction event is detected by a first process in the multiple processes, and image rendering operation is performed on the initial image data of the application program by a second process in the multiple processes to obtain target image data; and updating the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data. That is, the present disclosure converts the obtained initial interaction event acting on the gui into a target interaction event matched with the target component of the application program, so as to obtain target image data, and further uses the target base class to update the initial display content of the application program on the gui into the target display content corresponding to the target image data, thereby realizing the technical effect of effectively performing data processing on the application program, and solving the technical problem that the data processing on the application program cannot be effectively performed.
The technical solutions of the embodiments of the present disclosure are further described below by way of example with reference to preferred embodiments. In particular, a CEF componentization method based on the Qt Quick technology is further described.
Qt provides two interface development technologies, QWidget and Quick. Qt QWidget is a traditional user interface framework drawn based on the central processing unit (Central Processing Unit, abbreviated as CPU), which is convenient for compatibility with low-performance hardware and is widely applied in fields such as military industry, security, and aerospace; however, the Qt QWidget technology is currently essentially in a maintenance state, and Qt officially no longer updates QWidget. Qt Quick is similar to mainstream web page (Web) development frameworks, has development efficiency and interface effects comparable to the Web, and can make full use of GPU resources, thereby saving the valuable thread resources of the UI.
In view of the technological accumulation of QWidgets, many projects will be compatible with both the Qt QWidget and Qt Quick technologies during the transition stage. That is, in a Qt QWidget application, a Quick interface can be conveniently embedded in the QWidget through the QQuickWidget or QQuickView adaptation class. However, for Qt Quick 2.0 applications, Qt does not provide an adaptation class like QQuickWidget.
The advantage of Qt Quick 2.0 lies in its unique rendering mechanism and declarative development language. The Webview provided by CEF3 facilitates version updating and customization of the service interface, where Webview is a generic term for a browser control that can be embedded into an application program to display web page content. Currently, in desktop application development, embedding pages written in the fifth version of the Hypertext Markup Language (HTML5) through CEF3 is common. However, in Qt Quick 2.0 applications, there is still a lack of componentization schemes that use CEF3 in OSR mode.
In the related art, an open-source component (QCefView), which is a CEF component based on the Qt QWidget technology, is used by creating a Qt QWidget application. Further, with the adapter class provided by Qt, a Quick interface can be embedded in the QWidget layout to achieve compatibility between Qt Quick and CEF. However, due to differences in the Qt underlying rendering mechanisms, this scheme is costly in terms of layout flexibility, Quick rendering performance, and the like. The Webview in the latest version (Qt 5.14.1) of the desktop client (POPO) uses the open-source QCefView component, and its UI development adopts the QWidget and Qt Quick technology stacks at the same time, so the refactoring project is relatively conservative in technology selection. The still open-source QCefView component is used by the WebView in the client (A50: Qt 5.15.2); Qt QWidget was used for UI development in the early stage of the project, and the Qt Quick technology was introduced in the development of the instant messaging (Instant Messaging, abbreviated as IM) module. Both of the above products are essentially Qt QWidget applications, using Qt Quick related technology only for local UI requirements.
FIG. 4 is a schematic diagram of a Qt Quick layout in a Qt QWidget application according to one embodiment of the present disclosure. As shown in FIG. 4, the QCefView component can be embedded in a UI layout using the QWidget technology, while a UI layout using the Quick 2.0 technology can only be used locally and cannot host QCefView, where QCefView is a CEF component based on the QWidget technology. That is, there is only one-way compatibility of Qt QWidget with Qt Quick in terms of UI layout. Thus, there is a large limit to the flexibility of Qt Quick in the UI layout of Qt QWidget applications.
As can be seen from FIG. 4, this scheme merely introduces Quick technology into part of the UI development, is not essentially Qt Quick engineering, and has defects in layout flexibility, rendering compatibility, performance, and the like. In terms of layout flexibility, QCefView cannot be embedded into a UI layout realized by the Quick technology. In terms of rendering compatibility, a single top-level window supports only a single graphics API; for example, the two cross-platform image rendering APIs (Vulkan and OpenGL) are incompatible. In terms of performance, the multithreaded rendering mechanism of Qt Quick 2.0 cannot be enabled, and the additional burden of GPU fragment processing cannot be avoided. In addition, Qt QWidget is a mature desktop application development framework, while the workflow of Qt Quick is closer to that of the Web front end, and using the two technology stacks simultaneously places more severe requirements on the technical background of developers. In summary, in the above solution, the flexibility of Qt Quick in the UI layout is greatly limited, so that there is a technical problem that data processing of the application program cannot be performed effectively.
In order to solve the above-mentioned problems, the disclosed embodiment provides a CEF componentization method based on the Qt Quick technology, where CEF is an open-source project based on the Google Chromium browser, and the embodiment provides the capability of embedding pages in a Qt Quick 2.0 application program, that is, realizes a powerful browser function. FIG. 5 is a schematic diagram of a Qt Quick application layout according to one embodiment of the present disclosure. As shown in FIG. 5, the Qt Quick engineering does not require the introduction of QWidget related technology, and in a UI layout using the Quick 2.0 technology, the component (QuickCefView) is a CEF3 component based on the Qt Quick technology. The present disclosure keeps the component's behavior highly consistent with generic Quick controls by virtue of the QuickItem plugin mechanism, thereby maximizing the capabilities provided by Qt Quick, such as the fluid user interface (Fluid UI), GPU rendering, and multithreaded rendering, where QuickItem is the base class of visual interface elements and can provide basic rendering, interaction, and event handling functions.
Embodiments of the present disclosure are directed to a method of componentizing CEF3 in OSR mode, the implementation of which can be divided into interaction, feedback, and multithreaded processing. The interaction may be a change in user behavior or the operating environment. The feedback may be the response to the interaction, that is, a picture frame rendered in real time.
With respect to event handling, the events herein are specifically interaction events. Since event processing is the basis of page response and redrawing, this embodiment can perform a specific format conversion on the Qt Quick event, that is, the Qt Quick 2.0 event, after detecting that the user has triggered an initial interaction event. After the format conversion is performed on the Qt Quick event, a browser process can be utilized to detect the target format of the format-converted Qt Quick event. When the target format of the format-converted Qt Quick event matches the format of the target component CEF3, the format-converted Qt Quick event can be determined as a target interaction event, namely a CEF3 event. That is, CEF3 cannot process the Qt Quick event properly until the format conversion has been performed.
Table 1 is a mapping table of Qt Quick 2.0 events and CEF3 events according to one embodiment of the present disclosure. As shown in Table 1, a user-triggered Qt Quick event is captured, after which the Qt Quick event parameters can be parsed. Because Qt Quick events differ in type, the parsed Qt Quick event parameters also differ; for example, for the mouse movement event (mouseMoveEvent) of Qt Quick 2.0, the parsed event parameters can be position parameters, key states, and the like. After parsing the Qt Quick event parameters, the CEF3 event can be executed; for example, the mouse movement event (SendMouseMoveEvent) of CEF3 corresponding to the mouse movement event of Qt Quick 2.0 is executed, and the parsed Qt Quick event parameters are transferred, thereby realizing the specific format conversion operation on the Qt Quick event.
As can be seen from Table 1, for input events (Input Event), the input method event (inputMethodEvent) of Qt Quick 2.0 corresponds to the input method events (ImeCommitText, ImeSetComposition, or ImeCancelComposition) of CEF3. For focus events (Focus Event), the gain-focus event (focusInEvent) and the lose-focus event (focusOutEvent) of Qt Quick 2.0 correspond to the focus event (SetFocus) of CEF3. For mouse events (Mouse Event), the mouse pointer hover event (hoverEnterEvent), the mouse pointer leave event (hoverLeaveEvent), and the mouse move event (mouseMoveEvent) of Qt Quick 2.0 correspond to the mouse move event (SendMouseMoveEvent) of CEF3; the mouse press event (mousePressEvent) and the mouse release event (mouseReleaseEvent) of Qt Quick 2.0 correspond to the mouse click event (SendMouseClickEvent) of CEF3; and the mouse wheel event (wheelEvent) of Qt Quick 2.0 corresponds to the mouse wheel event (SendMouseWheelEvent) of CEF3. For hidden events (Hide Event) and show events (Show Event), the event (event) of Qt Quick 2.0 corresponds to the hidden event (WasHidden) of CEF3. For window change events (Resize Event), the window change event (geometryChanged) of Qt Quick 2.0 corresponds to the window change event (WasResized) of CEF3.
Table 1 Mapping relationship table of Qt Quick 2.0 events and CEF3 events

Event category    Qt Quick 2.0 event                                    CEF3 event
Input Event       inputMethodEvent                                      ImeCommitText / ImeSetComposition / ImeCancelComposition
Focus Event       focusInEvent / focusOutEvent                          SetFocus
Mouse Event       hoverEnterEvent / hoverLeaveEvent / mouseMoveEvent    SendMouseMoveEvent
Mouse Event       mousePressEvent / mouseReleaseEvent                   SendMouseClickEvent
Mouse Event       wheelEvent                                            SendMouseWheelEvent
Hide/Show Event   event                                                 WasHidden
Resize Event      geometryChanged                                       WasResized
For multithreading, this embodiment applies to the target component CEF3 of the Qt Quick application, where the target component CEF3 is composed of multiple processes; that is, CEF3 is based on a multi-process architecture, which may include at least a browser process, rendering processes, and other processes. The browser process is the process that interacts directly with this embodiment, and in this embodiment the CEF3 threads are threads in the browser process unless otherwise specified. In this embodiment, the Scene Graph supports two rendering modes, namely a multithreaded mode and a basic mode. According to the running environment, Qt Quick can autonomously select the appropriate rendering mode, transparently to upper-layer business. In this embodiment, Scene Graph rendering refers specifically to the multithreaded rendering mode unless otherwise specified.
Table 2 is a table of the relationships between a main thread, a CEF3 thread, and a rendering thread according to one embodiment of the present disclosure. As shown in Table 2, the main thread (Main Thread) receives user interaction events and synchronizes the events to the CEF3 thread, and responds to CEF3 thread messages (Msg) and updates the UI state. The CEF3 thread (CEF3 Thread) processes Msg and posts to the main thread as needed, and performs data processing, such as updating of picture frames. The rendering thread (Render Thread) obtains picture frame data and updates the Scene Graph nodes. In addition, the dependency relationships among the main thread, the CEF3 thread, and the rendering thread are as follows: the main thread and the rendering thread may be the same thread, but there is no business dependency between them, while the CEF3 thread processes Msg from the rendering process in real time and supplies the results to the main thread and the rendering thread.
Table 2 Relationship table of the main thread, the CEF3 thread, and the rendering thread

Thread          Responsibilities
Main Thread     Receives user interaction events and synchronizes them to the CEF3 thread; responds to CEF3 thread messages (Msg) and updates the UI state
CEF3 Thread     Processes Msg, posts to the main thread as needed, and performs data processing such as picture frame updates
Render Thread   Obtains picture frame data and updates the Scene Graph nodes
For picture frame synchronization, the response in this embodiment includes only the picture frame; other status updates are not discussed. Since picture frames do not interact directly with the UI interface, QImage, which is accessible across UI threads, is selected as the data storage format. The QImage memory management mechanism (Implicit Sharing) can simplify the business logic in data synchronization and refresh operations, where Implicit Sharing enables memory sharing, copy-on-write memory reallocation, and the like.
Fig. 6 is a flow chart of a simulated browser feature according to one embodiment of the disclosure, which may include a graphical User Interface (UI) 60 and a target component (CEF 3) 62, as shown in fig. 6. The flow of simulating the browser function may include the steps of:
step S601, capturing an initial interaction event.
In the above step S601, an initial interaction event (Qt Quick event) triggered by a User (General User) clicking on a response area using a mouse may be captured.
Step S602, analyzing the initial interaction event parameters.
In the above step S602, after capturing the Qt Quick event, the Qt Quick event parameters acting on the graphical User Interface (UI) 60 may be parsed, that is, event processing may be performed. The target base class (QuickItem) is a base class of the visual control, is used for creating a UI interface, and can provide capabilities of interface rendering, interaction, event processing and the like.
In step S603, the parameters are transferred to the target component.
In step S603, after parsing the Qt Quick event parameters, the relevant parameters may be transferred to the target component (CEF 3) 62.
In step S604, a target interaction event is executed.
In step S604, after receiving the Qt Quick event parameter, the browser process of the CEF3 executes the target interaction event, that is, executes the CEF3 event, and performs event processing.
Step S605, delivering the target interaction event to the rendering process.
In the above step S605, after the browser process executes the CEF3 event, the executed CEF3 event may be transferred to the rendering process of the CEF 3.
Step S606, redraw the frame.
In the above step S606, after the executed CEF3 event is transferred to the rendering process, the rendering process performs the redrawing operation of the screen.
In step S607, the picture frame is returned.
In the above step S607, after the rendering process redraws the screen, the rendering process returns the picture frame to the browser process. The above realizes a single complete process in which CEF3 renders and updates a picture frame, including event processing, redrawing by the rendering process, returning of the redrawn picture frame, and the like; multi-process rendering by CEF3 is not within the scope of the present disclosure. Updating the picture frame is the business logic of the CEF component, and the single frame data may include picture data and the picture region.
In this embodiment, a locking operation is performed when the single frame data is updated with a full or partial frame. After performing the locking operation, a full update can refresh the frame buffer with QImage::copy(), while a partial update can refresh the designated region of the frame buffer using QPainter::drawImage(). After the update operation on the single frame data, a lock release operation can be performed.
Step S608, respond to the interface update operation.
Step S609, a request for acquiring a picture frame is sent.
In step S609 described above, a request to acquire a picture frame is sent to the CEF3 in response to an interface refresh operation or an interface redraw operation acting on the UI interface.
In step S610, a picture frame returned by the browser process is acquired.
In the above step S610, after the request for acquiring the picture frame is sent to the target component CEF3, the picture frame returned by the browser process in response to the request for acquiring the picture frame by the target component CEF3 may be acquired.
In step S611, the node is updated.
In the above step S611, the SG node of the UI interface may be updated based on the acquired picture frame. This realizes one complete QuickItem process of rendering the picture frame; after the whole process is completed, the UI interface is refreshed. The process may include updating the frame buffer, constructing the texture, and updating the Scene Graph node data. Updating the frame buffer corresponds to sending a request for acquiring the picture frame to CEF3 and acquiring the picture frame returned through the browser process; the frame buffer may be updated by performing a locking operation, then constructing the frame buffer using QImage replication, and finally performing a release operation. Since the frame buffer corresponds to the data to be drawn by the current UI, the texture data generated from the frame buffer can update the UI interface; that is, GPU rendering depends on texture data, so this embodiment needs to construct the texture data. Based on the acquired picture frames, the SG node data may be updated. In this embodiment, the overall flow of FIG. 6 may achieve the technical effect of simulating a browser function; for example, after a user adds a commodity to the shopping cart in the browser, the commodity newly appears in the shopping cart, that is, the browser UI interface is refreshed.
Because the related technical scheme adopts a structure in which QWidget and Quick coexist, nesting Quick inside a Qt QWidget application program, it achieves a certain improvement in development efficiency and visual effect, but the structure has some unavoidable defects. The present embodiment utilizes the plugin mechanism of Qt Quick to package CEF3 into a Webview component. The component's behavior is highly consistent with that of Qt Quick native controls, and by means of the component, Qt Quick developers can integrate the functions of CEF3 into a service module without paying attention to any technical details related to CEF3, so that the technical effect of effectively performing data processing of an application program is achieved, and the technical problem that the data processing of the application program cannot be effectively performed is solved.
CEF3 in this embodiment is based on a multi-process architecture, which may include a browser process, rendering processes, and other processes. At the business layer, the multi-process organization takes one of two forms: a single executable program or independent sub-processes. The single executable program needs to reuse the executable file of the Qt Quick application, so a service dependency outside the Qt Quick module exists. The independent sub-process form can place the creation flow of the rendering process and other processes in an additional executable program. This embodiment employs the independent sub-process mode, which facilitates encapsulating the creation of the rendering process and other processes inside the CEF3 component.
FIG. 7 is a schematic diagram of relationships between processes and modules according to one embodiment of the present disclosure. As shown in FIG. 7, a unique browser process exists in the CEF3 application; in this embodiment, it is the process to which the current Qt Quick application belongs. In a Qt Quick application, there may be multiple Webview windows provided by CEF3. Each Webview window has one CefClient instance and one browser object (CefBrowser); the application has one and only one application class instance (CefApp), while there may be multiple CefClient instances. A CefClient instance can be used for browser window management, in-browser interaction, and event processing, and the CefBrowser can be regarded as a browser object (browser window instance) provided by CEF. Meanwhile, each Webview instance has a corresponding rendering process, and each Webview instance and its corresponding rendering process communicate bidirectionally through inter-process communication (IPC). All Webview instances may also be made to share a rendering process by configuring a parameter (Renderer Process Limit).
The OSR mode of CEF3 is used in this embodiment to draw the frames, in which additional processing of browser events is required. Event handling involves cross-process communication and cross-thread interface calls, and related modules are functionally divided into QuickItem, browser process, and rendering process.
FIG. 8 is a schematic diagram of cross-process communication and interface calls according to one embodiment of the present disclosure, which, as shown in FIG. 8, may include a target base class (QuickItem) 801, a browser process 802, and a rendering process 803. The target base class 801 module is responsible for format conversion of the Qt Quick event and dispatches the converted event, through interface calls, to three sub-modules in the browser process 802 module, namely the CefBrowserHost, CefBrowser, and CefFrame interfaces; the QuickCefView in the target base class 801 is a CEF3 component based on the Qt Quick technology. CefBrowserHost represents the browser process to which a browser object belongs, and this type of interface can only be invoked in the browser process, for operations on a browser window such as focusing, window scaling, window hiding, and keyboard and mouse event processing. CefBrowser can provide browser-related functions, such as going forward, going backward, reloading a page, and canceling a page load; one Webview window has two browser objects, which belong to the browser process and the rendering process respectively, but in the rendering process this type of interface can only be invoked in the main thread. CefFrame represents a visual window and can provide some operations and information of the current page, such as loading the page and acquiring the page address; it belongs to a browser object, and its invocation rules are the same as those of CefBrowser. In summary, after the QuickItem event is converted, it is forwarded to the three corresponding sub-modules in the browser process, and CefFrame communicates across processes through the inter-process communication (IPC) mechanism.
In this embodiment, the target component CEF3 provides three window rendering modes: sub-window, independent window, and non-window (windowless), where the independent window does not meet the requirement of a nested layout. The sub-window needs to adapt to the size of the current component viewport in real time, and its flexibility and fluency are insufficient. The non-window mode draws picture frames in OSR mode, which can provide the original picture frames of the page but requires additional processing of browser events. In view of rendering efficiency and flexibility, this embodiment adopts the OSR rendering mode.
Table 3 is a table of the differences between the update flows of QCefView and QuickCefView according to one embodiment of the present disclosure. As shown in Table 3, the differences between QCefView and QuickCefView in terms of timing control and picture redrawing are shown by comparing the update flows of the two. Regarding the difference in lock mechanism, QCefView performs the lock operation around the drawing function (QPainter::drawImage), while QuickCefView performs the lock operation around the reference update of QImage; the reference locking of QuickCefView can reduce lock contention relative to the lock mechanism of QCefView. Regarding the drawing difference, QCefView runs in the UI thread and directly intervenes in the paint operation, while QuickCefView runs in a non-UI thread and only updates SG nodes, so that the rendering mechanism of Qt Quick can fully utilize GPU resources and occupy fewer CPU time slices.
Table 3 Update flow difference table of QCefView and QuickCefView

Aspect           QCefView                                         QuickCefView
Lock mechanism   Locks around the drawing function                Locks around the QImage reference update, reducing lock contention
Drawing          Runs in the UI thread and directly performs      Runs in a non-UI thread and only updates SG nodes
                 the paint operation
In an embodiment of the disclosure, detecting a target interaction event by using a first process of multiple processes, wherein the target interaction event is converted from an initial interaction event acting on a graphical user interface; responding to the target interaction event by utilizing a second process in the multiple processes, and executing image rendering operation on the initial image data of the application program to obtain target image data; and sending the target image data to the graphical user interface through the first process by utilizing the second process so as to update the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data. That is, the present disclosure detects a target interaction event by using a first process in multiple processes, after detecting the target interaction event, performs an image rendering operation on initial image data of an application program by using a second process in multiple processes to obtain target image data, and after obtaining the target image data, updates initial display content of the application program on a graphical user interface to target display content corresponding to the target image data by using a target base class, thereby realizing a technical effect of effectively performing data processing of the application program, and solving a technical problem that data processing of the application program cannot be effectively performed.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present disclosure may be embodied essentially or in a part contributing to the related art in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method of the embodiments of the present disclosure.
The embodiment also provides a data processing device for an application program, which is used for implementing the foregoing embodiments and preferred embodiments, and is not described in detail. As used below, the term "unit" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
FIG. 9 is a block diagram of a data processing apparatus of an application program according to one embodiment of the present disclosure, applied to a target component of the application program, the target component being composed of multiple processes, as shown in FIG. 9, the data processing apparatus 900 of the application program may include: a detection unit 902, a rendering unit 904, and a transmission unit 906.
A detection unit 902, configured to detect a target interaction event by using a first process of the multiple processes, where the target interaction event is converted from an initial interaction event acting on the graphical user interface.
And a rendering unit 904 for performing an image rendering operation on the initial image data of the application program to obtain target image data in response to the target interaction event by using a second process of the multiple processes.
And a sending unit 906, configured to send the target image data to the graphical user interface through the first process by using the second process, so as to update the initial display content of the application program on the graphical user interface to the target display content corresponding to the target image data by using the target base class, where the target base class is used to create the graphical user interface, and the initial display content is generated by using the initial image data.
Optionally, the detection unit 902 includes: the detection module is used for detecting the target format of the converted initial interaction event by utilizing the first process; and the first determining module is used for determining the converted initial interaction event as a target interaction event in response to the target format being the format matched with the target component.
Optionally, the rendering unit 904 includes: the second determining module is used for determining a data range to be updated in the initial image data by utilizing a second process to respond to the target interaction event; and the execution module is used for executing image rendering operation on the image data in the data range in the initial image data by utilizing the second process to obtain target image data.
Optionally, the rendering unit 904 includes: and the rendering module is used for responding to the target interaction event by utilizing the second process, and executing image rendering operation in the off-screen rendering mode on the initial image data to obtain target image data.
Optionally, the rendering module includes: the drawing submodule is used for responding to the target interaction event by utilizing the second process and drawing the target viewport in the off-screen rendering mode; and the execution sub-module is used for executing image rendering operation on the initial image data in the target viewport by utilizing the second process to obtain target image data.
Optionally, the apparatus further comprises: the calling unit is used for calling the visual target instance in the rendering frame by utilizing the first process; the first determining unit is used for determining a target object corresponding to the visual target instance in the first process; and the transmission unit is used for calling a target interface corresponding to the target object in the rendering frame and transmitting the target interaction event to the second process.
Optionally, the apparatus further comprises: and the second determining unit is used for determining a process corresponding to the visualization target instance in the multiple processes as a second process.
In this embodiment, the target interaction event is detected by the detection unit 902 using a first process of the multiple processes, wherein the target interaction event is converted from an initial interaction event acting on the graphical user interface. The image rendering operation is performed on the initial image data of the application program by the rendering unit 904 in response to the target interaction event using the second process of the multi-process, resulting in target image data. The sending unit 906 sends the target image data to the graphical user interface through the first process by using the second process, so as to update the initial display content of the application program on the graphical user interface to the target display content corresponding to the target image data by using the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data. That is, the present disclosure detects a target interaction event by using a first process in multiple processes, after detecting the target interaction event, performs an image rendering operation on initial image data of an application program by using a second process in multiple processes to obtain target image data, and after obtaining the target image data, updates initial display content of the application program on a graphical user interface to target display content corresponding to the target image data by using a target base class, thereby realizing a technical effect of effectively performing data processing of the application program, and solving a technical problem that data processing of the application program cannot be effectively performed.
Fig. 10 is a block diagram of a data processing apparatus of another application program according to an embodiment of the present disclosure, a graphical user interface is provided through a terminal device, the graphical user interface being created based on a target base class, and as shown in fig. 10, the data processing apparatus 1000 of the application program may include: an acquisition unit 1002, a conversion unit 1004, and an update unit 1006.
An acquisition unit 1002, configured to acquire an initial interaction event acting on the graphical user interface.
The converting unit 1004 is configured to convert the initial interaction event into a target interaction event that matches a target component of the application program, where the target component is configured by multiple processes, the target interaction event is detected by a first process of the multiple processes, and an image rendering operation is performed on the initial image data of the application program by a second process of the multiple processes to obtain target image data.
And an updating unit 1006, configured to update, by using a target base class, initial display content of the application program on the gui to target display content corresponding to the target image data, where the target base class is used to create the gui, and the initial display content is generated by the initial image data.
Optionally, the conversion unit 1004 includes: and the conversion module is used for converting the initial format of the initial interaction event into a format matched with the target component so as to obtain the target interaction event.
Optionally, the apparatus further comprises: a transmitting unit configured to transmit a target request in response to an interface update operation acting on a graphical user interface, wherein the target request is used to request acquisition of target image data from a target component; and the acquisition unit is used for acquiring target image data returned by the target component through the first process in response to the target request.
In this embodiment, an initial interaction event acting on the graphical user interface is acquired by the acquisition unit 1002. The initial interaction event is converted into a target interaction event matched with a target component of the application program by the conversion unit 1004, wherein the target component is composed of multiple processes, the target interaction event is detected by a first process of the multiple processes, and an image rendering operation is performed on the initial image data of the application program by a second process of the multiple processes to obtain target image data. The updating unit 1006 updates the initial display content of the application program on the gui to the target display content corresponding to the target image data by using the target base class, where the target base class is used to create the gui, and the initial display content is generated by the initial image data. That is, the present disclosure converts the obtained initial interaction event acting on the gui into a target interaction event matched with the target component of the application program, so as to obtain target image data, and further uses the target base class to update the initial display content of the application program on the gui into the target display content corresponding to the target image data, thereby realizing the technical effect of effectively performing data processing on the application program, and solving the technical problem that the data processing on the application program cannot be effectively performed.
It should be noted that each of the above units may be implemented by software or hardware. For the latter, the implementation may be, but is not limited to, the following: the units are all located in the same processor; alternatively, the units are distributed across different processors in any combination.
Embodiments of the present disclosure also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Optionally, in this embodiment, the computer-readable storage medium may be configured to store a computer program for performing the following steps:
S1, detecting a target interaction event by using a first process of the multiple processes, wherein the target interaction event is obtained by converting an initial interaction event acting on a graphical user interface;
S2, responding to the target interaction event by using a second process of the multiple processes, and performing an image rendering operation on initial image data of the application program to obtain target image data;
S3, sending the target image data to the graphical user interface through the first process by using the second process, so as to update initial display content of the application program on the graphical user interface to target display content corresponding to the target image data by using a target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated from the initial image data.
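Steps S1 to S3 can be sketched with two cooperating operating-system processes connected by a pipe. The function names, the event payload, and the mocked rendering below are assumptions for illustration, not the disclosed implementation; a real render process would produce actual frame data rather than a tagged byte string.

```python
from multiprocessing import Pipe, Process

def render_worker(conn) -> None:
    # Second process (S2): wait for the target interaction event, perform
    # the image rendering operation on the initial image data (mocked here),
    # and hand the resulting target image data back over the pipe.
    event = conn.recv()
    initial_image = b"initial-image"
    target_image = initial_image + b"|rerendered-for:" + event.encode()
    conn.send(target_image)
    conn.close()

def run_first_process() -> bytes:
    # First process (S1/S3): detect the target interaction event, forward
    # it to the render process, and relay the returned target image data
    # to the GUI layer for display.
    parent_conn, child_conn = Pipe()
    renderer = Process(target=render_worker, args=(child_conn,))
    renderer.start()
    parent_conn.send("click@10,20")      # S1: the detected target event
    target_image = parent_conn.recv()    # S3: frame from the second process
    renderer.join()
    return target_image                  # the GUI would display this frame

if __name__ == "__main__":
    frame = run_first_process()
```

The design choice the sketch mirrors is that the first process never renders: it only detects and relays, so a crash or stall in the render process cannot take the event-handling process down with it.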
Optionally, in this embodiment, the computer-readable storage medium may include, but is not limited to: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or any other medium capable of storing a computer program.
Embodiments of the present disclosure also provide an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic device may further include a transmission device and an input/output device, where both the transmission device and the input/output device are connected to the processor.
Optionally, in this embodiment, the processor may be configured to perform the following steps by means of a computer program:
S1, detecting a target interaction event by using a first process of the multiple processes, wherein the target interaction event is obtained by converting an initial interaction event acting on a graphical user interface;
S2, responding to the target interaction event by using a second process of the multiple processes, and performing an image rendering operation on initial image data of the application program to obtain target image data;
S3, sending the target image data to the graphical user interface through the first process by using the second process, so as to update initial display content of the application program on the graphical user interface to target display content corresponding to the target image data by using a target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated from the initial image data.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments and optional implementations; details are not repeated here.
Fig. 11 is a schematic diagram of an electronic device according to an embodiment of the disclosure. As shown in fig. 11, the electronic device 1100 is merely an example, and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 11, the electronic device 1100 takes the form of a general-purpose computing device. Components of the electronic device 1100 may include, but are not limited to: at least one processor 1110, at least one memory 1120, a bus 1130 connecting the various system components (including the memory 1120 and the processor 1110), and a display 1140.
The memory 1120 stores program code executable by the processor 1110, so that the processor 1110 performs the steps according to the various exemplary implementations of the present disclosure described in the method embodiments above.
The memory 1120 may include a readable medium in the form of a volatile memory unit, such as a Random Access Memory (RAM) 11201 and/or a cache memory 11202, and may further include a Read Only Memory (ROM) 11203, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory.
In some examples, memory 1120 may also include program/utility 11204 having a set (at least one) of program modules 11205, such program modules 11205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. The memory 1120 may further include memory remotely located relative to the processor 1110, which may be connected to the electronic device 1100 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The bus 1130 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, or a local bus using any of a variety of bus architectures.
Display 1140 may be, for example, a touch screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of electronic device 1100.
Optionally, the electronic device 1100 may also communicate with one or more external devices 1400 (e.g., a keyboard, a pointing device, or a Bluetooth device), with one or more devices that enable a user to interact with the electronic device 1100, and/or with any device (e.g., a router or a modem) that enables the electronic device 1100 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1150. Moreover, the electronic device 1100 can communicate with one or more networks, such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet, through a network adapter 1160. As shown in fig. 11, the network adapter 1160 communicates with the other modules of the electronic device 1100 via the bus 1130. It should be appreciated that although not shown in fig. 11, other hardware and/or software modules may be used in conjunction with the electronic device 1100, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Array of Independent Disks (RAID) systems, tape drives, data backup storage systems, and the like.
The electronic device 1100 may further include: a keyboard, a cursor control device (e.g., a mouse), an input/output interface (I/O interface), a network interface, a power supply, and/or a camera.
It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 11 is merely illustrative and does not limit the configuration of the electronic device described above. For example, the electronic device 1100 may include more or fewer components than shown in fig. 11, or have a configuration different from that shown in fig. 11. The memory 1120 may be used to store a computer program and corresponding data, such as a computer program and corresponding data for the data processing method of an application program in an embodiment of the present disclosure. The processor 1110 runs the computer program stored in the memory 1120 to execute various functional applications and data processing, that is, to implement the above-described method.
The foregoing embodiment numbers of the present disclosure are merely for description and do not represent advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present disclosure, the description of each embodiment has its own emphasis; for any part not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely illustrative: the division into units is merely a division by logical function, and other divisions are possible in an actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between units or modules may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present disclosure, in essence, or the part contributing to the related art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present disclosure. The aforementioned storage medium includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or any other medium capable of storing program code.
The foregoing is merely a preferred embodiment of the present disclosure. It should be noted that those of ordinary skill in the art may make several improvements and modifications without departing from the principles of the present disclosure, and such improvements and modifications shall also fall within the protection scope of the present disclosure.
Claims (15)
1. A data processing method of an application program, characterized by being applied to a target component of the application program, the target component being constituted by a plurality of processes, the method comprising:
detecting a target interaction event by using a first process in the multiple processes, wherein the target interaction event is obtained by converting an initial interaction event acted on a graphical user interface;
responding to the target interaction event by utilizing a second process in the multiple processes, and executing image rendering operation on the initial image data of the application program to obtain target image data;
and sending the target image data to the graphical user interface through the first process by utilizing the second process so as to update initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing a target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data.
2. The method of claim 1, wherein detecting a target interaction event with a first process of the multiple processes comprises:
detecting a target format of the converted initial interaction event by using the first process;
and responding to the target format being the format matched with the target component, and determining the converted initial interaction event as the target interaction event.
3. The method of claim 1, wherein performing an image rendering operation on the initial image data of the application program with a second process of the plurality of processes in response to the target interaction event to obtain target image data, comprises:
determining a data range to be updated in the initial image data by using the second process to respond to the target interaction event;
and executing image rendering operation on the image data in the data range in the initial image data by using the second process to obtain the target image data.
4. The method of claim 1, wherein performing an image rendering operation on the initial image data of the application program with a second process of the plurality of processes in response to the target interaction event to obtain target image data, comprises:
responding to the target interaction event by using the second process, and performing an image rendering operation on the initial image data in an off-screen rendering mode to obtain the target image data.
5. The method of claim 4, wherein performing an image rendering operation in an off-screen rendering mode on the initial image data with the second process in response to the target interaction event, results in the target image data, comprising:
responding to the target interaction event by using the second process, and drawing a target viewport in the off-screen rendering mode;
and executing image rendering operation on the initial image data in the target viewport by using the second process to obtain the target image data.
6. The method according to claim 1, wherein the method further comprises:
invoking a visual target instance in a rendering framework by using the first process;
determining a target object corresponding to the visual target instance in the first process;
and calling a target interface corresponding to the target object in the rendering frame, and transmitting the target interaction event to the second process.
7. The method of claim 6, wherein the method further comprises:
and determining a process corresponding to the visualization target instance in the multiple processes as the second process.
8. The method according to any one of claims 1 to 7, further comprising:
and determining the process of the application program in the multiple processes as the first process.
9. A data processing method of an application program, wherein a graphical user interface is provided through a terminal device, the graphical user interface being created based on a target base class, the method comprising:
acquiring an initial interaction event acting on the graphical user interface;
converting the initial interaction event into a target interaction event matched with a target component of an application program, wherein the target component is formed by multiple processes, the target interaction event is detected by a first process in the multiple processes, and an image rendering operation is performed on initial image data of the application program by a second process in the multiple processes in response to the target interaction event, so as to obtain target image data;
and updating initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data.
10. The method of claim 9, wherein converting the initial interaction event to a target interaction event that matches a target component of the application comprises:
and converting the initial format of the initial interaction event into a format matched with the target component to obtain the target interaction event.
11. The method according to claim 9, wherein the method further comprises:
transmitting a target request in response to an interface update operation acting on the graphical user interface, wherein the target request is used for requesting the target component to acquire the target image data;
and acquiring the target image data returned by the target component through the first process in response to the target request.
12. A data processing apparatus for an application program, characterized by being applied to a target component of the application program, the target component being constituted by a plurality of processes, the apparatus comprising:
the detection unit is used for detecting a target interaction event by utilizing a first process in the multiple processes, wherein the target interaction event is obtained by converting an initial interaction event acted on a graphical user interface;
the rendering unit is used for responding to the target interaction event by utilizing a second process in the multiple processes, and performing an image rendering operation on the initial image data of the application program to obtain target image data;
and the sending unit is used for sending the target image data to the graphical user interface through the first process by utilizing the second process so as to update the initial display content of the application program on the graphical user interface into target display content corresponding to the target image data by utilizing a target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data.
13. A data processing apparatus for an application program, characterized in that a graphical user interface is provided by a terminal device, the graphical user interface being created based on a target base class, the apparatus comprising:
an acquisition unit for acquiring an initial interaction event acting on the graphical user interface;
the conversion unit is used for converting the initial interaction event into a target interaction event matched with a target component of the application program, wherein the target component is formed by multiple processes, the target interaction event is detected by a first process in the multiple processes, and an image rendering operation is performed on initial image data of the application program by a second process in the multiple processes in response to the target interaction event, so as to obtain target image data;
and the updating unit is used for updating the initial display content of the application program on the graphical user interface into the target display content corresponding to the target image data by utilizing the target base class, wherein the target base class is used for creating the graphical user interface, and the initial display content is generated by the initial image data.
14. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program, wherein the computer program is arranged to perform the method of any of the claims 1 to 11 when being run by a processor.
15. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the method of any of the claims 1 to 11.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311699969.8A CN117707670A (en) | 2023-12-11 | 2023-12-11 | Data processing method and device for application program, storage medium and electronic device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN117707670A (en) | 2024-03-15 |
Family
ID=90154624
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202311699969.8A Pending CN117707670A (en) | 2023-12-11 | 2023-12-11 | Data processing method and device for application program, storage medium and electronic device |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN117707670A (en) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||