WO2021147482A1 - A telephoto shooting method and electronic device - Google Patents
- Publication number
- WO2021147482A1 (PCT/CN2020/128986)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- preview
- image
- auxiliary
- area
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- H—ELECTRICITY › H04—ELECTRIC COMMUNICATION TECHNIQUE › H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION › H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof › H04N23/60—Control of cameras or camera modules
  - H04N23/67—Focus control based on electronic image sensor signals
  - H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
  - H04N23/63—Control of cameras or camera modules by using electronic viewfinders
  - H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
  - H04N23/632—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
  - H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
Definitions
- This application relates to the field of electronic technology, and in particular to a telephoto shooting method and an electronic device.
- The shooting function has become an important feature of terminal devices and a main indicator for evaluating their performance.
- When users shoot with mobile phones and other portable terminal devices, they often need ultra-telephoto shooting.
- In ultra-telephoto shooting, the picture from the telephoto fixed-focus lens is cropped around its center point, and the image quality is optimized to obtain a higher-magnification telephoto viewfinder image.
- The resulting viewfinder range is equivalent to that of a 1340 mm focal-length SLR lens, and the field of view (FOV) is only about 0.5 degrees.
- The area covered by the viewfinder picture is therefore very small, so any shake during ultra-telephoto shooting is magnified and shifts the viewfinder picture; when the picture shifts over a large range, it is difficult to find the shooting target.
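The relationship between equivalent focal length and field of view follows the standard pinhole formula FOV = 2·arctan(d / 2f), where d is a sensor dimension of the reference format. A minimal sketch (the 36 mm × 24 mm full-frame dimensions are illustrative assumptions; note the ~0.5° figure quoted above is narrower than the raw full-frame result, consistent with the additional center cropping the patent describes):

```python
import math

def equivalent_fov_deg(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """Angle of view (degrees) along one sensor dimension at a given
    equivalent focal length, using FOV = 2 * arctan(d / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

# Full-frame (36 mm x 24 mm) angles of view at a 1340 mm equivalent focal length:
print(round(equivalent_fov_deg(36.0, 1340.0), 2))  # horizontal, ~1.54 degrees
print(round(equivalent_fov_deg(24.0, 1340.0), 2))  # vertical, ~1.03 degrees
```

Doubling the focal length roughly halves the angle of view, which is why small hand shake shifts the picture so dramatically at these magnifications.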
- To solve this, the present application provides a telephoto shooting method and an electronic device.
- By displaying two viewfinder images simultaneously, the method makes it easier for the user to find the shooting target during telephoto shooting, thereby improving the shooting experience.
- A telephoto shooting method is provided, applied to an electronic device that includes a lens.
- The method includes: displaying a shooting interface of a camera of the electronic device, the shooting interface including an image preview area that displays a first preview image; detecting a first operation; and, in response to the first operation, simultaneously displaying the image preview area and an auxiliary preview window on the shooting interface, the auxiliary preview window displaying a second preview image. The first preview image and the second preview image are both obtained through the lens; the first preview image is the viewfinder image at a first magnification, the second preview image is the viewfinder image at a second magnification, the first preview image is obtained by cropping the second preview image, and the first magnification is greater than or equal to the second magnification.
- In this way, the method displays two viewfinder pictures at the same time: the auxiliary viewfinder window shows the original viewfinder picture at "5×", while the image preview area shows the viewfinder picture after the user adjusts the magnification to "50×", providing the user with preview pictures at two different angles of view.
- Based on the original "5×" picture in the auxiliary viewfinder window, the user can find the shooting target more easily without changing the shooting angle of the mobile phone or the lens, which improves the telephoto shooting experience.
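The claimed cropping relationship (the first preview is cut from the second preview, with the first magnification at least the second) can be sketched as follows. This is a minimal illustration assuming a centered crop; the function name and frame dimensions are hypothetical, not taken from the patent:

```python
def centered_crop_rect(width, height, second_mag, first_mag):
    """Rectangle of the second preview (at second_mag) that, when enlarged,
    yields the first preview (at first_mag >= second_mag)."""
    if first_mag < second_mag:
        raise ValueError("first magnification must be >= second magnification")
    scale = second_mag / first_mag          # fraction of the frame that is kept
    crop_w, crop_h = width * scale, height * scale
    left = (width - crop_w) / 2             # center the crop in the frame
    top = (height - crop_h) / 2
    return left, top, crop_w, crop_h

# A "50x" preview cropped from a "5x" viewfinder keeps the central 1/10
# of the frame in each dimension:
print(centered_crop_rect(4000, 3000, 5, 50))  # (1800.0, 1350.0, 400.0, 300.0)
```

When the two magnifications are equal, the crop degenerates to the whole frame, matching the boundary case "greater than or equal to" in the claim.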
- The image preview area and the auxiliary preview window are displayed in any one of the following ways: at least a part of the auxiliary preview window overlaps the image preview area; or the auxiliary preview window is displayed outside the image preview area; or the auxiliary preview window is located in the lower-left corner of the image preview area.
- The auxiliary preview window can thus be displayed in different ways on the camera application interface, adapting to different shooting scenes and improving the user's shooting experience.
- When the auxiliary preview window and the image preview area do not overlap, the user sees a larger preview picture, which further improves the shooting experience.
- In one implementation, the first operation is the user's adjustment of the magnification in the magnification adjustment area.
- Displaying the auxiliary preview window on the shooting interface in response to the first operation includes: when it is detected that the first magnification is greater than or equal to a first threshold, the electronic device automatically displays the auxiliary preview window on the shooting interface.
- For example, the user can adjust the magnification by sliding in the magnification adjustment area 30 on the mobile phone.
- When the magnification reaches the first threshold (for example, "5×"),
- the mobile phone automatically switches to the telephoto shooting mode and obtains the original viewfinder picture through its telephoto lens.
- The telephoto shooting mode can also be entered in other ways, such as through user settings, which is not limited in the embodiments of the present application.
- In another implementation, the first operation is the user's operation of opening the auxiliary preview window in the camera application.
- For this purpose, the camera application may include a switch that activates the telephoto shooting function: an auxiliary viewfinder switch.
- In one case, the auxiliary viewfinder switch is placed in the top menu area, and the user taps it directly to activate the telephoto shooting function and display the auxiliary viewfinder frame.
- In another case, the auxiliary viewfinder switch is included in the settings menu, and the user activates the telephoto shooting function through the settings menu to display the auxiliary viewfinder frame.
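Both trigger paths described above (reaching the magnification threshold while sliding, or opening the auxiliary viewfinder switch) reduce to a simple display rule, which might be sketched as follows (the threshold value and all names are hypothetical, for illustration only):

```python
AUX_PREVIEW_THRESHOLD = 5.0  # hypothetical "5x" first threshold

def should_show_aux_preview(magnification: float, switch_on: bool) -> bool:
    """Show the auxiliary viewfinder when the user has turned its switch on,
    or automatically once the adjusted magnification reaches the threshold."""
    return switch_on or magnification >= AUX_PREVIEW_THRESHOLD

print(should_show_aux_preview(4.9, False))   # False: below threshold, switch off
print(should_show_aux_preview(4.9, True))    # True: switch opened in the menu
print(should_show_aux_preview(50.0, False))  # True: threshold reached by sliding
```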
- The auxiliary preview window may further include a close button.
- The method further includes: the electronic device detects the user's operation on the close button and closes the auxiliary preview window.
- For example, when focusing is finished and the shooting target is displayed in the image preview area, the user can close the auxiliary viewfinder frame to obtain a larger image preview area and view the preview picture more conveniently.
- The auxiliary preview window may further include a target area,
- and the image of the first preview picture is obtained by processing the image of the target area.
- The target area is either a fixed area in the auxiliary preview window or an arbitrary area in the auxiliary preview window.
- The method may further include: detecting a drag operation on the auxiliary preview window, and in response to the drag operation, moving the auxiliary preview window from a first position to a second position.
- In this way, the user can move the auxiliary viewfinder frame to minimize how much it occludes the image displayed in the image preview area.
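Repositioning the window in response to a drag while keeping it inside the preview area can be sketched as a simple clamp (coordinates and dimensions below are hypothetical):

```python
def clamp_window_position(x, y, win_w, win_h, area_w, area_h):
    """Keep the dragged auxiliary window fully inside the image preview area
    by clamping its top-left corner to the valid range."""
    x = max(0, min(x, area_w - win_w))
    y = max(0, min(y, area_h - win_h))
    return x, y

# Dragging a 300x200 window past the edges of a 1080x2340 preview area
# snaps it back inside:
print(clamp_window_position(1000, -20, 300, 200, 1080, 2340))  # (780, 0)
```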
- An electronic device is provided, including: a lens for acquiring the picture to be photographed; one or more processors; a memory; a plurality of application programs; and one or more programs, where the one or more programs are stored in the memory.
- When the one or more programs are executed by the processor, the electronic device is caused to perform the following steps: displaying a shooting interface of the camera, the shooting interface including an image preview area that displays a first preview image; detecting a first operation; and, in response to the first operation, simultaneously displaying the image preview area and an auxiliary preview window on the shooting interface, the auxiliary preview window displaying a second preview image. The first preview image and the second preview image are both obtained through the lens; the first preview image is the viewfinder image at a first magnification, the second preview image is the viewfinder image at a second magnification, the first preview image is obtained by cropping the second preview image, and the first magnification is greater than or equal to the second magnification.
- The image preview area and the auxiliary preview window are displayed in any one of the following ways: at least a part of the auxiliary preview window overlaps the image preview area; or the auxiliary preview window is displayed outside the image preview area; or the auxiliary preview window is located in the lower-left corner of the image preview area.
- The first operation is the user's adjustment of the magnification in the magnification adjustment area.
- When the one or more programs are executed by the processor and it is detected that the first magnification is greater than or equal to the first threshold, the electronic device is caused to perform the following step: automatically displaying the auxiliary preview window on the shooting interface.
- the first operation is an operation for the user to open the auxiliary preview window in the camera application.
- The auxiliary preview window further includes a close button, and when the one or more programs are executed by the processor, the electronic device is caused to perform the following step: when the user's operation on the close button is detected, closing the auxiliary preview window.
- The auxiliary preview window may further include a target area,
- and the image of the first preview picture is obtained by processing the image of the target area.
- The target area is either a fixed area in the auxiliary preview window or an arbitrary area in the auxiliary preview window.
- When the one or more programs are executed by the processor, the electronic device is caused to perform the following steps: detecting a drag operation on the auxiliary preview window, and in response to the drag operation, moving the auxiliary preview window from the first position to the second position.
- The present application also provides an apparatus included in an electronic device, where the apparatus has the function of implementing the behavior of the electronic device in the foregoing aspects and their possible implementations.
- The function can be realized by hardware, or by hardware executing corresponding software.
- The hardware or software includes one or more modules or units corresponding to the above functions, for example, a display module or unit, a detection module or unit, a processing module or unit, and so on.
- The present application further provides an electronic device, including: a touch display screen, where the touch display screen includes a touch-sensitive surface and a display; a camera; one or more processors; a memory; a plurality of application programs; and one or more computer programs.
- The one or more computer programs are stored in the memory, and the one or more computer programs include instructions.
- When the instructions are executed by the electronic device, the electronic device is caused to execute the telephoto shooting method in any possible implementation of any one of the foregoing aspects.
- This application further provides an electronic device including one or more processors and one or more memories.
- The one or more memories are coupled with the one or more processors and are used to store computer program code.
- The computer program code includes computer instructions.
- When the one or more processors execute the computer instructions, the electronic device executes the telephoto shooting method in any one of the possible implementations of the above aspects.
- The present application provides a computer-readable storage medium, including computer instructions, which, when run on an electronic device, cause the electronic device to execute the telephoto shooting method in any one of the possible implementations of any of the foregoing aspects.
- The present application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the telephoto shooting method in any one of the possible implementations of any of the above aspects.
- FIG. 1 is a schematic structural diagram of an example of an electronic device provided by an embodiment of the present application.
- FIG. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present application.
- FIG. 3 is a schematic diagram of an example of the control structure of the photographing process of an electronic device according to an embodiment of the present application.
- FIG. 4 is a schematic diagram of a graphical user interface of an example of a telephoto shooting process provided by an embodiment of the present application.
- FIG. 5 is a schematic diagram of a graphical user interface of another example of a telephoto shooting process provided by an embodiment of the present application.
- FIG. 6 is an implementation flowchart of an example of a telephoto shooting process provided by an embodiment of the present application.
- FIG. 7 is a schematic diagram of coordinates of an example of a mobile phone interface provided by an embodiment of the present application.
- The terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of technical features. Features defined with "first" and "second" may explicitly or implicitly include one or more of those features, such as the "first preview stream" and "second preview stream" described in the embodiments of this application.
- The shooting method provided in the embodiments of this application can be applied to mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), and other electronic devices that can implement the shooting function; the embodiments of the present application do not impose any restrictions on the specific types of electronic devices.
- The electronic device is equipped with a lens, for example, a telephoto lens, which may be a telephoto fixed-focus lens or a telephoto zoom lens that may be supported in the future. This application does not limit the form of the lens.
- FIG. 1 is a schematic structural diagram of an example of an electronic device 100 provided in an embodiment of the present application.
- The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
- The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
- the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
- the electronic device 100 may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
- the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
- the processor 110 may include one or more processing units.
- The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
- the different processing units may be independent devices or integrated in one or more processors.
- the controller may be the nerve center and command center of the electronic device 100.
- the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
- a memory may also be provided in the processor 110 to store instructions and data.
- the processor 110 may include one or more interfaces.
- The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
- the I2C interface is a bidirectional synchronous serial bus.
- the processor 110 can couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100.
- the MIPI interface can be used to connect the processor 110 with the display screen 194, the camera 193 and other peripheral devices.
- the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
- the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
- the processor 110 and the display screen 194 communicate through a DSI interface to realize the display function of the electronic device 100.
- the interface connection relationship between the modules illustrated in the embodiment of the present application is merely a schematic description, and does not constitute a structural limitation of the electronic device 100.
- the electronic device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
- the charging management module 140 is used to receive charging input from the charger.
- the charger can be a wireless charger or a wired charger.
- the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
- the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
- the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
- the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
- the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
- the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
- the display screen 194 is used to display images, videos, etc.
- the display screen 194 includes a display panel.
- The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
- the electronic device 100 may include one or N display screens 194, and N is a positive integer greater than one.
- the electronic device 100 can realize a shooting function through an ISP, a camera 193, a touch sensor, a video codec, a GPU, a display screen 194, and an application processor. For example, the telephoto shooting process described in the embodiment of this application.
- the ISP is used to process the data fed back by the camera 193.
- When the shutter is opened, light passes through the lens to the photosensitive element of the camera; the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP, where it is processed and converted into an image visible to the naked eye.
- ISP can also optimize the image noise, brightness, and skin color.
- ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
- the ISP may be provided in the camera 193.
- the camera 193 is used to capture still images or videos.
- the object generates an optical image through the lens and is projected to the photosensitive element.
- The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
- the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
- ISP outputs digital image signals to DSP for processing.
- the DSP converts digital image signals into standard image signals in RGB, YUV and other formats. It should be understood that in the description of the embodiments of this application, an image in RGB format is used as an example for introduction, and the embodiment of this application does not limit the image format.
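The YUV-to-RGB step mentioned here can be illustrated with the common full-range BT.601 conversion; this is one of several possible matrices, and the exact coefficients used by any particular DSP are an assumption:

```python
def yuv_to_rgb(y: float, u: float, v: float) -> tuple:
    """Convert one full-range BT.601 YUV sample (each component 0-255) to RGB."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))  # keep results in 0-255
    return clamp(r), clamp(g), clamp(b)

# A neutral sample (U = V = 128, i.e. zero chroma) maps to a pure grey:
print(yuv_to_rgb(128, 128, 128))  # (128, 128, 128)
```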
- the electronic device 100 may include one or N cameras 193, and N is a positive integer greater than one.
- Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
- Video codecs are used to compress or decompress digital video.
- The electronic device 100 may support one or more video codecs, so that it can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
- the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
- the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
- the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121.
- the internal memory 121 may include a storage program area and a storage data area.
- the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
- the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
- the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
- the air pressure sensor 180C is used to measure air pressure.
- the magnetic sensor 180D includes a Hall sensor.
- the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
- the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
- Distance sensor 180F used to measure distance.
- the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
- the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
- the ambient light sensor 180L is used to perceive the brightness of the ambient light.
- the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
- the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
- the fingerprint sensor 180H is used to collect fingerprints.
- the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
- The temperature sensor 180J is used to detect temperature.
- The touch sensor 180K, also called a "touch panel", may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form what is also called a "touch screen".
- the touch sensor 180K is used to detect touch operations acting on or near it.
- the bone conduction sensor 180M can acquire vibration signals.
- the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
- the button 190 includes a power-on button, a volume button, and so on.
- the button 190 may be a mechanical button. It can also be a touch button.
- the electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the electronic device 100.
- the motor 191 can generate vibration prompts.
- The motor 191 can be used for incoming-call vibration notification as well as touch vibration feedback. For example, touch operations applied to different applications (such as photographing or audio playback) can correspond to different vibration feedback effects, and touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects.
- the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
- the SIM card interface 195 is used to connect to the SIM card.
- the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
- the embodiment of the present application takes an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 by way of example.
- FIG. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present application.
- The layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate with each other through software interfaces.
- In the embodiment of the present application, the Android system is divided into four layers. From top to bottom, they are the application layer, the application framework layer, the Android runtime and system libraries, and the hardware abstraction layer (HAL).
- the application layer can include a series of application packages. As shown in Figure 2, the application package may include applications such as camera, photo album, music, settings, etc.
- the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
- the application framework layer includes some predefined functions. As shown in Figure 2, the application framework layer can include a window manager, a content provider, a view system, a resource manager, a notification manager, and so on.
- the window manager is used to manage window programs.
- the window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take a screenshot, etc.
- the content provider is used to store and retrieve data and make these data accessible to applications.
- the data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
- the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
- the view system can be used to build applications.
- the display interface can be composed of one or more views.
- a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
- the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
- the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages.
- the notification information displayed in the status bar can disappear automatically after a short stay, such as a message reminder to notify the user that the download is complete.
- the notification manager can also be a notification that appears in the status bar at the top of the system in the form of a chart or a scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text messages are prompted in the status bar, or the notification manager can also emit prompt sounds, such as electronic device vibrations, flashing lights, and so on.
- Android runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
- the core library consists of two parts: one part is the function functions that the java language needs to call, and the other part is the core library of Android.
- the application layer and application framework layer run in a virtual machine.
- the virtual machine executes the java files of the application layer and the application framework layer as binary files.
- the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
- the system library can include multiple functional modules. For example: surface manager (surface manager), media library (media libraries), 3D graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
- surface manager surface manager
- media library media libraries
- 3D graphics processing library for example: OpenGL ES
- 2D graphics engine for example: SGL
- the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
- the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
- the media library can support multiple audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
- the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, synthesis, and layer processing.
- the 2D graphics engine is a drawing engine for 2D drawing.
- the HAL can be a part of the kernel layer, or the HAL is an independent layer, located between the kernel layer and the system library, and is a layer between hardware and software.
- HAL can include hardware drive modules, such as display drive, camera drive, sensor drive, etc.
- the application framework layer can call the hardware drive module of HAL.
- the user opens the camera application, the camera application at the application layer in Figure 2 starts, and sends instructions to the HAL to mobilize the HAL's camera driver, sensor driver, and display driver to make the electronics
- the device can start the camera or lens to collect images.
- light is transmitted to the image sensor through the camera, and the image sensor performs photoelectric conversion of the light signal and converts it into an image visible to the naked eye of the user.
- the output image data is transferred to the system library in Figure 2 in the form of a data stream.
- the three-dimensional graphics processing library and the image processing library implement drawing, image rendering, synthesis and layer processing, etc., to generate the display layer; the surface manager will The display layer undergoes fusion processing, etc., and is passed to the content provider, window manager and view system of the application framework layer to control the display of the display interface. Finally, the preview image is displayed on the image preview area of the camera application or on the display screen of the electronic device.
- FIG. 3 is a schematic diagram of an example of the control structure of the photographing process of an electronic device.
- the control structure 300 includes a lens 310, an image sensor 320, and an image signal processing (ISP) module 330.
- ISP image signal processing
- the lens 310 may correspond to the camera 193 of the electronic device 100 in FIG. 1 for acquiring images.
- the camera 193 may be embodied as one or more different lenses, such as a wide-angle lens, a main lens, a telephoto lens, a time of flight (TOF) lens, etc.
- the form and quantity of the lens are different in the embodiment of the application. Make a limit.
- the electronic device having a telephoto fixed focus lens is taken as an example to introduce the process of the electronic device taking a telephoto photo through the telephoto fixed focus lens.
- the image sensor 320 is a semiconductor chip with hundreds of thousands to millions of photodiodes on its surface. When irradiated by light, charges are generated, which are converted into digital signals by an analog-to-digital converter.
- the image sensor 320 may be a charge coupled device (CCD) or a complementary metal-oxide conductor device (CMOS).
- CMOS complementary metal-oxide conductor device
- the CCD can be made of a high-sensitivity semiconductor material, which can convert light into electric charge, which is converted into a digital signal through an analog-to-digital converter chip.
- CCD is composed of many photosensitive units, usually in megapixels. When the surface of the CCD is illuminated by light, each photosensitive unit will reflect the charge on the component.
- the image sensor 320 may also be referred to as a photosensitive chip, a photosensitive element, or the like.
- the ISP module 330 can process the original image, optimize the digital image signal through a series of complex mathematical algorithm operations, and finally transmit the processed signal to the display of the electronic device to display the image.
- the ISP module 330 may be correspondingly embodied as a separate image processing chip or a digital signal processing chip (digital signal processing, DSP), or the ISP module 330 may be a functional module corresponding to the processor 110 in the electronic device 100 shown in FIG. 1 , May include a logic part and a firmware program (firmware) running on it, so that the data obtained by the photosensitive chip can be promptly and quickly transferred to the processor 110 and the photosensitive chip is refreshed.
- the ISP module 330 can also perform algorithm optimization on the noise, brightness, and skin color of the image, and optimize parameters such as exposure parameters and color temperature of the shooting scene.
- the user turns on the camera, and the light is transmitted to the image sensor 320 through the lens 310.
- the lens 310 can project the ambient light signal to the photosensitive area of the image sensor 320, and the image sensor 320 undergoes photoelectric conversion. Transform into an image visible to the naked eye.
- the internal original image (Bayer format) is sent to the ISP module 330, and the ISP module 330 is processed by the algorithm, and outputs the image in the RGB space domain to the back-end acquisition unit, which is displayed on the image preview area of the electronic device or on the display screen of the electronic device.
- the processor 110 controls the lens 310, the image sensor 320, and the ISP module 330 through the firmware running on it, thereby completing the image preview or shooting function.
- FIG. 4 is a schematic diagram of a graphical user interface (GUI) of an example of a telephoto shooting process provided by an embodiment of this application.
- GUI graphical user interface
- This application will take a mobile phone as an example to introduce the telephoto shooting method provided by this application in detail.
- (a) in FIG. 4 shows that in the unlocking mode of the mobile phone, the screen display system of the mobile phone displays the currently output interface content 401, which is the main interface of the mobile phone.
- the interface content 401 displays a variety of applications (applications, App), such as photo albums, settings, music, cameras and other applications. It should be understood that the interface content 401 may also include other more application programs, which is not limited in this application.
- the preview interface 402 of the camera application includes multiple menu areas, and each area includes different keys, which are respectively used to implement various functions of the camera application.
- the top menu area 10 the magnification adjustment area 30, the image preview display area 40, and the bottom menu area of the preview interface 402 of the camera application
- the embodiment does not limit the division method of each area. Each area may be adjacent, separated or overlapped. For the convenience of description, in the embodiment of the present application, the area of the preview interface 402 of the camera application is divided as shown in the dashed box.
- the top menu area 10 includes multiple buttons to meet different shooting needs of users, such as flash buttons, artificial intelligence (AI) recognition buttons, etc., which will not be described here.
- the magnification adjustment area 30 is used to display the magnification of the shooting process, and the user can slide the magnification adjustment area 30 to enlarge the magnification, so as to change the preview image for the shooting target by changing the shooting focal length of the mobile phone or the shooting lens.
- “1 ⁇ ” displayed in Figure (b) means that the preview image is obtained through the main lens of the mobile phone (such as the focal length of 27 mm), and “0.7 ⁇ ” means the preview image is obtained through the wide-angle lens of the mobile phone (such as the focal length of 16 mm), and “5 ⁇ ”Means the preview image is obtained through the mobile phone telephoto lens (such as the focal length of 125 mm), and the magnification above “5 ⁇ ” (such as “25 ⁇ ”, “50 ⁇ ”, etc.) means that the image is currently acquired through digital zoom.
- the preview image acquired by "50 ⁇ ” is obtained from the center of the original framing screen of the telephoto lens of the mobile phone at the time of "5 ⁇ ” after cropping and other processing. Therefore, the framing area of "50 ⁇ ” is only It occupies 1/100 of the viewing area of the mobile phone at "5 ⁇ ".
- the preview interface 402 of the camera application currently displays the view at the time of "1 ⁇ ”: a mountain full of trees.
- the magnification of this mountain is “1 ⁇ ”
- the user's desired shooting target is the tree at the top of the mountain in the dashed frame 20 in Figure (b)
- the magnification of the camera can be adjusted.
- the camera setting interface 403 includes multiple menu bars, such as resolution, geographic location, automatic watermarking, AI photographer, reference line, shooting sound, smiling face capture, etc.
- the embodiment of the application no longer has the functions of the above menus. Go into details.
- the user on the camera setting interface 403, the user is provided with a switch for starting the telephoto shooting function—an auxiliary viewfinder screen.
- a switch for starting the telephoto shooting function—an auxiliary viewfinder screen the user clicks on the auxiliary framing screen switch, so that the auxiliary framing screen switch is turned on ("ON"), that is, the telephoto shooting provided by the embodiment of the application is turned on Function.
- the preview interface 404 also displays an auxiliary viewfinder 50.
- the auxiliary framing frame 50 can display the framing screen at "5 ⁇ " in Figure (b), and the image preview display area 40 displays the magnified framing screen.
- the embodiments of this application describe the process of telephoto shooting with a mobile phone.
- the camera enters the telephoto shooting mode, that is, the view is taken through the telephoto lens, and as the magnification increases, the camera enters the telephoto shooting mode.
- the view is taken through the telephoto lens, and the preview images of different magnifications are obtained by cropping the viewfinder screen of the telephoto lens at the time of "5 ⁇ ".
- the examples of this application take photos below "5 ⁇ ”
- the process and the process of obtaining the preview image are not limited.
- the auxiliary viewfinder frame 50 is continuously displayed on the preview interface 404. If the user has not adjusted the magnification at this time, that is, when the magnification is “1 ⁇ ”, the auxiliary viewing frame 50 may have the same preview image as the image preview display area 40, and both are the viewing images obtained by the main lens of the mobile phone.
- the auxiliary framing frame 50 can display the framing screen at “5 ⁇ ” in Figure (b), and the image preview display area 40 displays the zoomed in to “50 ⁇ ” Viewfinder screen.
- the auxiliary viewfinder 50 when the user turns on the auxiliary viewfinder image switch, the user returns to the preview interface 404 of the camera application, and the auxiliary viewfinder frame 50 is not displayed.
- the auxiliary viewfinder 50 is displayed on the preview interface 404.
- the user turns on the auxiliary viewfinder screen switch and returns to the preview interface 404 of the camera application.
- the current magnification is "1 ⁇ "
- the auxiliary viewfinder 50 is not displayed.
- the auxiliary framing frame 50 is displayed on the preview interface 404, and the auxiliary framing frame 50 can display the framing screen at "5 ⁇ ” in Figure (b), and the image preview display area 40 displays the framing after zooming in "10 ⁇ ” Picture.
- the switch for starting the telephoto shooting function can be set at any position of the preview interface 402 of the camera application.
- the auxiliary viewfinder screen switch is set in the top menu area 10, and the user does not need to use the setting button to turn on the telephoto shooting function, and directly click the auxiliary viewfinder screen switch in the top menu area 10 to start the telephoto shooting function and display the auxiliary viewfinder Box 50.
- the telephoto shooting function is turned on by default, and the auxiliary viewfinder screen switch is not set in the camera application.
- the preview interface 402 of the camera application is automatically displayed Auxiliary viewfinder 50.
- the preset condition may be to determine that the user's sliding adjustment magnification is greater than or equal to "10 ⁇ ".
- the auxiliary finder frame 50 is automatically displayed on the preview interface 402 of the camera application, and the auxiliary finder frame 50 can display "5 ⁇ ” in Figure (b)
- the framing screen of the image preview display area 40 displays the framing screen enlarged by "10 ⁇ ".
- the auxiliary viewfinder frame 50 provided can be used to display the viewfinder image obtained by the telephoto lens of the mobile phone when displayed at "5 ⁇ ", and the image preview display area 40 displays that the user has adjusted the magnification Viewfinder screen.
- this method can simultaneously display two view frames in the image preview display area 40, and the two view frames provide the user with preview images with two different angles of view.
- this method can make it easier for users to find the subject of the shooting target during telephoto shooting.
- the auxiliary viewfinder frame 50 includes a close button 70
- the close button 70 may be located at the upper right corner of the auxiliary viewfinder frame 50, and the user can close the auxiliary viewfinder frame 50 by clicking the close button 70.
- the close button 70 For example, when the user finishes focusing and the subject of the shooting target is displayed in the image preview area 40, the user can close the auxiliary viewfinder 50, so that a larger image preview area can be provided to facilitate the user to view the image preview screen.
- the auxiliary framing frame 50 includes a target area, and the target area is used to determine the shooting target subject.
- the auxiliary framing frame 50 includes a target area 60, and the target area 60 is used to determine the “trees at the top of the mountain” that the user desires to photograph.
- the user adjusts the magnification to "50 ⁇ " for telephoto shooting, the user moves the "trees at the top of the mountain” to the target area 60 through the auxiliary viewfinder 50, adjusts the angle of the mobile phone lens, etc., and the image preview can be guaranteed
- the display area 40 presents the trees at the top of the mountain peak enlarged by "50 ⁇ ".
- the FOV of the phone When zooming in to “50 ⁇ ” for shooting, the FOV of the phone is only about 0.5 degrees. If the user rotates the phone slightly or the camera shakes, the shake and offset will be magnified, resulting in a large shift in the image preview display area 40. The shooting target will shift to the image preview display area 40, and it is difficult for the user to retrieve the shooting target at this time.
- the auxiliary framing frame 50 Through the auxiliary framing frame 50, the user can intuitively understand and adjust the manual adjustment of the angle of the mobile phone lens, etc., and move the "trees at the top of the mountain" to the target area 60, so that the shooting target "trees at the top of the mountain” is displayed Go to the image preview display area 40. This method can therefore improve operating efficiency.
- the auxiliary viewfinder frame 50 is displayed in the lower left corner of the image preview display area 40 of the mobile phone, and the target area 60 may be located at the center of the auxiliary viewfinder frame 50.
- the auxiliary finder frame 50 is located at the lower left corner of the image preview display area 40, and the target area 60 may be located in the central area of the auxiliary finder frame 50.
- This display mode can reduce the The auxiliary framing frame 50 blocks the image preview display area 40 and facilitates the user to quickly display “trees at the top of the mountain” in the image preview display area 40.
- FIG. 5 is a schematic diagram of a graphical user interface of another example of a telephoto shooting process provided by an embodiment of the present application.
- the user clicks on any position of the auxiliary finder frame 50 and drags the auxiliary finder frame 50 in the upward direction.
- the auxiliary finder frame 50 moves from the lower left corner to the upper left corner of the image preview display area 40.
- the user can move the auxiliary framing frame 50 to minimize the occlusion of the auxiliary framing frame 50 on the image displayed in the image preview display area 40.
- the position of the target area 60 in the auxiliary framing frame 50 can also be moved according to a user's operation.
- the shooting angle of the mobile phone has been fixed, so that in the auxiliary framing frame 50, the shooting target is not located in the central area of the auxiliary framing frame 50, as shown in (b) in Figure 5 )
- the shooting target "trees at the top of the mountain peak" is displayed in the lower left area of the auxiliary viewfinder frame 50.
- the preview image of the trees at the top of the mountain peak cannot be displayed in the image preview display area 40.
- the user can perform the operation shown in Figure 5 (b), click any position of the target area 60, and drag the target area 60 to the lower left, so that the target area 60 includes the shooting target—" Trees on the top of the mountain”.
- the position of the target area 60 in the auxiliary framing frame 50 changes.
- the image preview display area 40 may display the trees at the top of the mountain that the user desires to photograph .
- the picture of the image preview area 40 is obtained after the picture of the target area 60 in the auxiliary finder frame 50 is enlarged.
- the auxiliary view frame 50 is the picture captured by the mobile phone's telephoto lens at “5 ⁇ ”, and the camera application can determine The position of the target area 60 in the auxiliary framing frame 50 and the range of the frame enclosed by the target area 60 are determined, and then the image data contained in the frame of the target area 60 is determined.
- the camera application runs on the processor 110, that is, the processor 110 can learn the location information of the target area 60, the processor 110 sends the location information of the target area 60 to the ISP module 330, and the ISP module 330 determines the location of the target area 60.
- the information performs cropping processing on the picture acquired by the telephoto lens at "5 ⁇ " to obtain a new picture, and the image data after the processing of magnifying the new picture by “50 ⁇ ” is passed to the HAL and application layer for display. Finally, the screen of the target area 60 is displayed in the image preview area 40 after being enlarged "50 ⁇ ".
- the camera application can determine the position information of the target area 60, and further enlarge the image included in the target area 60, etc. I won't repeat it here.
- the user can move the target area 60 in the auxiliary framing frame 50 according to the current shooting requirements, so that the user can quickly find the shooting target without changing the shooting angle of the mobile phone or the lens.
- Fig. 6 is an implementation flowchart of an example of a telephoto shooting process provided by an embodiment of the present application.
- the implementation process of the method 600 can be divided into three stages, including: an ISP processing stage, a HAL processing stage, and an application layer processing stage.
- the ISP processing stage can be executed by the ISP module 330;
- the HAL processing stage and the application layer processing stage are executed by the processor 110, and the HAL processing stage corresponds to the execution of the HAL.
- the application layer processing stage is executed at the application layer.
- the method 600 includes:
- the phone enters the telephoto shooting mode.
- the “telephoto shooting mode” introduced in the embodiments of the present application can be understood as the current preview image obtained by the telephoto lens of the mobile phone.
- the original framing screen can be obtained through the telephoto lens of the mobile phone, and the preview image at “25 ⁇ ”, “50 ⁇ ” and other magnifications is obtained from “5 ⁇ ” Start from the center of the original framing screen, which is obtained after cropping and other processing.
- the telephoto shooting mode is not limited to the mode in which the mobile phone is in the camera mode or the video recording mode.
- the phone can take photos or record videos in telephoto shooting mode.
- the user can adjust the magnification of area 30 by sliding the mobile phone magnification.
- the magnification reaches the first threshold (for example, "5 ⁇ "
- the mobile phone automatically switches the lens into the telephoto shooting mode through the telephoto lens of the mobile phone. Get the original viewfinder screen.
- the telephoto shooting mode can also be entered through other methods such as user settings, which is not limited in the embodiment of the present application.
- the ISP module 330 of the mobile phone generates a first preview stream.
- the first preview stream can also be referred to as the "original preview stream", which can be understood as: the optical signal acquired by the lens 310 is sequentially processed by the image sensor 320 and the ISP module 330, and the output Image data in the RGB space domain.
- the image data can be sent to the display unit of the mobile phone in the form of a data stream.
- the surface manager and the 3D graphics processing library in the system library of Figure 2 are sent to complete image rendering, synthesis and layer processing, etc., and finally display In the image preview area of the mobile phone display.
- the ISP module 330 of the mobile phone determines whether it is currently digital zoom.
- digital zoom can be understood as: if in the telephoto shooting process, the image acquired is the original viewfinder image acquired at a certain magnification (for example, "5 ⁇ ") after cropping, etc. The result is a digital zoom process.
- the ISP module 330 of the mobile phone determines whether the current digital zoom is based on whether the current magnification is greater than or equal to the second threshold.
- the second threshold may be equal to or greater than the aforementioned first threshold.
- the process of acquiring an image is a digital zoom process.
- the second threshold is greater than the first threshold, for example, the first threshold is “5 ⁇ ” and the second threshold is "10 ⁇ ", that is, when the magnification is "5 ⁇ ", the mobile phone enters telephoto shooting Mode; when the magnification is "10 ⁇ ", the process of acquiring the image is a digital zoom process.
- the ISP module 330 of the mobile phone determines that the current method of acquiring the image is not a digital zoom process, it uses the first preview stream as the main preview stream, and only sends the first preview stream to the HAL.
- main preview stream here can be understood as: image data used to be displayed in the RGB spatial domain of the image preview display area 40; in the same way, in this application, it may be displayed in the RGB spatial domain of the auxiliary viewfinder 50
- the image data is called "auxiliary preview stream”.
- the above steps 601-604 are the ISP processing procedures completed by the ISP module 330, and the ISP module 330 sends the first preview stream to the HAL.
- 605 Perform image quality optimization processing on the main preview stream in the HAL of the mobile phone, and send the processed main preview stream image data to the application layer.
- the HAL of the mobile phone can receive the control of the processor 110 to superimpose the image quality optimization algorithm for the first preview stream, such as color temperature adjustment, noise reduction processing, smoothing processing, white balance processing, and other optimization algorithms.
- the types and methods are not limited.
- the camera application on the application layer of the mobile phone draws a main preview screen in the image preview area.
- step 606 can be understood as that the ISP module 330 of the mobile phone transfers the image data of the first preview stream to the processor 110, and the processor 110 controls the surface manager, the 3D graphics processing library, etc. in the system library of FIG. Image rendering, synthesis and layer processing, etc., are finally displayed in the image preview area of the mobile phone display.
- the camera application of the mobile phone determines whether the auxiliary preview condition is currently met.
- the judgment process of this step 607 can reach a consensus between the ISP module 330 and the application layer.
- the ISP module 330 determines that the current method of acquiring images is not a digital zoom process and does not meet the auxiliary preview conditions.
- the camera application may determine that the auxiliary preview conditions are not satisfied, or, When the ISP module 330 determines that the current method of acquiring the image is not a digital zoom process, the camera application may not perform this step 607.
- the above steps 601-604 are the ISP processing flow completed by the ISP module 330.
- the first preview stream is processed in the HAL processing stage of step 605, and the main preview screen is displayed in the image preview area through step 606, it can be used in the camera application
- the image preview area outputs the image visible to the user.
- the above is the entire process for the mobile phone to display the "1 ⁇ " preview screen normally.
- step 603 If in step 603, the ISP module 330 of the mobile phone determines that the current method of acquiring the image is a digital zoom process, the ISP module 330 generates a second preview stream.
- the ISP module 330 of the mobile phone determines that the current method of acquiring the image is a digital zoom process.
- the second preview stream can also be called “digital zoom preview stream”, which can be understood as: starting from the center of the original framing screen obtained at "5 ⁇ ", the framing screen after cropping and other processing, corresponding to the output RGB space
- the image data of the domain For example, when the user adjusts the magnification to "10 ⁇ " for shooting, the viewfinder screen when the second preview stream is "10 ⁇ " corresponds to the output image data.
- the ISP module 330 of the mobile phone determines whether the auxiliary preview condition is currently met.
- the judgment process of this step 610 may be: whether the user has turned on the auxiliary viewfinder picture switch.
- the user manually turns on the auxiliary viewfinder screen switch, that is, the ISP module 330 of the mobile phone determines that the auxiliary preview condition is currently satisfied.
- the telephoto shooting function is turned on by default, and the user may not manually turn on the auxiliary viewfinder screen switch.
- the ISP module 330 of the mobile phone determines that the auxiliary preview condition is currently satisfied.
- the preset condition may be that the mobile phone determines that the current shooting magnification is greater than or equal to "10 ⁇ ".
- the mobile phone's ISP module 330 determines that the auxiliary preview condition is currently met, and the auxiliary viewfinder frame 50 is automatically displayed on the preview interface of the camera application, and proceeds to step 612; otherwise, the mobile phone's ISP module 330 determines that the auxiliary preview is not currently satisfied Conditionally, the auxiliary viewfinder frame 50 is automatically displayed on the preview interface of the camera application, and the process goes to step 611.
- the ISP module 330 of the mobile phone uses the second preview stream as the main preview stream, and only sends the second preview stream.
- step 605 step 606, and step 608 only use the second preview stream as the main preview stream, and perform image quality optimization processing on the second preview stream through HAL, and the processed data stream is sent to the display unit of the mobile phone.
- the main preview screen is displayed in the image preview area of the camera application on the application layer of the mobile phone, that is, an image visible to the user is output in the image preview area of the camera application. Take the user shooting at the magnification "9 ⁇ " as an example. The above is the entire process of the enlarged preview screen when the mobile phone displays "9 ⁇ ".
- the ISP module 330 of the mobile phone uses the second preview stream as the main preview stream and the first preview stream as the auxiliary preview stream, and at the same time sends the first preview stream and The second preview stream.
- the ISP module 330 of the mobile phone determines that the auxiliary preview condition is currently satisfied; as shown in (d) in FIG. 4, or in (a) and (b) in FIG. 5, the auxiliary viewfinder frame 50 can be automatically displayed on the preview interface of the camera application.
- the image preview display area 40 of the mobile phone displays the enlarged preview picture at "50×";
- the preview picture corresponds to the image data contained in the "second preview stream" (also called the "main preview stream");
- the auxiliary viewfinder frame 50 of the mobile phone displays the original framing picture at "5×";
- the original framing picture corresponds to the image data contained in the "first preview stream" (also called the "auxiliary preview stream").
- the ISP module 330 of the mobile phone sends the first preview stream and the second preview stream to the HAL at the same time, that is, the image data of the dual preview stream is reported.
- the image data of the two preview streams may be reported simultaneously. It should be understood that the embodiment of the present application does not limit the reporting process of the image data of the dual preview streams.
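The decision flow of steps 610 through 612 (which streams the ISP module reports, and when the auxiliary stream is added) can be sketched as follows. This is a minimal illustration only: the function name, the stream labels, and the threshold values are assumptions for demonstration, not the patent's actual implementation.

```python
def report_streams(magnification: float,
                   aux_enabled: bool,
                   zoom_threshold: float = 5.0,
                   aux_threshold: float = 10.0) -> dict:
    """Return the preview streams the ISP module would send to the HAL.

    Below the digital-zoom threshold, only the first (original) preview
    stream is sent as the main stream. In digital zoom, the second
    (cropped) stream becomes the main stream, and the first stream is
    additionally reported as the auxiliary stream only when the
    auxiliary preview condition is met.
    """
    if magnification < zoom_threshold:
        return {"main": "first_stream"}
    if aux_enabled and magnification >= aux_threshold:
        return {"main": "second_stream", "aux": "first_stream"}
    return {"main": "second_stream"}
```

For example, at "9×" only the main (second) stream is reported, matching step 611, while at "50×" with the auxiliary framing picture enabled both streams are reported, matching step 612.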
- step 613: perform image quality optimization processing on the first preview stream and the second preview stream in the HAL of the mobile phone. For this process, reference may be made to the related description of step 605, which is not repeated here.
- the camera application on the application layer of the mobile phone draws a main preview screen in the image preview area.
- the camera application of the mobile phone determines whether the auxiliary preview condition is currently met.
- the judgment process of this step 607 can refer to the judgment process of the ISP module 330 in step 610.
- the judgment process can reach a consensus between the ISP module 330 and the application layer, which is shown by a dashed box in FIG. 6. It should be understood that the result of step 607 may be consistent with that of step 610, in which case the camera application may skip executing step 607.
- if the camera application determines that the auxiliary preview condition is currently satisfied, then through step 606 the processed image data of the second preview stream is sent to the application layer and the main preview picture is drawn, that is, the enlarged preview picture at "50×" is displayed in the image preview display area 40; at the same time, through step 614, the processed image data of the first preview stream is sent to the application layer and the auxiliary preview picture is drawn, that is, the original framing picture at "5×" is displayed in the auxiliary viewfinder frame 50.
- the camera application may instead determine that the auxiliary preview condition is not currently met, and thus, through step 606, send the processed image data of the second preview stream to the application layer and draw the main preview picture, that is, the enlarged preview picture at "50×" in the image preview display area 40, without displaying the auxiliary viewfinder frame 50.
- in this way, the main preview picture enlarged to "50×" can be displayed in the image preview area, while the auxiliary viewfinder frame 50 simultaneously displays the original framing picture at "5×".
- the auxiliary framing frame 50 is used to display the original framing picture obtained by the telephoto lens of the mobile phone at "5×" (the entire mountain peak), while the image preview display area 40 displays the framing picture after the user adjusts the magnification to "50×" (the trees at the top of the mountain).
- this method can simultaneously display two view frames in the image preview display area 40, and the two view frames provide the user with preview images with two different angles of view.
- this method makes it easier for the user to find the subject of the shooting target during telephoto shooting.
- the display of the target area 60 in the auxiliary viewfinder frame 50 can be controlled by the camera application.
- the target area 60 can be displayed automatically for the user; by adjusting the angle of the mobile phone's lens and similar operations, the user can move the subject of the shooting target into the target area 60, so that the subject of the shooting target is easier to find.
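As a minimal illustration of this guidance logic, the check of whether the subject's center point already lies inside the target area 60 could be sketched as follows. The function name and the (left, top, right, bottom) rectangle convention are assumptions for demonstration, not the patent's implementation.

```python
def subject_in_target(subject_xy, target_rect) -> bool:
    """Return True if the subject's center point (x, y) lies inside the
    target area 60, given as a (left, top, right, bottom) rectangle in
    auxiliary-viewfinder pixel coordinates."""
    x, y = subject_xy
    left, top, right, bottom = target_rect
    return left <= x <= right and top <= y <= bottom
```

While this check returns False, the user keeps adjusting the lens angle; once it returns True, the subject appears in the magnified image preview area.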
- the position coordinates of the auxiliary framing frame 50 on the display screen can be determined in a variety of ways.
- Figure 7 is a schematic diagram of the coordinates of an example of a mobile phone interface provided by an embodiment of the present application.
- the upper-left corner of the display screen is used as the coordinate origin O0, and the numbers of pixels in the horizontal direction (X-axis direction) and the vertical direction (Y-axis direction) represent the size of the display screen.
- the size of the mobile phone display screen is 640 ⁇ 960, which means that it contains 640 pixels in the X-axis direction and 960 pixels in the Y-axis direction.
- the numbers are the position coordinates of the pixels. It should be understood that the more pixels occupied in a certain direction, the finer the display effect.
- the position coordinates of the auxiliary viewfinder 50 on the display screen may be preset coordinates in the camera application.
- the mobile phone may specify the initial vertex coordinate O, the number x of pixels included in the X-axis direction, and the number y of pixels included in the Y-axis direction for the auxiliary viewfinder frame 50.
- for example, if the initial vertex coordinate is O (0, 320) and the number of pixels included in the X-axis direction is 100, the auxiliary viewfinder frame 50 can be displayed at the corresponding position in the lower-left corner area of the image preview area 40 of the phone's display screen, as shown in (d) of FIG. 4.
- the mobile phone 100 may determine the coordinates and size of the auxiliary viewfinder frame 50 displayed on the image preview area 40 according to the image currently displayed in the image preview area 40 of the mobile phone camera application.
- when the mobile phone detects that the shooting target in the framing picture of the image preview area 40 (the trees at the top of the mountain) is close to the right side of the screen, and the blank area on the left, excluding the shooting target, is larger than that on the right, the mobile phone initially sets the auxiliary viewfinder frame 50 to be displayed in the upper-left corner area, as shown in (a) of FIG. 5.
- this display mode does not block the magnification adjustment area 30 on the right, which is convenient for the user to operate.
- the auxiliary viewfinder frame 50 of the mobile phone can receive a drag operation from the user and move within the image preview display area 40.
- the camera application can then re-determine the new coordinates of the auxiliary viewfinder frame 50 and update its coordinate information, to ensure that the image data included in the first preview stream is accurately displayed in the auxiliary viewfinder frame 50.
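The coordinate update that follows a drag operation could be sketched as below; this is a hedged illustration (the function name, tuple convention, and 640 × 960 screen size are assumptions), showing one plausible way to keep the frame fully on screen while applying the drag offset.

```python
def move_viewfinder(rect, dx: int, dy: int,
                    screen_w: int = 640, screen_h: int = 960):
    """Apply a drag offset (dx, dy) to the auxiliary viewfinder frame 50
    and return its updated (left, top, right, bottom) coordinates,
    clamping so the frame stays entirely within the screen."""
    left, top, right, bottom = rect
    w, h = right - left, bottom - top
    new_left = min(max(0, left + dx), screen_w - w)
    new_top = min(max(0, top + dy), screen_h - h)
    return (new_left, new_top, new_left + w, new_top + h)
```

A drag past the screen edge simply pins the frame at that edge, so the first preview stream always has a valid on-screen destination.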
- the target area 60 in the auxiliary framing frame 50 can also accept the user's drag operation; the changed coordinate information of the target area 60, and the subject of the shooting target delimited by the target area 60, can be acquired by the camera application, so that in the digital zoom process, when cropping and other processing are performed on the acquired original framing picture, the subject of the shooting target in the target area 60 is accurately processed to further generate the second preview stream, which is not repeated here.
- the user can move the target area 60 in the auxiliary framing frame 50 according to the current shooting requirements, so that the user can quickly find the shooting target without changing the shooting angle of the mobile phone or the lens.
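A sketch of how the target area's position could drive the digital-zoom crop described above follows. All names and the specific geometry are illustrative assumptions: the crop window's sides shrink by the ratio of the base magnification to the target magnification, centered on the target area 60 and clamped to the original frame.

```python
def crop_for_zoom(frame_w: int, frame_h: int,
                  center_x: int, center_y: int,
                  base_mag: float = 5.0, target_mag: float = 50.0):
    """Compute the crop rectangle on the original base_mag framing
    picture whose content, once enlarged, yields the target_mag
    preview; the crop is centered on the target area 60 and clamped
    so it stays inside the frame."""
    scale = base_mag / target_mag
    w = round(frame_w * scale)
    h = round(frame_h * scale)
    left = min(max(0, center_x - w // 2), frame_w - w)
    top = min(max(0, center_y - h // 2), frame_h - h)
    return (left, top, left + w, top + h)
```

Dragging the target area 60 only changes (center_x, center_y), which is why the user can re-aim the "50×" preview without changing the shooting angle of the phone or lens.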
- in order to implement the above-mentioned functions, an electronic device includes hardware and/or software modules corresponding to each function.
- the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer-software-driven hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art can use different methods for each specific application, in combination with the embodiments, to implement the described functions, but such implementation should not be considered as going beyond the scope of the present application.
- the electronic device can be divided into functional modules according to the above method examples.
- each functional module can be divided corresponding to each function, such as a detection unit, a processing unit, a display unit, etc., or two or more functions can be integrated into one processing module.
- the above-mentioned integrated modules can be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
- the electronic device provided in this embodiment is used to execute the above-mentioned method of telephoto shooting, and therefore can achieve the same effect as the above-mentioned realization method.
- the electronic device may also include a processing module, a storage module, and a communication module.
- the processing module can be used to control and manage the actions of the electronic device.
- the storage module can be used to support the electronic device to execute the stored program code and data.
- the communication module can be used to support the communication between electronic devices and other devices.
- the processing module may be a processor or a controller. It can implement or execute various exemplary logical blocks, modules, and circuits described in conjunction with the disclosure of this application.
- the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a digital signal processor (DSP) and a microprocessor, and so on.
- the storage module may be a memory.
- the communication module may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, and other devices that interact with other electronic devices.
- the electronic device involved in this embodiment may be a device having the structure shown in FIG. 1.
- This embodiment also provides a computer storage medium.
- the computer storage medium stores computer instructions.
- when the computer instructions run on an electronic device, the electronic device executes the above-mentioned related method steps to implement the telephoto shooting method in the above-mentioned embodiments.
- This embodiment also provides a computer program product, which when the computer program product runs on a computer, causes the computer to execute the above-mentioned related steps, so as to implement the telephoto shooting method in the above-mentioned embodiment.
- the embodiments of the present application also provide a device.
- the device may specifically be a chip, component or module.
- the device may include a processor and a memory connected to each other.
- the memory is used to store computer execution instructions.
- the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the telephoto shooting method in the foregoing method embodiments.
- the electronic device, computer storage medium, computer program product, and chip provided in this embodiment are all used to execute the corresponding method provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding method provided above, which are not repeated here.
- the disclosed device and method may be implemented in other ways.
- the device embodiments described above are merely illustrative.
- the division of modules or units is only a logical function division.
- there may be other division methods; for example, multiple units or components may be combined or integrated into another device, or some features may be ignored or not implemented.
- the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
- the units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or multiple physical units, that is, they may be located in one place, or they may be distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
- the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
- the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
- if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
- based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product. The software product is stored in a storage medium and includes a number of instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods in the embodiments of the present application.
- the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.
Abstract
This application provides a telephoto shooting method and an electronic device. The method can be applied to electronic devices with digital zoom and telephoto shooting capabilities, such as tablets and watches. During telephoto shooting, the method displays two framing pictures at the same time: the auxiliary framing window displays the original framing picture at "5×", while the image preview display area displays the framing picture after the user adjusts the magnification to "50×", providing the user with preview pictures having two different fields of view. During telephoto shooting, based on the original "5×" framing picture shown in the auxiliary framing window, the user can more easily find the subject of the shooting target, improving the telephoto shooting experience.
Description
This application claims priority to Chinese patent application No. 202010077034.6, filed with the Chinese Patent Office on January 23, 2020 and entitled "Telephoto shooting method and electronic device", which is incorporated herein by reference in its entirety.

This application relates to the field of electronic technologies, and in particular, to a telephoto shooting method and an electronic device.

With the continuous development of terminal technologies, the shooting function has become an important feature of terminal devices and a main indicator for evaluating their performance. When users shoot with portable terminal devices such as mobile phones, there is a demand for ultra-telephoto shooting. During ultra-telephoto shooting, the terminal device relies on an installed fixed-focus telephoto lens: the picture captured by that lens is cropped around its center point and the image quality is optimized, obtaining a telephoto framing picture at a larger magnification.

For example, consider the process in which a mobile phone obtains a framing picture magnified 50 times through ultra-telephoto shooting: the 5× framing picture captured by the phone's fixed-focus telephoto lens is cropped, and the 50× framing picture is obtained through digital zoom. Because the axis of the phone's lens is fixed and cannot rotate, the user must point the phone in the direction of the shooting target to find its subject during shooting. However, the area of the framing picture magnified 50 times is only 1/100 of that of the 5× lens; the framing range is equivalent to a 1340 mm focal-length SLR lens, and the field of view (FOV) is only about 0.5 degrees, so the coverage of the framing picture is very small. Ultra-telephoto shooting therefore amplifies jitter, causing the framing picture to shift; once the framing picture shifts over a large range, it is difficult to find the shooting target again.
SUMMARY
This application provides a telephoto shooting method and an electronic device. By displaying two framing pictures at the same time during telephoto shooting, the method makes it easier for the user to find the subject of the shooting target, improving the shooting experience.

According to a first aspect, a telephoto shooting method is provided, applied to an electronic device including a lens. The method includes: displaying a shooting interface of a camera of the electronic device, where the shooting interface includes an image preview area, and the image preview area displays a first preview picture; and detecting a first operation, and in response to the first operation, simultaneously displaying, by the electronic device, the image preview area and an auxiliary preview window on the shooting interface, where the auxiliary preview window displays a second preview picture. The first preview picture and the second preview picture are obtained through the lens; the first preview picture is a framing picture at a first magnification, and the second preview picture is a framing picture at a second magnification; the first preview picture is obtained by cropping the second preview picture; and the first magnification is greater than or equal to the second magnification.

Through the above solution, during telephoto shooting the method displays two framing pictures at the same time: the auxiliary framing window displays the original framing picture at "5×", and the image preview display area displays the framing picture after the user adjusts the magnification to "50×", providing the user with preview pictures having two different fields of view. During telephoto shooting, based on the original "5×" framing picture displayed in the auxiliary framing window, the user can more easily find the subject of the shooting target without changing the shooting angle of the phone or lens, improving the telephoto shooting experience.
With reference to the first aspect, in some implementations of the first aspect, the image preview area and the auxiliary preview window are displayed in any one of the following ways: at least part of the auxiliary preview window overlaps the image preview area; or the auxiliary preview window is displayed at a position outside the image preview area; or the auxiliary preview window is located in the lower-left corner area of the image preview area.

Through the above method, the auxiliary preview window can be displayed in different ways on the interface of the camera application, adapting to different shooting scenarios and improving the user's shooting experience. For example, when the auxiliary preview window does not overlap the image preview area, the user can see a larger preview picture, improving the shooting experience.

With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, the first operation is the user's adjustment of the magnification in the magnification adjustment area, and displaying, by the electronic device in response to the first operation, the auxiliary preview window on the shooting interface includes: when it is detected that the first magnification is greater than or equal to a first threshold, automatically displaying, by the electronic device, the auxiliary preview window on the shooting interface.

Optionally, the user can slide the magnification in the magnification adjustment area 30 of the phone; when the magnification reaches the first threshold (for example "5×"), the phone automatically switches lenses to enter the telephoto shooting mode and obtains the original framing picture through the phone's telephoto lens. Alternatively, the telephoto shooting mode may also be entered in other ways such as user settings, which is not limited in the embodiments of this application.

With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, the first operation is an operation by which the user enables the auxiliary preview window in the camera application.

For example, the camera application may include a switch for starting the telephoto shooting function, namely an auxiliary framing picture switch. If the switch is set in the top menu area, the user can start the telephoto shooting function and display the auxiliary framing frame by directly tapping the switch there; alternatively, the switch may be included in the settings menu, through which the user starts the telephoto shooting function and displays the auxiliary framing frame.

With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, the auxiliary preview window further includes a close button, and after the electronic device displays the auxiliary preview window on the shooting interface, the method further includes: if the electronic device detects the user's operation on the close button, closing, by the electronic device, the auxiliary preview window.

For example, when the user has completed focusing and the subject of the shooting target is displayed in the image preview area, the user can close the auxiliary framing frame, thereby obtaining a larger image preview area that is convenient for viewing the preview picture.

With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, the auxiliary preview window further includes a target area, the image of the first preview picture is obtained by processing the image of the target area, and the target area is a fixed area in the auxiliary preview window, or the target area is an arbitrary area in the auxiliary preview window.

With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, the method further includes: detecting a drag operation on the auxiliary preview window, and in response to the drag operation, moving the auxiliary preview window from a first position to a second position.

Through the above method, based on the image currently displayed in the image preview area, the user can move the auxiliary framing frame to minimize its occlusion of the image displayed in the image preview display area.
According to a second aspect, an electronic device is provided, including: a lens configured to obtain a picture to be shot; one or more processors; a memory; a plurality of applications; and one or more programs, where the one or more programs are stored in the memory and, when executed by the processor, cause the electronic device to perform the following steps: displaying a shooting interface of a camera, where the shooting interface includes an image preview area, and the image preview area displays a first preview picture; and detecting a first operation, and in response to the first operation, simultaneously displaying the image preview area and an auxiliary preview window on the shooting interface, where the auxiliary preview window displays a second preview picture. The first preview picture and the second preview picture are obtained through the lens; the first preview picture is a framing picture at a first magnification, and the second preview picture is a framing picture at a second magnification; the first preview picture is obtained by cropping the second preview picture; and the first magnification is greater than or equal to the second magnification.

With reference to the second aspect, in some implementations of the second aspect, the image preview area and the auxiliary preview window are displayed in any one of the following ways: at least part of the auxiliary preview window overlaps the image preview area; or the auxiliary preview window is displayed at a position outside the image preview area; or the auxiliary preview window is located in the lower-left corner area of the image preview area.

With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, the first operation is the user's adjustment of the magnification in the magnification adjustment area, and when the one or more programs are executed by the processor, the electronic device is caused to perform the following step: when it is detected that the first magnification is greater than or equal to a first threshold, automatically displaying the auxiliary preview window on the shooting interface.

With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, the first operation is an operation by which the user enables the auxiliary preview window in the camera application.

With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, the auxiliary preview window further includes a close button, and when the one or more programs are executed by the processor, the electronic device is caused to perform the following step: if the user's operation on the close button is detected, closing the auxiliary preview window.

With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, the auxiliary preview window further includes a target area, the image of the first preview picture is obtained by processing the image of the target area, and the target area is a fixed area in the auxiliary preview window, or the target area is an arbitrary area in the auxiliary preview window.

With reference to the second aspect and the foregoing implementations, in some implementations of the second aspect, when the one or more programs are executed by the processor, the electronic device is caused to perform the following steps: detecting a drag operation on the auxiliary preview window, and in response to the drag operation, moving the auxiliary preview window from a first position to a second position.

According to a third aspect, this application provides an apparatus included in an electronic device, where the apparatus has functions for implementing the behavior of the electronic device in the foregoing aspects and possible implementations thereof. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the above functions, for example a display module or unit, a detection module or unit, a processing module or unit, and the like.

According to a fourth aspect, this application provides an electronic device, including: a touch display screen, where the touch display screen includes a touch-sensitive surface and a display; a camera; one or more processors; a memory; a plurality of applications; and one or more computer programs, where the one or more computer programs are stored in the memory and include instructions that, when executed by the electronic device, cause the electronic device to perform the telephoto shooting method in any possible implementation of any of the foregoing aspects.

According to a fifth aspect, this application provides an electronic device, including one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and store computer program code, where the computer program code includes computer instructions that, when executed by the one or more processors, cause the electronic device to perform the telephoto shooting method in any possible implementation of any of the foregoing aspects.

According to a sixth aspect, this application provides a computer-readable storage medium including computer instructions that, when run on an electronic device, cause the electronic device to perform the telephoto shooting method in any possible implementation of any of the foregoing aspects.

According to a seventh aspect, this application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the telephoto shooting method in any possible implementation of any of the foregoing aspects.
FIG. 1 is a schematic structural diagram of an electronic device according to an embodiment of this application.

FIG. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of this application.

FIG. 3 is a schematic diagram of a control structure of a photographing process of an electronic device according to an embodiment of this application.

FIG. 4 is a schematic diagram of a graphical user interface of a telephoto shooting process according to an embodiment of this application.

FIG. 5 is a schematic diagram of a graphical user interface of another telephoto shooting process according to an embodiment of this application.

FIG. 6 is an implementation flowchart of a telephoto shooting process according to an embodiment of this application.

FIG. 7 is a schematic diagram of the coordinates of a mobile phone interface according to an embodiment of this application.
The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings. In the description of the embodiments of this application, unless otherwise stated, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of this application, "a plurality of" means two or more.

The terms "first" and "second" below are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the quantity of the indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature, for example the "first preview stream" and "second preview stream" described in the embodiments of this application.

The shooting method provided in the embodiments of this application can be applied to electronic devices capable of implementing a shooting function, such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA). The embodiments of this application do not limit the specific type of the electronic device.

It should be understood that the electronic device is equipped with a lens, for example a telephoto lens, which may be a fixed-focus telephoto lens or a telephoto zoom lens that may appear in the future; this application does not limit the form of the lens.
For example, FIG. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of this application. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.

It can be understood that the structure illustrated in this embodiment of this application does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.

The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices or may be integrated into one or more processors.

The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and instruction execution.

A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the processor 110 may include one or more interfaces, such as an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.

The I2C interface is a bidirectional synchronous serial bus; the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100. The MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193, and includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through the CSI interface to implement the shooting function of the electronic device 100, and the processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.

It can be understood that the interface connection relationships between the modules illustrated in this embodiment of this application are merely schematic and do not constitute a structural limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may also adopt interface connection manners different from those in the foregoing embodiment, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger, which may be a wireless or wired charger. The power management module 141 is configured to connect the battery 142, the charging management module 140, and the processor 110; it receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, the wireless communication module 160, and the like.

The electronic device 100 implements the display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor, and performs mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.

The display screen 194 is configured to display images, videos, and the like, and includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.

The electronic device 100 can implement the shooting function, for example the telephoto shooting process described in the embodiments of this application, through the ISP, the camera 193, the touch sensor, the video codec, the GPU, the display screen 194, the application processor, and the like.

The ISP is configured to process data fed back by the camera 193. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on image noise, brightness, and skin tone, and can optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.

The camera 193 is configured to capture still images or videos. An optical image of an object is generated through the lens and projected onto the photosensitive element, which may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then transmits the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. It should be understood that the description of the embodiments of this application uses images in RGB format as an example; the embodiments of this application do not limit the image format. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.

The digital signal processor is configured to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.

The video codec is configured to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.

The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100. The internal memory 121 may be configured to store computer-executable program code, where the executable program code includes instructions. By running the instructions stored in the internal memory 121, the processor 110 executes various functional applications and data processing of the electronic device 100. The internal memory 121 may include a program storage area and a data storage area.

The electronic device 100 can implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.

The pressure sensor 180A is configured to sense pressure signals and can convert pressure signals into electrical signals. The gyroscope sensor 180B can be used to determine the motion posture of the electronic device 100. The barometric pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a Hall sensor; the electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of a flip cover. The acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in various directions (generally three axes). The distance sensor 180F is used to measure distance; the electronic device 100 can measure distance by infrared or laser, and in some shooting scenarios can use the distance sensor 180F for ranging to achieve fast focusing. The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The ambient light sensor 180L is used to sense ambient light brightness; the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the sensed ambient light brightness, and the ambient light sensor 180L can also be used to automatically adjust the white balance when taking photos. The fingerprint sensor 180H is used to collect fingerprints; the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint call answering, and the like. The temperature sensor 180J is used to detect temperature. The touch sensor 180K is also called a "touch panel"; it may be provided on the display screen 194, and the touch sensor 180K and the display screen 194 together form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect touch operations on or near it. The bone conduction sensor 180M can acquire vibration signals; the audio module 170 can parse out a voice signal based on the vibration signal of the vocal vibrating bone obtained by the bone conduction sensor 180M to implement a voice function.

The buttons 190 include a power button, volume buttons, and the like, and may be mechanical buttons or touch buttons. The electronic device 100 can receive button input and generate key signal input related to user settings and function control of the electronic device 100. The motor 191 can generate vibration prompts and can be used for incoming call vibration prompts and touch vibration feedback; for example, touch operations applied to different applications (such as photographing and audio playback) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. The indicator 192 may be an indicator light, used to indicate charging status and battery changes, and also to indicate messages, missed calls, notifications, and the like. The SIM card interface 195 is used to connect a SIM card.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of this application take an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.

FIG. 2 is a block diagram of the software structure of the electronic device 100 according to an embodiment of this application. The layered architecture divides the software into several layers, each with a clear role and division of labor, and the layers communicate through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the hardware abstraction layer (HAL).

The application layer may include a series of application packages. As shown in FIG. 2, the application packages may include applications such as Camera, Gallery, Music, and Settings.

The application framework layer provides application programming interfaces (API) and a programming framework for applications in the application layer and includes some predefined functions. As shown in FIG. 2, the application framework layer may include a window manager, content providers, a view system, a resource manager, a notification manager, and the like.

The window manager is used to manage window programs; it can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, and so on.

Content providers are used to store and retrieve data and make the data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, the phone book, and the like.

The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and can be used to build applications. A display interface may consist of one or more views; for example, a display interface including an SMS notification icon may include a view for displaying text and a view for displaying a picture.

The resource manager provides applications with various resources, such as localized strings, icons, pictures, layout files, and video files.

The notification manager enables applications to display notification information in the status bar and can be used to convey notification-type messages; the displayed information can disappear automatically after a short stay, for example a message reminder informing the user that a download is complete. The notification manager may also present notifications in the top status bar of the system in the form of charts or scroll-bar text, such as notifications of applications running in the background, or notifications appearing on the screen in the form of a dialog window. For example, text information is prompted in the status bar, or the notification manager may issue a prompt sound, vibrate the electronic device, blink the indicator light, and so on.

The Android runtime includes core libraries and a virtual machine, and is responsible for scheduling and management of the Android system. The core libraries consist of two parts: the function functions that the Java language needs to call, and the core libraries of Android.

The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.

The system libraries may include multiple functional modules, for example a surface manager, media libraries, a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).

The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications.

The media libraries support playback and recording of multiple common audio and video formats, as well as static image files, and can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.

The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.

The HAL may be part of the kernel layer, or the HAL may be an independent layer located between the kernel layer and the system libraries, serving as the layer between hardware and software. The HAL may include hardware driver modules, such as a display driver, a camera driver, and sensor drivers, and the application framework layer can call the hardware driver modules of the HAL.

In the telephoto shooting process described in the embodiments of this application, the user opens the camera application; the camera application of the application layer in FIG. 2 starts and sends an instruction to the HAL to invoke the camera driver, the sensor drivers, and the display driver, so that the electronic device can start the camera or lens to collect images. During image collection by the camera, light is transmitted through the camera to the image sensor, which performs photoelectric conversion on the optical signal and converts it into an image visible to the user's naked eye. The output image data is transmitted in the form of a data stream to the system libraries in FIG. 2; the three-dimensional graphics processing library and image processing library implement drawing, image rendering, composition, and layer processing to generate display layers; the surface manager performs fusion processing on the display layers and passes them to the content providers, window manager, and view system of the application framework layer to control the display of the display interface. Finally, the preview image is displayed in the image preview area of the camera application or on the display screen of the electronic device.
For ease of understanding, the following embodiments of this application take an electronic device with the structures shown in FIG. 1 and FIG. 2 as an example to describe the photographing process of the electronic device.

FIG. 3 is a schematic diagram of a control structure of a photographing process of an electronic device. The control structure 300 includes a lens 310, an image sensor 320, and an image signal processing (ISP) module 330.

The lens 310 may correspond to the camera 193 of the electronic device 100 in FIG. 1 and is used to obtain images. For example, the camera 193 may be embodied as one or more different lenses, such as a wide-angle lens, a main lens, a telephoto lens, or a time-of-flight (TOF) lens; the embodiments of this application do not limit the form and number of lenses. In the embodiments of this application, an electronic device with a fixed-focus telephoto lens is taken as an example to describe the process in which the electronic device takes telephoto photos through the fixed-focus telephoto lens.

The image sensor 320 is a semiconductor chip whose surface contains hundreds of thousands to millions of photodiodes; when irradiated by light, charges are generated and converted into digital signals by an analog-to-digital converter. The image sensor 320 may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS). A CCD can be made of a highly photosensitive semiconductor material that converts light into electric charge, which is converted into digital signals by an analog-to-digital converter chip. A CCD consists of many photosensitive units, usually counted in megapixels. When the CCD surface is illuminated, each photosensitive unit reflects the charge on the component, and the signals generated by all the photosensitive units together form a complete picture, that is, the optical image is converted into a raw image. In some embodiments, the image sensor 320 may also be called a photosensitive chip, a photosensitive element, or the like.

The ISP module 330 can process the raw image, optimizing the digital image signal through a series of complex mathematical algorithm operations, and finally transmit the processed signal to the display of the electronic device, that is, display the image. The ISP module 330 may be embodied as an independent image processing chip or a digital signal processing (DSP) chip, or the ISP module 330 may be a functional module corresponding to the processor 110 of the electronic device 100 shown in FIG. 1; it may include logic and firmware running on it, so as to promptly and quickly transfer the data obtained by the photosensitive chip to the processor 110 and refresh the photosensitive chip. In addition, the ISP module 330 can also perform algorithm optimization on image noise, brightness, and skin tone, and optimize parameters such as exposure and color temperature of the shooting scene.

Specifically, during shooting, the user opens the camera, and light is transmitted through the lens 310 to the image sensor 320. In other words, after the lens 310 projects the ambient light signal onto the photosensitive area of the image sensor 320, the image sensor 320 performs photoelectric conversion and converts the signal into an image visible to the naked eye. The internal raw image (in Bayer format) is then transmitted to the ISP module 330, which processes it algorithmically and outputs an image in the RGB space domain to the back-end collection unit, to be displayed in the image preview area of the electronic device or on the display screen of the electronic device. In this process, the processor 110 performs corresponding control of the lens 310, the image sensor 320, and the ISP module 330 through the firmware running on it, thereby completing the image preview or shooting function.
The telephoto shooting method provided in the embodiments of this application is described in detail below with reference to the accompanying drawings and application scenarios.

FIG. 4 is a schematic diagram of a graphical user interface (GUI) of a telephoto shooting process according to an embodiment of this application; this application takes a mobile phone as an example to describe the telephoto shooting method in detail. Part (a) of FIG. 4 shows, in the unlocked state of the phone, the interface content 401 currently output by the phone's screen display system, which is the home screen of the phone. The interface content 401 displays multiple applications (apps), such as Gallery, Settings, Music, and Camera. It should be understood that the interface content 401 may also include more applications, which is not limited in this application.

As shown in part (a) of FIG. 4, the user taps the icon of the camera application, and in response to the user's tap operation, the phone enters the preview interface 402 of the camera application shown in part (b). The preview interface 402 of the camera application includes multiple menu areas, each containing different buttons used to implement various functions of the camera application. For example, part (b) of FIG. 4 shows the top menu area 10, the magnification adjustment area 30, the image preview display area 40, and the bottom menu area of the preview interface 402. It should be understood that the embodiments of this application do not limit how the areas are divided; the areas may be adjacent, separate, or overlapping. For convenience of description, the areas of the preview interface 402 are divided as shown by the dashed boxes in the figure.

The top menu area 10 includes multiple buttons, such as a flash button and an artificial intelligence (AI) recognition button, to meet different shooting needs of the user, which are not described in detail here. The magnification adjustment area 30 is used to display the magnification during shooting; the user can slide the magnification in this area, thereby changing the preview image of the shooting target by changing the shooting focal length or the shooting lens of the phone.

For example, "1×" in part (b) indicates that the preview image is obtained through the phone's main lens (for example, with a focal length of 27 mm), "0.7×" indicates that the preview image is obtained through the phone's wide-angle lens (for example, with a focal length of 16 mm), "5×" indicates that the preview image is obtained through the phone's telephoto lens (for example, with a focal length of 125 mm), and magnifications above "5×" (such as "25×" and "50×") indicate that the image is currently obtained by digital zoom. For example, the preview image at "50×" is obtained by cropping and other processing starting from the center of the original framing picture of the phone's telephoto lens at "5×"; therefore, the framing area at "50×" is only 1/100 of the phone's framing area at "5×".
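The magnification-to-lens mapping in the example above can be sketched as follows. The thresholds and focal lengths are the example values from this paragraph only; real devices differ, and the function itself is an illustrative assumption rather than the patent's implementation.

```python
def select_lens(magnification: float) -> str:
    """Return which physical lens supplies the raw framing picture for a
    given magnification, following the example values in this paragraph:
    "0.7x" wide-angle, "1x" main, "5x" and above telephoto (beyond "5x"
    the telephoto frame is cropped by digital zoom)."""
    if magnification < 1.0:
        return "wide_angle"   # e.g. 16 mm equivalent focal length
    if magnification < 5.0:
        return "main"         # e.g. 27 mm equivalent focal length
    return "telephoto"        # e.g. 125 mm equivalent focal length
```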
As shown in part (b) of FIG. 4, the preview interface 402 of the camera application currently displays the view at "1×": a mountain peak covered with trees. At a magnification of "1×", the entire peak can be seen. When the user's desired shooting target is the trees at the top of the peak inside the dashed box 20 in part (b), the magnification of the camera can be adjusted. The user taps the settings icon in the top menu area 10 of the preview interface 402, and in response to the user's tap operation, the phone enters the camera settings interface 403 shown in part (c). The camera settings interface 403 includes multiple menu bars, such as resolution, geographic location, automatic watermarking, AI photography master, reference lines, shutter sound, and smile capture; the functions of these menus are not described in detail in the embodiments of this application.

In a possible implementation, the embodiments of this application provide, on the camera settings interface 403, a switch for starting the telephoto shooting function: the auxiliary framing picture switch. For example, as shown in part (c) of FIG. 4, the user taps the auxiliary framing picture switch to set it to the on ("ON") state, thereby enabling the telephoto shooting function provided in the embodiments of this application. After the user enables the telephoto shooting function and returns to the preview interface of the camera application, if the user adjusts and increases the magnification, as shown in part (d) of FIG. 4, the preview interface 404 also displays the auxiliary framing frame 50, which can display the framing picture at "5×" in part (b), while the image preview display area 40 displays the enlarged framing picture.

It should be understood that the embodiments of this application describe the telephoto shooting process of a mobile phone. When the magnification is "5×", the camera enters the telephoto shooting mode, that is, framing is done through the telephoto lens, and as the magnification increases, framing continues through that telephoto lens; preview images at different magnifications are obtained by cropping and other processing of the framing picture of the telephoto lens at "5×". The embodiments of this application do not limit the shooting process below "5×" or the process of obtaining preview images there.

Optionally, after the user turns on the auxiliary framing picture switch and returns to the preview interface 404 of the camera application, the auxiliary framing frame 50 is continuously displayed on the preview interface 404. If the user has not yet adjusted the magnification, that is, the magnification is "1×", the auxiliary framing frame 50 may show the same preview image as the image preview display area 40, both being the framing picture obtained by the phone's main lens. After the user adjusts and increases the magnification to "50×", the auxiliary framing frame 50 can display the framing picture at "5×" in part (b), and the image preview display area 40 displays the framing picture enlarged to "50×".

Optionally, after the user turns on the auxiliary framing picture switch and returns to the preview interface 404 of the camera application, the auxiliary framing frame 50 is not displayed. Only when the user adjusts and increases the magnification so that it is greater than or equal to a certain threshold (for example "10×") is the auxiliary framing frame 50 displayed on the preview interface 404. For example, with the switch turned on and the current magnification at "1×", the auxiliary framing frame 50 is not displayed; when the user slides the magnification to "10×", the auxiliary framing frame 50 is displayed on the preview interface 404 and can show the framing picture at "5×" in part (b), while the image preview display area 40 displays the framing picture enlarged to "10×".

In another possible implementation, the switch for starting the telephoto shooting function, namely the auxiliary framing picture switch, can be set at any position on the preview interface 402 of the camera application. For example, if the auxiliary framing picture switch is set in the top menu area 10, the user does not need to enable the telephoto shooting function through the settings button; directly tapping the auxiliary framing picture switch in the top menu area 10 starts the telephoto shooting function and displays the auxiliary framing frame 50.

In another possible implementation, the telephoto shooting function is enabled by default and no auxiliary framing picture switch is set in the camera application; when it is determined that the current shooting satisfies a preset condition, the auxiliary framing frame 50 is automatically displayed on the preview interface 402 of the camera application.

Optionally, the preset condition may be determining that the magnification the user slides to is greater than or equal to "10×". For example, when the user slides the magnification to "10×", the auxiliary framing frame 50 is automatically displayed on the preview interface 402 of the camera application and can show the framing picture at "5×" in part (b), while the image preview display area 40 displays the framing picture enlarged to "10×".

It should be understood that, in the embodiments of this application, the auxiliary framing frame 50 can be used to display the framing picture obtained by the phone's telephoto lens at "5×", while the image preview display area 40 displays the framing picture after the user adjusts the magnification. In telephoto shooting, this method can display two framing frames simultaneously in the image preview display area 40, and the two framing frames provide the user with preview pictures having two different fields of view. For an electronic device with a fixed-focus telephoto lens whose axis cannot rotate, this method allows the user to find the subject of the shooting target more easily during telephoto shooting.
In a possible implementation, the auxiliary framing frame 50 includes a close button 70, which may be located at the upper-right corner of the auxiliary framing frame 50; the user can close the auxiliary framing frame 50 by tapping the close button 70. For example, when the user has completed focusing and the subject of the shooting target is displayed in the image preview area 40, the user can close the auxiliary framing frame 50 to obtain a larger image preview area that is convenient for viewing the preview picture.

In a possible implementation, the auxiliary framing frame 50 includes a target area used to determine the subject of the shooting target. For example, as shown in part (d) of FIG. 4, the auxiliary framing frame 50 includes the target area 60, which is used to determine the "trees at the top of the peak" that the user desires to shoot. When the user adjusts the magnification to "50×" for telephoto shooting, the user moves the "trees at the top of the peak" into the target area 60 through the auxiliary framing frame 50 by adjusting the angle of the phone's lens and so on, which ensures that the image preview display area 40 presents the trees at the top of the peak enlarged to "50×".

When shooting enlarged at "50×", the FOV of the phone is only about 0.5 degrees; a slight rotation of the phone by the user or shooting jitter will be amplified, causing the picture in the image preview display area 40 to shift greatly, and the shooting target will shift out of the image preview display area 40, at which point it is difficult for the user to find the shooting target again. Through the auxiliary framing frame 50, the user can intuitively understand and perform operations such as manually adjusting the angle of the phone's lens, moving the "trees at the top of the peak" into the target area 60 so that the shooting target "trees at the top of the peak" is displayed in the image preview display area 40. This method can therefore improve operating efficiency.

Optionally, the auxiliary framing frame 50 is displayed at the lower-left corner of the image preview display area 40 of the phone, and the target area 60 may be located at the center of the auxiliary framing frame 50. For example, as shown in part (d) of FIG. 4, the auxiliary framing frame 50 is located at the lower-left corner of the image preview display area 40, and the target area 60 may be located in the central region of the auxiliary framing frame 50; this display manner reduces the occlusion of the image preview display area 40 by the auxiliary framing frame 50 and makes it convenient for the user to quickly bring the "trees at the top of the peak" into the image preview display area 40.
In a possible implementation, the auxiliary framing frame 50 can move within the image preview display area 40 according to the user's operation. FIG. 5 is a schematic diagram of a graphical user interface of another telephoto shooting process according to an embodiment of this application. For example, as shown in part (a) of FIG. 5, the user taps any position of the auxiliary framing frame 50 and drags it upward; in response to the user's drag operation, the auxiliary framing frame 50 moves from the lower-left corner to the upper-left corner of the image preview display area 40.

Through the above method, based on the image currently displayed in the image preview display area 40, the user can move the auxiliary framing frame 50 to minimize its occlusion of the image displayed in the image preview display area 40.

Optionally, the position of the target area 60 in the auxiliary framing frame 50 can also be moved according to the user's operation. For example, when the user fixes the phone with a tripod or similar device, the shooting angle of the phone is already fixed, so that in the auxiliary framing frame 50 the shooting target is not located in the central region of the frame. As shown in part (b) of FIG. 5, in the auxiliary framing frame 50, the shooting target, the "trees at the top of the peak", is shown in the lower-left region of the auxiliary framing frame 50, and at this point the preview image of the trees at the top of the peak cannot be displayed in the image preview display area 40. The user can perform the operation shown in part (b) of FIG. 5: tap any position of the target area 60 and drag the target area 60 toward the lower left, so that the target area 60 includes the shooting target, the "trees at the top of the peak". In response to the user's drag operation, the position of the target area 60 in the auxiliary framing frame 50 changes; when the target area 60 includes the "trees at the top of the peak", the image preview display area 40 can display the trees at the top of the peak that the user desires to shoot.

It should be understood that when the image preview area 40 and the auxiliary framing frame 50 are displayed at the same time, the picture in the image preview area 40 is obtained by enlarging the picture of the target area 60 in the auxiliary framing frame 50. For example, taking the "50×" telephoto shooting shown in part (d) of FIG. 4 as an example, the auxiliary framing frame 50 shows the picture obtained by the phone's telephoto lens at "5×". The camera application can determine the position of the target area 60 in the auxiliary framing frame 50, determine the range of the picture delimited by the target area 60, and thereby determine the image data contained in the picture of the target area 60. The camera application runs on the processor 110, that is, the processor 110 can learn the position information of the target area 60 and sends the position information of the target area 60 to the ISP module 330; the ISP module 330 crops the picture obtained by the telephoto lens at "5×" according to the position information of the target area 60 to obtain a new picture, further enlarges this new picture to "50×", and passes the processed image data to the HAL and the application layer for display; finally, the picture of the target area 60 enlarged to "50×" is displayed in the image preview area 40. Similarly, whether the target area 60 is a fixed area or an area the user can drag manually, the camera application can determine the position information of the target area 60 and further perform enlargement processing on the picture included in the target area 60, which is not repeated here.

Through the above method, the user can move the target area 60 in the auxiliary framing frame 50 according to current shooting needs, making it convenient for the user to quickly find the shooting target without changing the shooting angle of the phone or lens.

The above describes, with reference to FIG. 4 and FIG. 5, various methods for starting the telephoto shooting function and displaying the auxiliary framing frame 50 on the preview interface 404 of the camera application; the internal implementation of telephoto shooting is described below with reference to FIG. 6 and FIG. 7.
FIG. 6 is an implementation flowchart of a telephoto shooting process according to an embodiment of this application. As shown in FIG. 6, the implementation of the method 600 can be divided into three phases: an ISP processing phase, a HAL processing phase, and an application layer processing phase. With reference to the software architecture of FIG. 2 and the control structure of the photographing process of FIG. 3, the ISP processing phase may be executed by the ISP module 330; the HAL processing phase and the application layer processing phase are executed by the processor 110, with the HAL processing phase correspondingly executed in the HAL and the application layer processing phase executed in the application layer. Specifically, the method 600 includes:

601: The mobile phone enters the telephoto shooting mode.

It should be understood that the "telephoto shooting mode" described in the embodiments of this application can be understood as meaning that the current preview image is obtained by the phone's telephoto lens. For example, at magnifications of "5×" and above, the phone can obtain the original framing picture through its telephoto lens; preview images at magnifications such as "25×" and "50×" are obtained by cropping and other processing starting from the center of the original framing picture obtained at "5×".

It should also be understood that in the embodiments of this application, the telephoto shooting mode is not limited to the phone being in photo mode or in video recording mode; in other words, in the telephoto shooting mode the phone can take photos or record videos.

Optionally, the user can slide the magnification in the magnification adjustment area 30 of the phone; when the magnification reaches a first threshold (for example "5×"), the phone automatically switches lenses to enter the telephoto shooting mode and obtains the original framing picture through the phone's telephoto lens. Alternatively, the telephoto shooting mode may also be entered in other ways such as user settings, which is not limited in the embodiments of this application.

602: The ISP module 330 of the phone generates the first preview stream.

With reference to the related descriptions in FIG. 2 and FIG. 3, the first preview stream may also be called the "original preview stream" and can be understood as: the image data in the RGB space domain output after the optical signal obtained by the lens 310 is processed in sequence by the image sensor 320 and the algorithms of the ISP module 330. The image data can be sent, in the form of a data stream, to the display unit of the phone, for example to the surface manager and the three-dimensional graphics processing library in the system libraries of FIG. 2, to complete image rendering, composition, and layer processing, and is finally displayed in the image preview area of the phone's display screen.

603: The ISP module 330 of the phone determines whether digital zoom is currently in use.

It should be understood that in the embodiments of this application, digital zoom can be understood as follows: if, during telephoto shooting, the obtained image results from cropping and other processing of the original framing picture obtained at a certain magnification (for example "5×"), then the process is a digital zoom process.

Optionally, the ISP module 330 of the phone may determine whether digital zoom is currently in use according to whether the current magnification is greater than or equal to a second threshold. It should be understood that the second threshold may be equal to or greater than the aforementioned first threshold. When the second threshold equals the first threshold, for example when both are "5×", the process of obtaining images is a digital zoom process as soon as the phone enters the telephoto shooting mode. When the second threshold is greater than the first threshold, for example when the first threshold is "5×" and the second threshold is "10×", the phone enters the telephoto shooting mode at a magnification of "5×", and the process of obtaining images becomes a digital zoom process at a magnification of "10×".
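The two-threshold logic of steps 601 and 603 can be sketched as follows. The function name and the default threshold values ("5×" and "10×") are illustrative assumptions taken from the examples in this paragraph, not a definitive implementation.

```python
def shooting_state(magnification: float,
                   first_threshold: float = 5.0,
                   second_threshold: float = 10.0) -> dict:
    """Evaluate the two thresholds of steps 601 and 603: reaching the
    first threshold switches the phone into the telephoto shooting
    mode, and reaching the second threshold (which is >= the first)
    marks the start of the digital zoom process."""
    assert second_threshold >= first_threshold
    return {
        "telephoto_mode": magnification >= first_threshold,
        "digital_zoom": magnification >= second_threshold,
    }
```

When the two thresholds are equal (both "5×"), telephoto mode and digital zoom begin together, as the text notes.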
604: When the ISP module 330 of the phone determines that the current image acquisition method is not a digital zoom process, it uses the first preview stream as the main preview stream and sends only the first preview stream to the HAL.
It should be understood that the "main preview stream" here may be understood as the RGB-domain image data used for display in the image preview display area 40; similarly, in this application, the RGB-domain image data displayed in the auxiliary viewfinder frame 50 may be called the "auxiliary preview stream".
Steps 601 to 604 above constitute the ISP processing flow completed by the ISP module 330, and the ISP module 330 sends the first preview stream to the HAL.
605: The HAL of the phone performs image-quality optimization on the main preview stream and sends the processed image data of the main preview stream to the application layer.
Specifically, the HAL of the phone may, under the control of the processor 110, apply image-quality optimization algorithms to the first preview stream, such as color temperature adjustment, noise reduction, smoothing, and white balance; the embodiments of this application do not limit the types or methods of the optimization algorithms.
606: At the application layer of the phone, the camera application draws the main preview picture in the image preview area.
The process of step 606 may be understood as follows: the ISP module 330 of the phone passes the image data of the first preview stream to the processor 110, and the processor 110 directs the surface manager and the 3D graphics processing library in the system library of FIG. 2 to complete image rendering, composition, layer processing, and the like; the picture is finally displayed in the image preview area of the phone's display.
607: The camera application of the phone determines whether the auxiliary preview condition is currently met.
The determination of step 607 may be agreed upon between the ISP module 330 and the application layer: when the ISP module 330 determines that the current image acquisition method is not a digital zoom process and the auxiliary preview condition is not met, the camera application may likewise determine that the auxiliary preview condition is not met; alternatively, when the ISP module 330 determines that the current image acquisition method is not a digital zoom process, the camera application may skip step 607.
608: Drawing of the image preview interface is complete.
As noted above, steps 601 to 604 constitute the ISP processing flow completed by the ISP module 330. In other words, after the first preview stream is processed in the HAL processing phase of step 605 and the main preview picture is displayed in the image preview area through step 606, an image visible to the user can be output in the image preview area of the camera application. The above is the complete process by which the phone normally displays the "1×" preview picture.
609: If, in step 603, the ISP module 330 of the phone determines that the current image acquisition method is a digital zoom process, the ISP module 330 generates a second preview stream.
Optionally, when the current magnification is greater than or equal to the second threshold, for example greater than "5×", the ISP module 330 of the phone determines that the current image acquisition method is a digital zoom process.
The second preview stream may also be called the "digital zoom preview stream" and may be understood as the RGB-domain image data output for the framed picture obtained by cropping, starting from the center, the original framed picture captured at "5×". For example, when the user adjusts the magnification to "10×" for shooting, the second preview stream is the image data output for the framed picture at "10×".
610: The ISP module 330 of the phone determines whether the auxiliary preview condition is currently met.
Optionally, the determination of step 610 may be whether the user has turned on the auxiliary framing picture switch. For example, with the method described in (b) and (c) of FIG. 4, when the user manually turns on the auxiliary framing picture switch, the ISP module 330 of the phone determines that the auxiliary preview condition is currently met.
Alternatively, the telephoto shooting function may be enabled by default, and the user need not turn on the auxiliary framing picture switch manually: when the phone determines that the current shooting meets a preset condition, the ISP module 330 of the phone determines that the auxiliary preview condition is currently met. For example, the preset condition may be that the phone determines that the current shooting magnification is greater than or equal to "10×".
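The two ways the auxiliary preview condition of step 610 can be met (a manual switch, or a preset magnification reached) can be sketched as a single predicate; the function name and the default value are illustrative assumptions:

```python
def auxiliary_preview_met(switch_on, zoom, preset_zoom=10):
    """Step 610 sketch: true if the user turned the auxiliary framing
    switch on, or if the current magnification reaches the preset
    condition (assumed here to be "10x")."""
    return switch_on or zoom >= preset_zoom
```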
When either of the above cases is met, the ISP module 330 of the phone determines that the auxiliary preview condition is currently met, the auxiliary viewfinder frame 50 is automatically displayed on the preview interface of the camera application, and the process proceeds to step 612. Otherwise, the ISP module 330 of the phone determines that the auxiliary preview condition is not currently met, the auxiliary viewfinder frame 50 is not displayed, and the process proceeds to step 611.
611: When the ISP module 330 of the phone determines that the auxiliary preview condition is not currently met, the ISP module 330 uses the second preview stream as the main preview stream and sends only the second preview stream.
Steps 605, 606, and 608 are then performed: only the second preview stream is used as the main preview stream; the HAL performs image-quality optimization on the second preview stream; the processed data stream is sent to the display unit of the phone; and the main preview picture is displayed in the image preview area of the camera application at the application layer, that is, an image visible to the user is output in the image preview area of the camera application. Taking shooting at a magnification of "9×" as an example, the above is the complete process by which the phone displays the magnified preview picture at "9×".
612: When the ISP module 330 of the phone determines that the auxiliary preview condition is currently met, the ISP module 330 uses the second preview stream as the main preview stream and the first preview stream as the auxiliary preview stream, and sends both the first preview stream and the second preview stream to the HAL.
It should be understood that when the ISP module 330 of the phone determines that the auxiliary preview condition is currently met, the auxiliary viewfinder frame 50 may be automatically displayed on the preview interface of the camera application, as shown in (d) of FIG. 4 or in (a) and (b) of FIG. 5. For example, taking shooting at a magnification of "50×", as shown in (d) of FIG. 4, the image preview display area 40 of the phone displays the magnified preview picture at "50×", which corresponds to the image data contained in the "second preview stream" (also called the "main preview stream"); the auxiliary viewfinder frame 50 of the phone displays the original framed picture at "5×", which corresponds to the image data contained in the "first preview stream" (also called the "auxiliary preview stream"). The ISP module 330 of the phone sends the first preview stream and the second preview stream to the HAL at the same time, thereby reporting the image data of two preview streams. The two preview streams may be reported simultaneously; it should be understood that the embodiments of this application do not limit the reporting process of the two preview streams.
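The three reporting cases of steps 604, 611, and 612 can be sketched as one dispatch function; this is an illustration only, and the stream objects here are plain placeholders rather than real image data:

```python
def streams_to_report(first_stream, second_stream, digital_zoom, aux_met):
    """Which preview streams the ISP module reports to the HAL."""
    if not digital_zoom:                      # step 604: original preview only
        return {"main": first_stream}
    if not aux_met:                           # step 611: zoomed preview only
        return {"main": second_stream}
    return {"main": second_stream,            # step 612: dual-stream report
            "auxiliary": first_stream}
```

Only in the third case does the HAL receive two streams, one for the image preview display area 40 and one for the auxiliary viewfinder frame 50.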
613: The HAL of the phone performs image-quality optimization on the first preview stream and the second preview stream. For this process, refer to the related description of step 605 above, which is not repeated here.
606: At the application layer of the phone, the camera application draws the main preview picture in the image preview area.
607: The camera application of the phone determines whether the auxiliary preview condition is currently met.
For the determination of step 607, refer to the determination by the ISP module 330 in step 610. The determination may be agreed upon between the ISP module 330 and the application layer, and it is shown with a dashed box in FIG. 6. It should be understood that step 607 may be kept consistent with step 610, in which case the camera application may skip it.
For example, when the ISP module 330 determines in step 610 that the auxiliary preview condition is currently met, the camera application can also learn that the condition is met. Through step 606, the processed image data of the second preview stream is sent to the application layer to draw the main preview picture, that is, the magnified preview picture at "50×" is displayed in the image preview display area 40; at the same time, through step 614, the processed image data of the first preview stream is sent to the application layer to draw the auxiliary preview picture, that is, the original framed picture at "5×" is displayed in the auxiliary viewfinder frame 50.
Alternatively, when the ISP module 330 determines in step 610 that the auxiliary preview condition is not currently met, the camera application can also learn that the condition is not met. Through step 606, the processed image data of the second preview stream is sent to the application layer to draw the main preview picture, that is, the magnified preview picture at "50×" is displayed in the image preview display area 40, and the auxiliary viewfinder frame 50 is not displayed.
Through the processes of steps 601-603, 609-610, 612-613, 606, 607, 614, and 608 described above, the main preview picture magnified to "50×" can be displayed in the image preview area while the original framed picture at "5×" is displayed in the auxiliary viewfinder frame 50. For example, as shown in (d) of FIG. 4, the auxiliary viewfinder frame 50 displays the original framed picture captured by the telephoto lens of the phone at "5×", the entire mountain, while the image preview display area 40 displays the framed picture after the user adjusts the magnification to "50×", the trees at the top of the mountain. In telephoto shooting, this method can display two viewfinder frames simultaneously in the image preview display area 40, the two frames providing the user with preview pictures at two different fields of view. For an electronic device with a fixed-focal-length telephoto lens whose lens axis cannot be rotated, this method makes it easier for the user to find the subject of the shooting target during telephoto shooting.
The display of the target area 60 in the auxiliary viewfinder frame 50 may be controlled by the camera application. When the phone detects that the subject of the shooting target in the auxiliary viewfinder frame 50 has shifted away from the central area, the target area 60 may be displayed automatically, so that the user can adjust the angle of the phone's lens or take similar action to move the subject of the shooting target into the target area 60, making it easier to find the subject of the shooting target.
In addition, the position coordinates of the auxiliary viewfinder frame 50 on the display can be determined in multiple ways.
FIG. 7 is a schematic diagram of the coordinates of an example phone interface provided by an embodiment of this application. As shown in FIG. 7, for the display of the phone, with the point O0 at the upper-left corner of the display as the coordinate origin, the size of the display is characterized by pixels in the horizontal direction (X-axis direction) and the vertical direction (Y-axis direction). A display size of 640 × 960 means the display contains 640 pixels in the X-axis direction and 960 pixels in the Y-axis direction. In the subsequent descriptions of coordinates, the numbers are the pixel position coordinates of the point. It should be understood that the more pixels occupied in a direction, the finer and more delicate the display effect.
In a possible implementation, the position coordinates of the auxiliary viewfinder frame 50 on the display may be coordinates preset inside the camera application. For example, the phone may specify, for the auxiliary viewfinder frame 50, an initial vertex coordinate O, a number of pixels x in the X-axis direction, and a number of pixels y in the Y-axis direction. For example, with an initial vertex coordinate O(0, 320), 100 pixels in the X-axis direction, and 150 pixels in the Y-axis direction, the auxiliary viewfinder frame 50 can be displayed along the left side of the image preview area 40 of the phone's display, as shown in (d) of FIG. 4.
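Under the FIG. 7 coordinate convention (origin at the top-left corner, 640 × 960 pixels), the preset placement can be sketched as below; the (left, top, right, bottom) rectangle representation and the containment check are assumptions for illustration:

```python
SCREEN_W, SCREEN_H = 640, 960  # pixel grid of FIG. 7

def preset_window_rect(origin_x, origin_y, px_x, px_y):
    """Rectangle (left, top, right, bottom) of auxiliary viewfinder frame 50
    from the preset vertex O and the pixel counts along the X and Y axes."""
    rect = (origin_x, origin_y, origin_x + px_x, origin_y + px_y)
    # The preset must keep the window on screen.
    assert rect[2] <= SCREEN_W and rect[3] <= SCREEN_H, "window leaves screen"
    return rect
```

For the example values in the text, O(0, 320) with 100 × 150 pixels yields the rectangle from (0, 320) to (100, 470).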
Optionally, the phone 100 may determine the coordinates and size of the auxiliary viewfinder frame 50 displayed in the image preview area 40 according to the picture currently displayed in the image preview area 40 of the camera application. For example, when the phone detects that the shooting target in the framed picture of the image preview area 40, the trees at the top of the mountain, is close to the right side of the screen, and the blank area on the left apart from the shooting target is larger than the area on the right, the phone initially displays the auxiliary viewfinder frame 50 in the upper-left area, as shown in (a) of FIG. 5. This display mode does not block the magnification adjustment area 30 on the right, which is convenient for user operation.
It should be understood that, with the method described in (a) of FIG. 5, the auxiliary viewfinder frame 50 of the phone can receive the user's drag operation and be moved around the image preview display area 40. After the auxiliary viewfinder frame 50 moves, the camera application can re-determine the new coordinates of the auxiliary viewfinder frame 50 and update its coordinate information, so as to ensure that the image data included in the first preview stream is displayed accurately in the auxiliary viewfinder frame 50.
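After a drag, the camera application must store coordinates that keep the window inside the preview area. A minimal clamping sketch follows; the assumption that the preview area spans the full 640 × 960 screen is illustrative:

```python
def clamp_after_drag(x, y, w, h, area_w=640, area_h=960):
    """Clamp the dragged window's top-left corner so that the whole
    w x h window stays inside the area_w x area_h preview area."""
    x = min(max(x, 0), area_w - w)
    y = min(max(y, 0), area_h - h)
    return x, y
```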
Similarly, as shown in (b) of FIG. 5, the target area 60 in the auxiliary viewfinder frame 50 can also accept the user's drag operation. The coordinate information of the changed target area 60 and the subject of the shooting target delimited by the target area 60 can both be obtained by the camera application, so that during digital zoom processing, when the captured original framed picture is cropped, the subject of the shooting target in the area delimited by the target area 60 is processed accurately to further generate the second preview stream, which is not described again here.
Through the above method, the user can move the target area 60 within the auxiliary viewfinder frame 50 according to the current shooting needs, making it easy to quickly find the shooting target without changing the shooting angle of the phone or the lens.
It can be understood that, in order to implement the above functions, the electronic device includes corresponding hardware and/or software modules for executing each function. In combination with the algorithm steps of the examples described in the embodiments disclosed herein, this application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each specific application in combination with the embodiments, but such implementation should not be considered beyond the scope of this application.
In this embodiment, the electronic device may be divided into functional modules according to the above method examples. For example, each functional module, such as a detection unit, a processing unit, and a display unit, may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is illustrative and is only a logical functional division; there may be other division methods in actual implementation.
It should be noted that all relevant content of the steps involved in the above method embodiments can be incorporated into the functional descriptions of the corresponding functional modules, which is not repeated here.
The electronic device provided in this embodiment is configured to execute the above telephoto shooting method, and can therefore achieve the same effects as the above implementation method.
When an integrated unit is used, the electronic device may further include a processing module, a storage module, and a communication module. The processing module may be used to control and manage the actions of the electronic device. The storage module may be used to support the electronic device in storing program code, data, and the like. The communication module may be used to support communication between the electronic device and other devices.
The processing module may be a processor or a controller. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the disclosure of this application. The processor may also be a combination implementing computing functions, for example a combination including one or more microprocessors, or a combination of digital signal processing (DSP) and a microprocessor. The storage module may be a memory. The communication module may specifically be a device that interacts with other electronic devices, such as a radio-frequency circuit, a Bluetooth chip, or a Wi-Fi chip.
In an embodiment, when the processing module is a processor and the storage module is a memory, the electronic device involved in this embodiment may be a device having the structure shown in FIG. 1.
This embodiment also provides a computer storage medium in which computer instructions are stored; when the computer instructions run on an electronic device, the electronic device executes the above related method steps to implement the telephoto shooting method in the above embodiments.
This embodiment also provides a computer program product which, when run on a computer, causes the computer to execute the above related steps to implement the telephoto shooting method in the above embodiments.
In addition, an embodiment of this application also provides an apparatus, which may specifically be a chip, a component, or a module. The apparatus may include a processor and a memory connected to each other, where the memory is used to store computer-executable instructions; when the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the telephoto shooting method in the above method embodiments.
The electronic device, computer storage medium, computer program product, and chip provided in this embodiment are all configured to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which are not repeated here.
Through the description of the above implementations, a person skilled in the art can understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example. In practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative. For example, the division of modules or units is only a logical functional division, and there may be other division methods in actual implementation; multiple units or components may be combined or integrated into another apparatus, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection of apparatuses or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may be one physical unit or multiple physical units, that is, they may be located in one place or distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of this embodiment.
In addition, each functional unit in the embodiments of this application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of this application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above content describes only specific implementations of this application, but the protection scope of this application is not limited thereto. Any person skilled in this technical field can easily think of changes or substitutions within the technical scope disclosed in this application, and they shall all be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.
Claims (16)
- A telephoto shooting method, applied to an electronic device including a lens, the method comprising: displaying a shooting interface of a camera of the electronic device, the shooting interface including an image preview area, the image preview area displaying a first preview picture; and detecting a first operation, and in response to the first operation, the electronic device simultaneously displaying, on the shooting interface, the image preview area and an auxiliary preview window, the auxiliary preview window displaying a second preview picture; wherein the first preview picture and the second preview picture are captured through the lens, the first preview picture is a framed picture at a first magnification, the second preview picture is a framed picture at a second magnification, the first preview picture is obtained by cropping the second preview picture, and the first magnification is greater than or equal to the second magnification.
- The method according to claim 1, wherein the image preview area and the auxiliary preview window are displayed in any one of the following ways: at least part of the auxiliary preview window overlaps the image preview area; or the auxiliary preview window is displayed at a position outside the image preview area; or the auxiliary preview window is located in the lower-left corner area of the image preview area.
- The method according to claim 1 or 2, wherein the first operation is a user's adjustment of the magnification in a magnification adjustment area, and the displaying, by the electronic device in response to the first operation, of the auxiliary preview window on the shooting interface comprises: when it is detected that the first magnification is greater than or equal to a first threshold, automatically displaying, by the electronic device, the auxiliary preview window on the shooting interface.
- The method according to claim 1 or 2, wherein the first operation is a user's operation of turning on the auxiliary preview window in a camera application.
- The method according to any one of claims 1 to 3, wherein the auxiliary preview window further includes a close button, and after the electronic device displays the auxiliary preview window on the shooting interface, the method further comprises: if the electronic device detects a user's operation on the close button, closing, by the electronic device, the auxiliary preview window.
- The method according to any one of claims 1 to 5, wherein the auxiliary preview window further includes a target area, the image of the first preview picture is obtained by processing the image of the target area, and the target area is a fixed area in the auxiliary preview window, or the target area is an arbitrary area in the auxiliary preview window.
- The method according to any one of claims 1 to 6, further comprising: detecting a drag operation on the auxiliary preview window, and in response to the drag operation, moving the auxiliary preview window from a first position to a second position.
- An electronic device, comprising: a lens, configured to capture a picture to be shot; one or more processors; a memory; a plurality of applications; and one or more programs, wherein the one or more programs are stored in the memory, and when the one or more programs are executed by the processor, the electronic device is caused to perform the following steps: displaying a shooting interface of a camera, the shooting interface including an image preview area, the image preview area displaying a first preview picture; and detecting a first operation, and in response to the first operation, simultaneously displaying, on the shooting interface, the image preview area and an auxiliary preview window, the auxiliary preview window displaying a second preview picture; wherein the first preview picture and the second preview picture are captured through the lens, the first preview picture is a framed picture at a first magnification, the second preview picture is a framed picture at a second magnification, the first preview picture is obtained by cropping the second preview picture, and the first magnification is greater than or equal to the second magnification.
- The electronic device according to claim 8, wherein the image preview area and the auxiliary preview window are displayed in any one of the following ways: at least part of the auxiliary preview window overlaps the image preview area; or the auxiliary preview window is displayed at a position outside the image preview area; or the auxiliary preview window is located in the lower-left corner area of the image preview area.
- The electronic device according to claim 8 or 9, wherein the first operation is a user's adjustment of the magnification in a magnification adjustment area, and when the one or more programs are executed by the processor, the electronic device is caused to perform the following step: when it is detected that the first magnification is greater than or equal to a first threshold, automatically displaying the auxiliary preview window on the shooting interface.
- The electronic device according to claim 8 or 9, wherein the first operation is a user's operation of turning on the auxiliary preview window in a camera application.
- The electronic device according to any one of claims 8 to 11, wherein the auxiliary preview window further includes a close button, and when the one or more programs are executed by the processor, the electronic device is caused to perform the following step: if a user's operation on the close button is detected, closing the auxiliary preview window.
- The electronic device according to any one of claims 8 to 12, wherein the auxiliary preview window further includes a target area, the image of the first preview picture is obtained by processing the image of the target area, and the target area is a fixed area in the auxiliary preview window, or the target area is an arbitrary area in the auxiliary preview window.
- The electronic device according to any one of claims 8 to 13, wherein when the one or more programs are executed by the processor, the electronic device is caused to perform the following steps: detecting a drag operation on the auxiliary preview window, and in response to the drag operation, moving the auxiliary preview window from a first position to a second position.
- A computer storage medium, comprising computer instructions, wherein when the computer instructions run on an electronic device, the electronic device is caused to execute the telephoto shooting method according to any one of claims 1 to 7.
- A computer program product, wherein when the computer program product runs on a computer, the computer is caused to execute the telephoto shooting method according to any one of claims 1 to 7.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/794,831 US11968447B2 (en) | 2020-01-23 | 2020-11-16 | Long-focus shooting method and electronic device |
| EP20916157.9A EP4087231A4 (en) | 2020-01-23 | 2020-11-16 | TELE-LENS PHOTOGRAPHY METHOD AND ELECTRONIC DEVICE |
| JP2022544829A JP7699136B2 (ja) | 2020-01-23 | 2020-11-16 | 長焦点撮影方法及び電子デバイス |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010077034.6 | 2020-01-23 | ||
| CN202010077034.6A CN111212235B (zh) | 2020-01-23 | 2020-01-23 | 一种长焦拍摄的方法及电子设备 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021147482A1 true WO2021147482A1 (zh) | 2021-07-29 |
Family
ID=70787407
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2020/128986 Ceased WO2021147482A1 (zh) | 2020-01-23 | 2020-11-16 | 一种长焦拍摄的方法及电子设备 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11968447B2 (zh) |
| EP (1) | EP4087231A4 (zh) |
| JP (1) | JP7699136B2 (zh) |
| CN (3) | CN114205522B (zh) |
| WO (1) | WO2021147482A1 (zh) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114339050A (zh) * | 2021-12-31 | 2022-04-12 | 西安维沃软件技术有限公司 | 显示方法、装置及电子设备 |
| CN115278030A (zh) * | 2022-07-29 | 2022-11-01 | 维沃移动通信有限公司 | 拍摄方法、装置及电子设备 |
| WO2024019312A1 (ko) * | 2022-07-17 | 2024-01-25 | 삼성전자주식회사 | 촬영을 위한 사용자 인터페이스를 위한 전자 장치, 방법, 및 비일시적 컴퓨터 판독가능 저장 매체 |
| EP4395354A4 (en) * | 2022-05-30 | 2025-01-01 | Honor Device Co., Ltd. | Photographing method and related device |
Families Citing this family (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10009536B2 (en) | 2016-06-12 | 2018-06-26 | Apple Inc. | Applying a simulated optical effect based on data received from multiple camera sensors |
| DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
| US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
| US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
| DK201870623A1 (en) | 2018-09-11 | 2020-04-15 | Apple Inc. | USER INTERFACES FOR SIMULATED DEPTH EFFECTS |
| US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
| US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
| CN114205522B (zh) | 2020-01-23 | 2023-07-18 | 华为技术有限公司 | 一种长焦拍摄的方法及电子设备 |
| US11039074B1 (en) | 2020-06-01 | 2021-06-15 | Apple Inc. | User interfaces for managing media |
| US11212449B1 (en) * | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
| CN112291472B (zh) * | 2020-10-28 | 2021-09-28 | Oppo广东移动通信有限公司 | 预览图像处理方法及装置、存储介质和电子设备 |
| CN113014798A (zh) * | 2021-01-27 | 2021-06-22 | 维沃移动通信有限公司 | 图像显示方法、装置及电子设备 |
| US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
| CN115278043B (zh) * | 2021-04-30 | 2024-09-20 | 华为技术有限公司 | 一种目标追踪方法及相关装置 |
| KR20220151451A (ko) * | 2021-05-06 | 2022-11-15 | 삼성전자주식회사 | 복수의 카메라를 포함하는 전자 장치 및 그 동작 방법 |
| US12112024B2 (en) | 2021-06-01 | 2024-10-08 | Apple Inc. | User interfaces for managing media styles |
| CN114401362A (zh) * | 2021-12-29 | 2022-04-26 | 影石创新科技股份有限公司 | 一种图像显示方法、装置和电子设备 |
| CN117201924A (zh) * | 2022-05-25 | 2023-12-08 | 荣耀终端有限公司 | 录像方法和相关装置 |
| CN116051368B (zh) * | 2022-06-29 | 2023-10-20 | 荣耀终端有限公司 | 图像处理方法及其相关设备 |
| CN117395496A (zh) * | 2022-06-30 | 2024-01-12 | 荣耀终端有限公司 | 一种拍摄方法及相关设备 |
| CN117793245B (zh) * | 2022-09-20 | 2025-08-08 | 荣耀终端股份有限公司 | 拍摄模式的切换方法、电子设备及可读存储介质 |
| CN117956299B (zh) * | 2022-10-29 | 2024-11-08 | 华为技术有限公司 | 拍摄月亮的方法和电子设备 |
| CN118075606A (zh) * | 2022-11-22 | 2024-05-24 | 荣耀终端有限公司 | 拍摄模式切换方法及相关装置 |
| CN116366978A (zh) * | 2023-03-31 | 2023-06-30 | 联想(北京)有限公司 | 一种控制方法、装置及电子设备 |
| CN117135452B (zh) * | 2023-03-31 | 2024-08-02 | 荣耀终端有限公司 | 拍摄方法和电子设备 |
| US20240357227A1 (en) * | 2023-04-19 | 2024-10-24 | Hanwha Vision Co., Ltd. | Method and apparatus for controlling pan-tilt-zoom (ptz) camera according to setting value |
| US20240373120A1 (en) | 2023-05-05 | 2024-11-07 | Apple Inc. | User interfaces for controlling media capture settings |
| CN116980759A (zh) * | 2023-08-15 | 2023-10-31 | 维沃移动通信有限公司 | 拍摄方法、终端、电子设备及可读存储介质 |
| WO2025146902A1 (ko) * | 2024-01-03 | 2025-07-10 | 삼성전자 주식회사 | 카메라 장치를 포함하는 전자 장치 및 그 동작 방법 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101115148A (zh) * | 2006-07-25 | 2008-01-30 | 富士胶片株式会社 | 摄像装置和图像显示控制方法 |
| KR20100093955A (ko) * | 2009-02-17 | 2010-08-26 | 삼성전자주식회사 | 디지털 영상신호 처리장치에서 이미지확대표시방법 |
| CN104333689A (zh) * | 2014-03-05 | 2015-02-04 | 广州三星通信技术研究有限公司 | 在拍摄时对预览图像进行显示的方法和装置 |
| WO2018166069A1 (zh) * | 2017-03-14 | 2018-09-20 | 华为技术有限公司 | 拍照预览方法、图形用户界面及终端 |
| CN109194839A (zh) * | 2018-10-30 | 2019-01-11 | 维沃移动通信(杭州)有限公司 | 一种显示控制方法和终端 |
| CN111212235A (zh) * | 2020-01-23 | 2020-05-29 | 华为技术有限公司 | 一种长焦拍摄的方法及电子设备 |
Family Cites Families (35)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH05145818A (ja) | 1991-11-21 | 1993-06-11 | Sony Corp | 撮像装置 |
| JP4003171B2 (ja) * | 2002-08-02 | 2007-11-07 | 富士フイルム株式会社 | 電子カメラ |
| KR100630149B1 (ko) * | 2005-06-07 | 2006-10-02 | 삼성전자주식회사 | 휴대단말기에서 이미지 확대/축소 방법 |
| JP4956988B2 (ja) * | 2005-12-19 | 2012-06-20 | カシオ計算機株式会社 | 撮像装置 |
| JP4687451B2 (ja) * | 2005-12-27 | 2011-05-25 | カシオ計算機株式会社 | 撮像装置、及びスルー画像表示方法 |
| KR101310823B1 (ko) * | 2006-06-20 | 2013-09-25 | 삼성전자주식회사 | 디지털 촬영장치의 제어방법 및 이 방법을 채용한 디지털촬영장치 |
| JP5053731B2 (ja) | 2007-07-03 | 2012-10-17 | キヤノン株式会社 | 画像表示制御装置及び画像表示制御方法及びプログラム及び記録媒体 |
| JP4959535B2 (ja) * | 2007-12-13 | 2012-06-27 | 株式会社日立製作所 | 撮像装置 |
| JP2009284309A (ja) * | 2008-05-23 | 2009-12-03 | Casio Comput Co Ltd | 撮像装置、表示制御プログラム及び表示制御方法 |
| EP2207342B1 (en) | 2009-01-07 | 2017-12-06 | LG Electronics Inc. | Mobile terminal and camera image control method thereof |
| JP4730569B2 (ja) * | 2009-03-27 | 2011-07-20 | カシオ計算機株式会社 | 撮影装置、撮像方法、及びプログラム |
| JP5393340B2 (ja) | 2009-08-20 | 2014-01-22 | オリンパス株式会社 | 撮像端末、表示端末、表示方法、及び撮像システム |
| JP5779959B2 (ja) * | 2011-04-21 | 2015-09-16 | 株式会社リコー | 撮像装置 |
| KR20130052372A (ko) | 2011-11-11 | 2013-05-22 | 삼성전자주식회사 | 디지털 촬영 장치 및 이의 제어 방법 |
| US20130155308A1 (en) * | 2011-12-20 | 2013-06-20 | Qualcomm Incorporated | Method and apparatus to enhance details in an image |
| CN102821238B (zh) | 2012-03-19 | 2015-07-22 | 北京泰邦天地科技有限公司 | 宽视场超高分辨率成像系统 |
| US20130314558A1 (en) | 2012-05-24 | 2013-11-28 | Mediatek Inc. | Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof |
| KR101545883B1 (ko) * | 2012-10-30 | 2015-08-20 | 삼성전자주식회사 | 단말의 카메라 제어 방법 및 그 단말 |
| JP6103526B2 (ja) | 2013-03-15 | 2017-03-29 | オリンパス株式会社 | 撮影機器,画像表示機器,及び画像表示機器の表示制御方法 |
| CN104679394A (zh) * | 2013-11-26 | 2015-06-03 | 中兴通讯股份有限公司 | 一种预览界面选定区域放大的方法和装置 |
| KR102170896B1 (ko) * | 2014-04-11 | 2020-10-29 | 삼성전자주식회사 | 영상 표시 방법 및 전자 장치 |
| WO2015178520A1 (ko) | 2014-05-22 | 2015-11-26 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
| CN104333701B (zh) * | 2014-11-28 | 2017-04-26 | 广东欧珀移动通信有限公司 | 一种相机预览画面的显示方法、装置及终端 |
| KR20160131720A (ko) * | 2015-05-08 | 2016-11-16 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
| US10291842B2 (en) | 2015-06-23 | 2019-05-14 | Samsung Electronics Co., Ltd. | Digital photographing apparatus and method of operating the same |
| JP6330862B2 (ja) * | 2016-03-17 | 2018-05-30 | カシオ計算機株式会社 | 撮像装置、撮像方法及びプログラム |
| JPWO2017200049A1 (ja) | 2016-05-20 | 2019-04-11 | マクセル株式会社 | 撮像装置およびその設定画面 |
| EP3291533A1 (en) * | 2016-09-06 | 2018-03-07 | LG Electronics Inc. | Terminal and controlling method thereof |
| US10356300B2 (en) * | 2016-12-23 | 2019-07-16 | Mediatek Inc. | Seamless zooming on dual camera |
| CN106909274B (zh) * | 2017-02-27 | 2020-12-15 | 南京车链科技有限公司 | 一种图像显示方法和装置 |
| CN108429881A (zh) | 2018-05-08 | 2018-08-21 | 山东超景深信息科技有限公司 | 免通过反复变焦取景的长焦拍摄云台相机系统应用方法 |
| US10805822B2 (en) * | 2018-05-18 | 2020-10-13 | Lg Electronics Inc. | Method and apparatus for supporting measurement reporting enhancement for aerial device in wireless communication system |
| US10469726B1 (en) * | 2018-09-19 | 2019-11-05 | Inventec Besta Co., Ltd. | Reversing display system with wireless switching multi-view images and method thereof |
| CN114679538A (zh) * | 2019-05-22 | 2022-06-28 | 华为技术有限公司 | 一种拍摄方法及终端 |
| CN112333380B (zh) | 2019-06-24 | 2021-10-15 | 华为技术有限公司 | 一种拍摄方法及设备 |
-
2020
- 2020-01-23 CN CN202111345070.7A patent/CN114205522B/zh active Active
- 2020-01-23 CN CN202111345522.1A patent/CN114157804B/zh active Active
- 2020-01-23 CN CN202010077034.6A patent/CN111212235B/zh active Active
- 2020-11-16 US US17/794,831 patent/US11968447B2/en active Active
- 2020-11-16 WO PCT/CN2020/128986 patent/WO2021147482A1/zh not_active Ceased
- 2020-11-16 JP JP2022544829A patent/JP7699136B2/ja active Active
- 2020-11-16 EP EP20916157.9A patent/EP4087231A4/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| JP7699136B2 (ja) | 2025-06-26 |
| JP2023511581A (ja) | 2023-03-20 |
| CN114157804A (zh) | 2022-03-08 |
| EP4087231A4 (en) | 2023-03-29 |
| US11968447B2 (en) | 2024-04-23 |
| CN111212235B (zh) | 2021-11-19 |
| CN114205522A (zh) | 2022-03-18 |
| US20230081664A1 (en) | 2023-03-16 |
| CN111212235A (zh) | 2020-05-29 |
| CN114157804B (zh) | 2022-09-09 |
| CN114205522B (zh) | 2023-07-18 |
| EP4087231A1 (en) | 2022-11-09 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20916157 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2022544829 Country of ref document: JP Kind code of ref document: A |
|
| ENP | Entry into the national phase |
Ref document number: 2020916157 Country of ref document: EP Effective date: 20220805 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |