WO2017018844A1 - Autonomous vehicle and method of operation thereof
- Publication number
- WO2017018844A1 (PCT/KR2016/008328)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- driving environment
- autonomous vehicle
- image
- virtual
- virtual driving
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of such parameters related to ambient conditions
- B60W40/10—Estimation or calculation of such parameters related to vehicle motion
- B60W40/105—Speed
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
Definitions
- the present disclosure relates to an autonomous vehicle and a method of operation thereof.
- an autonomous vehicle includes a display device formed in an area of a car window of the autonomous vehicle, and a processor that controls the display device to display a virtual driving environment image that replaces the actual driving environment of the autonomous vehicle.
- an operating method of an autonomous vehicle includes: obtaining a virtual driving environment image that replaces the actual driving environment of the autonomous vehicle; and controlling a display device formed in an area of a car window of the autonomous vehicle to display the virtual driving environment image.
- a computer-readable recording medium having recorded thereon a program for implementing the method.
- an autonomous vehicle may include an input device configured to receive a selection of a virtual driving environment from a user, and a windshield displaying the selected virtual driving environment.
- because the virtual driving environment image, which replaces the actual driving environment, is provided to the occupant through a display device formed in an area of a window of the autonomous vehicle, the occupant can experience the virtual driving environment more realistically.
- FIG. 1 is a diagram illustrating an autonomous vehicle according to an embodiment.
- FIG. 2 is a block diagram illustrating detailed hardware configurations of an autonomous vehicle according to an embodiment.
- FIG. 3 is a block diagram of an autonomous vehicle according to an embodiment.
- FIG. 4 is a diagram illustrating a windshield of an autonomous vehicle according to an exemplary embodiment.
- FIG. 5 is a diagram for describing a display apparatus, according to an exemplary embodiment.
- FIG. 6 is a diagram for describing a display apparatus, according to another exemplary embodiment.
- FIG. 7 is a diagram illustrating a UI for determining a driving route, according to an exemplary embodiment.
- FIG. 8 is a diagram illustrating a UI for setting a virtual reality, according to an exemplary embodiment.
- FIG. 9 is a diagram for describing generating a virtual driving environment image.
- FIG. 10 is a diagram illustrating an embodiment of generating virtual driving environment images corresponding to points at which an autonomous vehicle travels straight.
- FIG. 11 is a diagram illustrating an embodiment of generating virtual driving environment images corresponding to points at which an autonomous vehicle travels in a right turn.
- FIGS. 12 and 13 illustrate an embodiment of generating a plurality of virtual driving environment images corresponding to a point on a driving route.
- FIG. 14 is a diagram illustrating an embodiment of a camera of an autonomous vehicle.
- FIG. 15 is a diagram illustrating an embodiment in which a processor generates a virtual driving environment image based on an image of an actual driving environment.
- FIG. 16 is a diagram illustrating a UI for selecting an area of a window to display a virtual driving environment according to an exemplary embodiment.
- FIG. 17 is a diagram illustrating an embodiment in which a virtual driving environment image is displayed on an area of a car window corresponding to a line of sight of a passenger.
- FIG. 18 is a diagram illustrating a UI for selecting content to display on a display apparatus, according to an exemplary embodiment.
- FIG. 19 is a diagram illustrating an embodiment of displaying a movie on a display device.
- FIG. 20 is a diagram illustrating a UI for setting an event, according to an exemplary embodiment.
- FIG. 21 is a diagram illustrating an embodiment of providing a passenger with information about an event when a predetermined event occurs.
- FIG. 22 is a diagram illustrating another embodiment of providing a passenger with information about an event when a predetermined event occurs.
- FIG. 23 is a diagram illustrating another embodiment of providing a passenger with information about an event when a predetermined event occurs.
- FIG. 24 is a flowchart of a method of operating an autonomous vehicle, according to an exemplary embodiment.
- FIG. 25 is a detailed flowchart of step 2420 of FIG. 24.
- FIG. 26 is a detailed flowchart of a method of operating an autonomous vehicle, according to an embodiment.
- an autonomous vehicle includes a display device formed in an area of a car window of the autonomous vehicle, and a processor that controls the display device to display a virtual driving environment image that replaces the actual driving environment of the autonomous vehicle.
- the virtual driving environment image may be an image showing a virtual driving environment outside of the autonomous vehicle viewed from the viewpoint of the inside of the autonomous vehicle toward the area of the vehicle window.
- the processor may obtain information about a driving route from a current position of the autonomous vehicle to a destination, and generate virtual driving environment images corresponding to each of the points on the driving route.
- the apparatus may further include a motion sensing device that senses a movement of the autonomous vehicle, and the processor may control the display device to display the virtual driving environment images based on the sensed movement.
- the motion sensing device may sense the traveling speed of the autonomous vehicle.
- the processor may control the rate of change between the virtual driving environment images displayed on the display device based on the sensed driving speed.
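The patent leaves the mapping from sensed speed to display update rate abstract. The following is a minimal sketch of one speed-proportional mapping; the function name, the `frames_per_meter` factor, and the clamping bounds are illustrative assumptions, not part of the disclosure.

```python
def virtual_frame_rate(speed_mps: float, frames_per_meter: float = 2.0,
                       min_fps: float = 1.0, max_fps: float = 60.0) -> float:
    """Map the sensed driving speed (m/s) to the rate at which successive
    virtual driving environment images are advanced, so the virtual scenery
    appears to move past at the vehicle's real speed."""
    fps = speed_mps * frames_per_meter
    # Clamp so the display neither freezes at a stop nor exceeds panel limits.
    return max(min_fps, min(max_fps, fps))

print(virtual_frame_rate(10.0))  # 20.0 at a typical city speed
```

The faster the vehicle travels, the faster the displayed images change, which is the behavior the passage above describes.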
- the processor may control an image change rate between virtual driving environment images displayed on each of the plurality of display apparatuses based on the sensed movement.
- the apparatus may further include an image sensor configured to capture an image of an actual driving environment, and the processor may generate a virtual driving environment image based on the captured image of the actual driving environment.
- the processor may generate a virtual driving environment image reflecting the appearance of the object shown in the image of the actual driving environment.
- the processor may generate the virtual driving environment image based on a virtual reality selected by a passenger of the autonomous vehicle from among a plurality of virtual realities.
- the processor may determine whether a predetermined event occurs, and when the event occurs, the processor may control the display device to provide the passenger with the actual driving environment corresponding to the event.
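As a sketch of this event handling: the event names below and the simple set-membership dispatch are assumptions for illustration; the patent does not enumerate concrete event types or a detection mechanism.

```python
# Hypothetical event types a passenger might register via the UI of FIG. 20.
EVENTS_SHOWING_REAL_VIEW = {"emergency_vehicle", "accident_ahead", "near_destination"}

def select_window_content(detected_events, virtual_image, camera_image):
    """Return (image, cause): the virtual driving environment image by
    default, or the actual camera view plus the triggering event when a
    registered event has occurred."""
    for event in detected_events:
        if event in EVENTS_SHOWING_REAL_VIEW:
            return camera_image, event  # surface the real environment and why
    return virtual_image, None
```

A caller would invoke this once per display refresh, swapping the window content back to the virtual image once the event list is empty again.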
- an operating method of an autonomous vehicle includes: obtaining a virtual driving environment image that replaces the actual driving environment of the autonomous vehicle; and controlling a display device formed in an area of a car window of the autonomous vehicle to display the virtual driving environment image.
- a computer-readable recording medium having recorded thereon a program for implementing the method.
- an autonomous vehicle may include an input device configured to receive a selection of a virtual driving environment from a user, and a windshield displaying the selected virtual driving environment.
- the term “consisting of” or “comprising” should not be construed as necessarily including all of the various elements or steps described in the specification; some of the elements or steps may not be included, or additional elements or steps may further be included.
- the terms “... unit”, “module”, etc. described in the specification refer to a unit for processing at least one function or operation, which may be implemented in hardware, software, or a combination of hardware and software.
- FIG. 1 is a diagram illustrating an autonomous vehicle according to an embodiment.
- the autonomous vehicle 1 may refer to a vehicle that can drive itself without intervention of a passenger.
- the autonomous vehicle 1 may display a virtual driving environment image that replaces the actual driving environment of the autonomous vehicle 1.
- the autonomous vehicle 1 may display an image representing a virtual driving environment different from the actual driving environment around the autonomous vehicle 1. For example, even if the autonomous vehicle 1 is traveling in a city center with many buildings in the surrounding actual driving environment, the autonomous vehicle 1 may display a virtual driving environment image representing a forest.
- therefore, through the virtual driving environment image displayed by the autonomous vehicle 1, the passenger may experience that the autonomous vehicle 1 is driving in the forest rather than in the city center.
- the autonomous vehicle 1 may display the virtual driving environment image through a display device formed in an area of a car window of the autonomous vehicle 1. Therefore, when the occupant looks at the area of the window, the occupant views the virtual driving environment image displayed on the display device formed in that area, and thus experiences the virtual driving environment rather than the actual driving environment around the autonomous vehicle 1.
- the autonomous vehicle 1 may display the virtual driving environment image through the display device in conjunction with the movement of the autonomous vehicle 1, so that the occupant may experience the virtual driving environment more realistically.
- FIG. 2 is a block diagram illustrating detailed hardware configurations of an autonomous vehicle according to an embodiment.
- the autonomous vehicle 1 includes a propulsion device 210, a power supply device 299, a communication device 250, an input device 260, an output device 280, a storage device 270, a travel device 220, a sensing device 230, a peripheral device 240, and a control device 290. However, as may be understood by those of ordinary skill in the art related to the present embodiment, the autonomous vehicle 1 may further include other general-purpose components in addition to the components illustrated in FIG. 2, or may not include some of the components illustrated in FIG. 2.
- the propulsion device 210 may include an engine / motor 211, an energy source 212, a transmission 213 and a wheel / tire 214.
- the engine / motor 211 may be any combination of an internal combustion engine, an electric motor, a steam engine, and a Stirling engine.
- for example, the engine / motor 211 may be a combination of a gasoline engine and an electric motor.
- Energy source 212 may be a source of energy that powers the engine / motor 211 in whole or in part. That is, engine / motor 211 may be configured to convert energy source 212 into mechanical energy. Examples of energy sources 212 may be at least one of gasoline, diesel, propane, other compressed gas based fuels, ethanol, solar panels, batteries, and other electrical power sources. Alternatively, the energy source 212 may be at least one of a fuel tank, a battery, a capacitor, and a flywheel. The energy source 212 can provide energy to the systems and devices of the autonomous vehicle 1.
- Transmission 213 may be configured to transfer mechanical power from engine / motor 211 to wheel / tire 214.
- the transmission 213 may include at least one of a gearbox, a clutch, a differential, and a drive shaft.
- the drive shafts may include one or more axles configured to be coupled to the wheel / tire 214.
- the wheel / tire 214 may be configured in a variety of formats, including a unicycle, bicycle / motorcycle, tricycle, or four-wheel car / truck format. Other wheel / tire formats, such as those with six or more wheels, may also be possible.
- the wheel / tire 214 may include at least one wheel fixedly attached to the transmission 213, and at least one tire coupled to a rim of the wheel that can contact the driving surface.
- the traveling device 220 may include a brake unit 221, a steering unit 222, and a throttle 223.
- the steering unit 222 may be a combination of mechanisms configured to adjust the direction of the autonomous vehicle 1.
- the throttle 223 may be a combination of mechanisms configured to control the speed of operation of the engine / motor 211 to control the speed of the autonomous vehicle 1.
- the throttle 223 may adjust the throttle opening amount to adjust the amount of the fuel-air mixture flowing into the engine / motor 211, and may control power and thrust by adjusting the throttle opening amount.
- the brake unit 221 may be a combination of mechanisms configured to decelerate the autonomous vehicle 1.
- the brake unit 221 may use friction to reduce the speed of the wheel / tire 214.
- the sensing device 230 may include a plurality of sensors configured to sense information about the environment in which the autonomous vehicle 1 is located, as well as one or more actuators configured to modify the position and / or orientation of the sensors.
- the sensing device 230 may include a Global Positioning System (GPS) 224, an Inertial Measurement Unit (IMU) 225, a RADAR unit 226, a LIDAR unit 227, and an image sensor 228.
- the sensing device 230 may include at least one of a temperature / humidity sensor 232, an infrared sensor 233, a barometric pressure sensor 235, a proximity sensor 236, and an RGB (illuminance) sensor 237, but is not limited thereto. Since the functions of the respective sensors can be intuitively deduced by those skilled in the art from their names, detailed descriptions thereof will be omitted.
- the sensing device 230 may include a motion sensing device 238 capable of sensing the movement of the autonomous vehicle 1.
- the motion sensing device 238 may include a geomagnetic sensor 229, an acceleration sensor 231, and a gyroscope sensor 234.
- the GPS 224 may be a sensor configured to estimate the geographic location of the autonomous vehicle 1. That is, the GPS 224 may include a transceiver configured to estimate the position of the autonomous vehicle 1 with respect to the earth.
- the IMU 225 may be a combination of sensors configured to detect positional and orientation changes of the autonomous vehicle 1 based on inertial acceleration.
- the combination of sensors may include accelerometers and gyroscopes.
- the RADAR unit 226 may be a sensor configured to detect objects in the environment in which the autonomous vehicle 1 is located using a wireless signal.
- the RADAR unit 226 can be configured to sense the speed and / or direction of the objects.
- the LIDAR unit 227 may be a sensor configured to detect objects in the environment in which the autonomous vehicle 1 is located using a laser. More specifically, the LIDAR unit 227 may include a laser light source and / or laser scanner configured to emit a laser, and a detector configured to detect reflection of the laser. The LIDAR unit 227 may be configured to operate in a coherent (e.g., using heterodyne detection) or noncoherent detection mode.
- the image sensor 228 may be a still camera or a video camera configured to record three-dimensional images of the interior of the autonomous vehicle 1.
- the image sensor 228 may include a number of cameras, which may be disposed at a number of locations on the inside and outside of the autonomous vehicle 1.
- the peripheral device 240 may include a navigation 241, a light 242, a turn signal 243, a wiper 244, an interior light 245, a heater 246, and an air conditioner 247.
- the navigation 241 may be a system configured to determine a travel route for the autonomous vehicle 1.
- the navigation 241 may be configured to dynamically update the travel route while the autonomous vehicle 1 is traveling.
- the navigation 241 may use data from the GPS 224 and maps to determine the route of travel for the autonomous vehicle 1.
- the storage device 270 may include a magnetic disk drive, an optical disk drive, and a flash memory. Alternatively, the storage device 270 may be a portable USB data storage device. Storage device 270 may store system software for executing examples related to the present disclosure. System software for carrying out the examples relating to the present disclosure may be stored on a portable storage medium.
- the communication device 250 may include at least one antenna for wirelessly communicating with another device.
- the communication device 250 may be used to communicate wirelessly with a cellular network, or with other wireless protocols and systems via Wi-Fi or Bluetooth.
- the communication device 250 controlled by the control device 290 may transmit and receive a wireless signal.
- the control device 290 may execute a program included in the storage device 270 in order for the communication device 250 to transmit and receive a wireless signal with the cellular network.
- the input device 260 refers to a means for inputting data for controlling the autonomous vehicle 1.
- the input device 260 may include a key pad, a dome switch, a touch pad (contact capacitive type, pressure resistive type, infrared sensing type, surface ultrasonic conduction type, integral tension measurement type, piezo effect type, etc.), a jog wheel, a jog switch, and the like, but is not limited thereto.
- the input device 260 may include a microphone, which may be configured to receive audio (eg, voice commands) from the occupant of the autonomous vehicle 1.
- the output device 280 may output an audio signal or a video signal, and the output device 280 may include a display 281 and a sound output unit 282.
- the display unit 281 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, and an electrophoretic display.
- the output device 280 may include two or more display units 281.
- the sound output unit 282 outputs audio data received from the communication device 250 or stored in the storage device 270.
- the sound output unit 282 may include a speaker, a buzzer, and the like.
- the input device 260 and the output device 280 may include a network interface, and may be implemented as a touch screen.
- the control device 290 typically controls the overall operation of the autonomous vehicle 1.
- the control device 290 executes the programs stored in the storage device 270 to exercise overall control over the propulsion device 210, the traveling device 220, the sensing device 230, the peripheral device 240, the communication device 250, the input device 260, the storage device 270, the output device 280, and the power supply 299.
- the control device 290 may control the movement of the autonomous vehicle 1.
- the power supply 299 may be configured to provide power to some or all of the components of the autonomous vehicle 1.
- power supply 299 may comprise a rechargeable lithium ion or lead-acid battery.
- FIG. 3 is a block diagram of an autonomous vehicle according to an embodiment.
- the autonomous vehicle 1 may include a display device 110 and a processor 120.
- in the autonomous vehicle 1 shown in FIG. 3, only components related to the present embodiment are shown. Therefore, it will be understood by those skilled in the art that other general-purpose components may be further included in addition to the components shown in FIG. 3.
- the display device 110 may include the display 281 of FIG. 2, and the processor 120 may correspond to the control device 290 of FIG. 2.
- the display apparatus 110 may be formed in an area of a vehicle window of the autonomous vehicle 1.
- FIG. 4 is a diagram illustrating a windshield of an autonomous vehicle according to an exemplary embodiment.
- the windows of the autonomous vehicle 1 may include a window 401 corresponding to the front of the autonomous vehicle 1, a window 402 corresponding to the right side, a window 403 corresponding to the left side, a window 404 corresponding to the rear, and a window 405 corresponding to the ceiling of the autonomous vehicle 1.
- the autonomous vehicle 1 may include a display device formed in at least one area of the vehicle windows 401, 402, 403, 404, 405.
- although the autonomous vehicle 1 is shown with windows corresponding to five regions, the present invention is not limited thereto, and the autonomous vehicle 1 may have windows in positions, sizes, and forms different from those of FIG. 4.
- the display apparatus 110 may be a transparent display formed in an area of a vehicle window.
- the display device 110 may be a transparent display that replaces the windshield. That is, the display device 110 may be a transparent display having both a function as a display and a function as a window.
- the display device 110 may be configured as a transparent electrode: when a voltage is applied to the display device 110, it may function as a display, and when no voltage is applied, it may function as a window.
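The voltage-controlled dual mode described above can be modeled as a small state machine. The class and method names below are illustrative assumptions; the patent describes only the two operating modes, not a software interface.

```python
class TransparentWindowDisplay:
    """Transparent-electrode car-window display: acts as a display while a
    voltage is applied, and as an ordinary transparent window otherwise."""

    def __init__(self):
        self.voltage_applied = False
        self.frame = None  # currently displayed image, if any

    def set_voltage(self, on: bool) -> None:
        self.voltage_applied = on
        if not on:
            self.frame = None  # panel reverts to a plain transparent window

    def show(self, image) -> bool:
        """Display an image; returns False when the panel is in window mode."""
        if not self.voltage_applied:
            return False
        self.frame = image
        return True

    @property
    def mode(self) -> str:
        return "display" if self.voltage_applied else "window"
```

With this model, cutting the voltage both clears the displayed frame and restores the see-through window, matching the two functions the passage attributes to the transparent electrode.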
- the display device 110 may have a size of an area of the window and may be formed on the surface of the window.
- the display device 110 may be slidably coupled to the window.
- FIG. 5 is a diagram for describing a display apparatus, according to an exemplary embodiment.
- the display apparatus 110 may be a transparent display formed in an area of the windshield 501 of the autonomous vehicle 1. That is, the display device 110 may be the transparent display 502 closely bonded to one surface of the window 501. According to an example, the display apparatus 110 may be a flexible thin-film type and may be configured as an element that can transmit light and output an image of high brightness. Such a device may be any one of an LCD, an LED, and a transparent organic light-emitting diode (TOLED).
- FIG. 5 illustrates the front window of the autonomous vehicle 1 according to an example.
- the display device 110 as a transparent display may be formed in an area of another vehicle window of the autonomous vehicle 1.
- FIG. 6 is a diagram for describing a display apparatus, according to another exemplary embodiment.
- the display apparatus 110 may have the size of the window 601 of the autonomous vehicle 1 and may be slidably coupled to the window 601. That is, the display device 110 may slide in one direction to entirely overlap the area of the window 601, and may slide in the other direction so as not to overlap the area of the window 601.
- the slidable display apparatus 110 may be formed in an area of another vehicle window of the autonomous vehicle 1.
- the processor 120 may generate a virtual driving environment image that replaces the actual driving environment of the autonomous vehicle 1.
- the virtual driving environment image refers to an image showing a virtual driving environment outside the autonomous vehicle 1 as viewed toward an area of a vehicle window from an internal viewpoint of the autonomous vehicle 1.
- the virtual driving environment image refers to an image representing the virtual driving environment outside the autonomous vehicle 1 that the occupant of the autonomous vehicle 1 may look toward the area of the vehicle window.
- the virtual driving environment may be a driving environment in virtual reality in which the actual driving environment is partially reflected.
- a real driving environment may be a road in a rainy city, but a virtual driving environment may be a road in a city with bright sunlight. Therefore, the virtual driving environment image may represent a virtual driving environment that the occupant can recognize as the actual driving environment when the passenger looks out of the autonomous vehicle 1 toward the area of the vehicle window.
- the processor 120 may generate the virtual driving environment image based on information about the actual driving environment around the autonomous vehicle 1 and information about the virtual reality.
- the information on the actual driving environment may include information on a driving route for the autonomous vehicle 1 to travel to a destination, and may include an image of the actual driving environment.
- the processor 120 may obtain information about the virtual reality from the storage device 270 of FIG. 2, or may obtain information about the virtual reality from an external network.
- the virtual reality may be determined by the passenger's selection from among a plurality of virtual realities.
- the processor 120 may generate the virtual driving environment image based on information about the driving route from the current position of the autonomous vehicle 1 to the destination. More specifically, the processor 120 may acquire information about the driving route from the current position of the autonomous vehicle 1 to the destination, and generate the virtual driving environment image by reflecting the acquired driving route in a preset virtual reality. For example, the processor 120 may generate a virtual driving environment image by reflecting the state of a road corresponding to the driving route in a virtual reality representing a coast. According to an example, the processor 120 may obtain information about a destination from the occupant and determine a driving route from the current position of the autonomous vehicle 1 to the destination. According to another example, the navigation 241 of FIG. 2 may determine the driving route from the current position of the autonomous vehicle 1 to the destination, and the processor 120 may obtain information about the driving route from the navigation 241.
- the processor 120 may generate a virtual driving environment image corresponding to a point on the driving route. In other words, based on a point on the driving route where the autonomous vehicle 1 may be located, the processor 120 may generate an image representing the virtual driving environment outside the autonomous vehicle 1 that the occupant would see when looking toward the area of the vehicle window. Similarly, the processor 120 may generate virtual driving environment images corresponding to each of the points on the driving route of the autonomous vehicle 1.
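One way to sketch this per-point generation is below. The heading derivation and the `render_virtual_view` callback are assumptions for illustration; the patent does not specify how the viewpoint at each route point is computed.

```python
import math

def headings_along_route(points):
    """Derive a heading (radians) at each route point from the segment to the
    next point, so each virtual image faces the direction of travel: straight
    segments as in FIG. 10, turns as in FIG. 11."""
    headings = [math.atan2(y1 - y0, x1 - x0)
                for (x0, y0), (x1, y1) in zip(points, points[1:])]
    headings.append(headings[-1])  # final point keeps the last heading
    return headings

def generate_route_images(points, render_virtual_view):
    """Render one virtual driving environment image per route point, from the
    in-vehicle viewpoint toward the window at that point and heading."""
    return [render_virtual_view(p, h)
            for p, h in zip(points, headings_along_route(points))]
```

On a straight route every heading is identical, so successive images differ only by forward translation; through a right turn the heading sweeps clockwise, matching the two figure sequences.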
- FIG. 7 is a diagram illustrating a UI for determining a driving route, according to an exemplary embodiment.
- the processor 120 may provide a passenger with a UI 710 for determining a driving route. According to an example, the processor 120 may display the UI 710 on the display device 110 or on a separate display.
- the occupant may input information on a desired destination through the input device 260 in the area 701 for inputting destination information in the UI 710. For example, the occupant may enter the destination '1600 Pennsylvania Ave, D.C' in the area 701. The occupant may then select a travel route to the destination via the additional setting area 702. In other words, as shown in FIG. 7, the occupant may select a driving route via the freeway from among several driving routes to the destination. Therefore, the processor 120 may determine the freeway route selected by the occupant as the driving route to the destination of the autonomous vehicle 1.
- FIG. 8 is a diagram illustrating a UI for setting a virtual reality, according to an exemplary embodiment.
- the processor 120 may provide a passenger with a UI 810 for setting up virtual reality.
- the occupant may select one of a plurality of virtual realities through the UI 810.
- through the UI 810, the occupant may select a virtual reality corresponding to any one of the Rocky Mountains, the Amazon Rainforest, a Saharan Safari, the Grand Canyon, the Hawaii Volcanoes, Big Sur (California), and rolling Irish hills.
- the passenger may download another virtual reality from the external network by selecting the download menu 801.
- the processor 120 may first provide the occupant with the UI 710 of FIG. 7 to determine the driving route to the destination of the autonomous vehicle 1, and may then provide the occupant with the UI 810 of FIG. 8 to determine the virtual reality. Therefore, the processor 120 may generate a virtual driving environment image using the determined driving route and virtual reality.
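- As an illustrative, non-limiting sketch of the pipeline described above (determine a route, then tag each route point with the selected virtual reality), the following Python fragment shows one possible shape of the data flow. All names here (`RoutePoint`, `plan_route`, `generate_virtual_frames`) are hypothetical and not part of the disclosure; the route planner is a toy stand-in for the navigation 241.

```python
from dataclasses import dataclass

@dataclass
class RoutePoint:
    x: float            # position along the route (arbitrary map units)
    y: float
    heading_deg: float  # direction of travel at this point

def plan_route(start, destination):
    """Toy stand-in for the navigation 241: a straight route sampled at 5 points."""
    (x0, y0), (x1, y1) = start, destination
    pts = []
    for i in range(5):
        t = i / 4
        pts.append(RoutePoint(x0 + t * (x1 - x0), y0 + t * (y1 - y0), 90.0))
    return pts

def generate_virtual_frames(route, environment_name):
    """One frame descriptor per route point, tagged with the chosen virtual reality."""
    return [{"environment": environment_name,
             "position": (p.x, p.y),
             "heading_deg": p.heading_deg} for p in route]

route = plan_route((0, 0), (100, 0))
frames = generate_virtual_frames(route, "Big Sur, California")
```

Each descriptor in `frames` would then be rendered into the actual virtual driving environment image for that route point.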
- FIG. 9 is a diagram for describing generating a virtual driving environment image.
- the processor 120 may generate a virtual driving environment image 930 corresponding to a partial section 910 of the driving path of the autonomous vehicle 1, based on the section 910 and a partial region 920 of the virtual reality. That is, based on a point 915 where the autonomous vehicle 1 will travel in the future, the processor 120 may generate the virtual driving environment image 930 representing the virtual driving environment that the occupant can view at the point 915. More specifically, the processor 120 may recognize the road appearance of the section 910 based on the point 915 and reflect the recognized road appearance in the partial region 920 of the virtual reality, thereby generating the virtual driving environment image 930.
- since the road shape of the section 910 is a section that goes straight for a certain distance and then turns left, the processor 120 may generate the virtual driving environment image 930 by reflecting, in the partial region 920 of the virtual reality, a road that goes straight for a certain distance and then turns left. Similarly, the processor 120 may recognize the road view for each of the remaining sections of the driving route of the autonomous vehicle 1 and reflect the road view recognized for each section in other regions of the virtual reality, thereby generating a plurality of virtual driving environment images corresponding to the entire driving route of the autonomous vehicle 1.
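- The road-shape recognition described above (classifying a section as straight, left turn, or right turn, and stamping that shape into a region of the virtual reality) can be sketched as follows. This is a simplified illustration under assumed conventions (headings from `math.atan2`, a hypothetical 15-degree turn threshold), not the disclosed implementation.

```python
import math

def classify_segment(points):
    """Label a polyline section as 'straight', 'left_turn', or 'right_turn'
    from the change in heading between its first and last legs."""
    (x0, y0), (x1, y1) = points[0], points[1]
    (x2, y2), (x3, y3) = points[-2], points[-1]
    h_in = math.atan2(y1 - y0, x1 - x0)
    h_out = math.atan2(y3 - y2, x3 - x2)
    turn = math.degrees(h_out - h_in)   # counterclockwise (left) is positive
    if turn > 15:
        return "left_turn"
    if turn < -15:
        return "right_turn"
    return "straight"

def apply_road_shape(virtual_region, segment_points):
    """Reflect the recognized road shape into a region of the virtual reality."""
    return {"region": virtual_region, "road_shape": classify_segment(segment_points)}

# A section like 910: straight for a while, then a left turn.
section_910 = [(0, 0), (0, 50), (0, 100), (-30, 130)]
frame = apply_road_shape("coastal_region_920", section_910)
```

A renderer would then draw the virtual region with a road of the classified shape, so the displayed scenery matches the vehicle's actual path.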
- FIG. 10 is a diagram illustrating an embodiment of generating virtual driving environment images corresponding to points at which an autonomous vehicle travels straight.
- the processor 120 may generate virtual driving environment images 1020 and 1030 based on the points 1010 and 1015 on the driving route.
- based on the autonomous vehicle 1 being located at the point 1010, the processor 120 may generate the virtual driving environment image 1020, and may then generate the virtual driving environment image 1030 based on the autonomous vehicle 1 being located at the point 1015.
- the virtual driving environment image 1020 may represent the virtual driving environment outside the autonomous vehicle 1 that the occupant would see when looking toward the area of the vehicle window while the autonomous vehicle 1 is located at the point 1010.
- the virtual driving environment image 1030 may represent the virtual driving environment outside the autonomous vehicle 1 that the occupant would see when looking toward the area of the vehicle window while the autonomous vehicle 1 is located at the point 1015. Accordingly, some objects 1026 of the virtual driving environment of the virtual driving environment image 1020 may disappear from the virtual driving environment image 1030, and some objects 1022 and 1024 of the virtual driving environment image 1020 may be expressed in the virtual driving environment image 1030 with their size and shape changed to look closer, as the objects 1032 and 1034.
- by continuously providing the virtual driving environment images 1020 and 1030 to the occupant, the processor 120 may give the occupant the experience of the autonomous vehicle 1 traveling straight from the point 1010 to the point 1015. In addition, although the actual driving environment is a road in the city, the virtual driving environment shown in the virtual driving environment images 1020 and 1030 is a seaside road; thus, by continuously providing the virtual driving environment images 1020 and 1030, the processor 120 may give the occupant the experience of driving on a seaside road.
- FIG. 10 illustrates an example in which the processor 120 generates the virtual driving environment images 1020 and 1030 corresponding to the points 1010 and 1015 on the driving route; however, to provide a more natural experience to the passenger, the processor 120 may generate virtual driving environment images corresponding to many more points on the driving route.
- FIG. 11 is a diagram illustrating an embodiment of generating virtual driving environment images corresponding to points at which an autonomous vehicle travels in a right turn.
- the processor 120 may generate virtual driving environment images 1120 and 1130 based on the points 1110 and 1115 on the driving route.
- based on the autonomous vehicle 1 being located at the point 1110, the processor 120 may generate the virtual driving environment image 1120, and may then generate the virtual driving environment image 1130 based on the autonomous vehicle 1 being located at the point 1115.
- the virtual driving environment image 1120 may represent the virtual driving environment outside the autonomous vehicle 1 that the occupant would see when looking toward the area of the vehicle window while the autonomous vehicle 1 is located at the point 1110.
- the virtual driving environment image 1130 may represent the virtual driving environment outside the autonomous vehicle 1 that the occupant would see when looking toward the area of the vehicle window while the autonomous vehicle 1 is located at the point 1115. Accordingly, by continuously providing the virtual driving environment images 1120 and 1130 to the occupant, the processor 120 may give the occupant the experience of the autonomous vehicle 1 turning right from the point 1110 to the point 1115 on the driving route.
- in addition, although the actual driving environment may differ, the virtual driving environment shown in the virtual driving environment images 1120 and 1130 is a road surrounded by trees; thus, by continuously providing the virtual driving environment images 1120 and 1130, the processor 120 may give the occupant the experience of driving on a road surrounded by trees.
- FIGS. 12 and 13 illustrate an embodiment of generating a plurality of virtual driving environment images corresponding to a point on a driving route.
- the processor 120 may generate a plurality of virtual driving environment images 1210, 1220, and 1230 corresponding to a point 1205 on the driving route. More specifically, when the autonomous vehicle 1 is located at the point 1205, the processor 120 may generate the virtual driving environment image 1210 representing the virtual driving environment outside the autonomous vehicle 1 seen toward the area of the front windshield, the virtual driving environment image 1220 representing the virtual driving environment seen toward the area of the left side window, and the virtual driving environment image 1230 representing the virtual driving environment seen toward the area of the right side window.
- accordingly, the passenger may experience a more realistic virtual driving environment.
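- Generating a distinct image per window from one route point, as described above, amounts to rendering the same scene from several viewing directions. The sketch below illustrates this with hypothetical heading offsets (front along the vehicle heading, left at +90°, right at −90°); the function name and conventions are assumptions, not part of the disclosure.

```python
def window_views(heading_deg):
    """Viewing directions (degrees, 0-359) for the front, left, and right
    window displays, offset from the vehicle heading."""
    return {
        "front": heading_deg % 360,
        "left":  (heading_deg + 90) % 360,
        "right": (heading_deg - 90) % 360,
    }

views = window_views(0)   # vehicle pointing at heading 0
```

A renderer would produce one virtual driving environment image per entry, e.g. the images 1210, 1220, and 1230 for the front, left, and right windows.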
- the processor 120 may generate a plurality of virtual driving environment images 1310, 1320, and 1330 corresponding to a point 1305 on the driving route. More specifically, when the autonomous vehicle 1 is located at the point 1305, the processor 120 may generate the virtual driving environment image 1310 representing the virtual driving environment outside the autonomous vehicle 1 seen toward the area of the front window, the virtual driving environment image 1320 representing the virtual driving environment seen toward the area of the left side window, and the virtual driving environment image 1330 representing the virtual driving environment seen toward the area of the right side window.
- as the plurality of virtual driving environment images 1210, 1220, and 1230 corresponding to the point 1205 and the plurality of virtual driving environment images 1310, 1320, and 1330 corresponding to the point 1305 are continuously displayed, the occupant may experience the autonomous vehicle 1 driving straight from the point 1205 to the point 1305 on the driving route.
- in addition, although the actual driving environment is a rainy road, the virtual driving environment shown in the plurality of virtual driving environment images 1210, 1220, 1230, 1310, 1320, and 1330 is a sunlit road; thus, by continuously displaying the plurality of virtual driving environment images, the processor 120 may provide the passenger with the experience of the autonomous vehicle 1 driving on a sunlit road.
- furthermore, the processor 120 may generate a virtual driving environment image representing the external virtual driving environment that the occupant can see through the other windows within the autonomous vehicle 1.
- the processor 120 may generate a virtual driving environment image based on an image of the actual driving environment of the autonomous vehicle 1. More specifically, the processor 120 may generate a virtual driving environment image reflecting the appearance of an object appearing in the image of the actual driving environment. For example, the processor 120 may generate a virtual driving environment image reflecting the appearance of the road shown in the image of the actual driving environment. In addition, the processor 120 may generate a virtual driving environment image in which the movement trajectory or the change rate of the object included in the image of the actual driving environment is reflected. For example, the processor 120 may generate a virtual driving environment image that reflects the movement trajectory and the speed of the vehicle that appear in the image of the actual driving environment.
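- Reflecting an object's movement trajectory and rate of change from the real image into the virtual image, as described above, can be sketched as follows: track each detected object's center across two consecutive real-environment frames and reuse the displacement to move its stand-in in the virtual scene. The function and data layout are illustrative assumptions.

```python
def estimate_trajectory(detections):
    """Per-object displacement between two consecutive real-environment frames.

    `detections` maps an object id to its (x, y) center in frame t and t+1.
    The displacement can be reused to move the object's stand-in inside the
    virtual driving environment at the same rate.
    """
    motion = {}
    for obj_id, (p0, p1) in detections.items():
        dx, dy = p1[0] - p0[0], p1[1] - p0[1]
        motion[obj_id] = {"displacement": (dx, dy),
                          "speed": (dx ** 2 + dy ** 2) ** 0.5}
    return motion

# A vehicle like 1511 moving slightly left and closer between frames.
motion = estimate_trajectory({"car_1511": ((120, 300), (123, 296))})
```

The virtual renderer could then advance each reflected object by its measured displacement per frame, so virtual traffic moves at the same apparent speed as real traffic.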
- the image sensor 228 of FIG. 2 may capture an image of the actual driving environment of the autonomous vehicle 1, and the processor 120 may generate the virtual driving environment image based on the image of the actual driving environment captured by the image sensor 228.
- the processor 120 may obtain an image of the actual driving environment of the autonomous vehicle 1 from an external network.
- FIG. 14 is a diagram illustrating an embodiment of a camera of an autonomous vehicle.
- the cameras 1410, 1420, 1430, and 1440 may be installed on the outer surfaces of the vehicle windows 401, 402, 403, and 404 of the autonomous vehicle 1. That is, the cameras 1410, 1420, 1430, and 1440 may be installed on the outer surface of each of the window 401 corresponding to the front side of the autonomous vehicle 1, the window 403 corresponding to the left side, the window 402 corresponding to the right side, and the window 404 corresponding to the rear side.
- the cameras 1410, 1420, 1430, and 1440 may photograph the actual driving environment of the autonomous vehicle 1 that the occupant can see toward the area of each vehicle window, and may acquire an image of the actual driving environment.
- FIG. 15 is a diagram illustrating an embodiment in which a processor generates a virtual driving environment image based on an image of an actual driving environment.
- the image sensor 228 may be installed in the front windshield of the autonomous vehicle 1, and may photograph the actual driving environment that a passenger can see through the front windshield of the autonomous vehicle 1.
- the processor 120 may acquire an actual driving environment image 1510 captured by the image sensor 228.
- the processor 120 may acquire the virtual reality 1520 selected by the occupant. Therefore, the processor 120 may generate the virtual driving environment image 1530 based on the actual driving environment image 1510 and the virtual reality 1520.
- the processor 120 may recognize the road appearance from the actual driving environment image 1510, and generate the virtual driving environment image 1530 by reflecting the recognized road appearance in the virtual reality 1520.
- since the road shape of the actual driving environment image 1510 is a section that goes straight for a predetermined distance and then turns left, the processor 120 may generate the virtual driving environment image 1530 by reflecting, in the virtual reality 1520, a road that goes straight and then turns left. Therefore, when the virtual driving environment image 1530 is displayed on the display device 110 formed in the area of the front window of the autonomous vehicle 1, the occupant may recognize the virtual driving environment image 1530 as the actual driving environment.
- the processor 120 may recognize an object appearing in the actual driving environment image 1510 and determine whether to reflect the recognized object in the virtual driving environment image 1530. For example, the processor 120 may determine to reflect objects such as a traffic light and a crosswalk appearing in the actual driving environment image 1510 in the virtual driving environment image 1530. In addition, as shown in FIG. 15, the processor 120 may recognize the vehicles 1511, 1512, and 1513 on the road of the actual driving environment image 1510 and determine not to display the recognized vehicles 1511, 1512, and 1513 in the virtual driving environment image 1530.
- the processor 120 may recognize the road area from the actual driving environment image 1510 and replace the area other than the road area within the actual driving environment image 1510 with the virtual reality 1520. That is, when the actual driving environment image 1510 shows a road area surrounded by buildings and the virtual reality 1520 is a forest with trees, the processor 120 may generate the virtual driving environment image 1530 by replacing the area corresponding to the buildings in the actual driving environment image 1510 with a forest area.
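- The replace-everything-but-the-road operation above is essentially a per-pixel composite keyed on a segmentation label. The following minimal sketch shows the idea on a toy one-dimensional "image"; the label names and list-based pixel representation are simplifying assumptions.

```python
def composite(real_labels, real_pixels, virtual_pixels):
    """Keep pixels labelled 'road' from the real image; replace everything
    else (e.g. buildings) with the virtual environment, pixel by pixel."""
    out = []
    for label, real_px, virt_px in zip(real_labels, real_pixels, virtual_pixels):
        out.append(real_px if label == "road" else virt_px)
    return out

# Toy example: a road flanked by buildings, replaced by forest scenery.
labels  = ["building", "road", "road", "building"]
real    = ["grey", "asphalt", "asphalt", "grey"]
virtual = ["tree", "tree", "tree", "tree"]
result = composite(labels, real, virtual)
```

In a real system the labels would come from a road/scene segmentation of the actual driving environment image 1510, and the same composite would run per pixel of each frame.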
- the processor 120 may recognize the driving route of the autonomous vehicle 1 from the road area shown in the actual driving environment image 1510, and may generate not only the virtual driving environment image 1530 but also other virtual driving environment images corresponding to each of the points on the driving route.
- although the above describes an example of generating the virtual driving environment image 1530 based on a camera installed in the front window of the autonomous vehicle 1, the processor 120 may similarly generate another virtual driving environment image based on a camera installed in another window of the autonomous vehicle 1. That is, the processor 120 may use the actual driving environment image acquired through the camera installed in the other vehicle window of the autonomous vehicle 1 to generate another virtual driving environment image to be displayed on the display device 110 formed in the area of that window.
- the processor 120 may control the display apparatus 110 formed in the area of the vehicle window of the autonomous vehicle 1 to display a virtual driving environment image. Therefore, when the occupant looks at the area of the car window inside the autonomous vehicle 1, the occupant can experience the virtual driving environment as if experiencing an actual driving environment. That is, the processor 120 may lead the occupant to perceive the virtual driving environment shown in the virtual driving environment image as the actual driving environment.
- the processor 120 may control the display apparatus 110 to continuously display virtual driving environment images corresponding to each of the points on the driving route of the autonomous driving vehicle 1. That is, the processor 120 may generate virtual driving environment images corresponding to each of the points on the driving route, and control the display apparatus 110 to continuously display the generated virtual driving environment images.
- the processor 120 may control the display apparatus 110 formed in the area of the vehicle window to continuously display the virtual driving environment images 1020 and 1030. Accordingly, the occupant may view the virtual driving environment images 1020 and 1030 continuously displayed through the area of the car window, and thus the occupant may experience the virtual driving environment shown in the virtual driving environment images 1020 and 1030. The occupant may recognize that the autonomous vehicle 1 is going straight in the virtual driving environment.
- the processor 120 may control the display apparatus 110 formed in the area of the vehicle window to continuously display the virtual driving environment images 1120 and 1130. Accordingly, the occupant may view the virtual driving environment images 1120 and 1130 continuously displayed through the area of the vehicle window, so that the occupant may experience the virtual driving environment shown in the virtual driving environment images 1120 and 1130. The occupant may recognize that the autonomous vehicle 1 turns right in the virtual driving environment.
- the processor 120 may control the display apparatus 110 to display a virtual driving environment image in association with the movement of the autonomous vehicle 1.
- the processor 120 may acquire, as moving images, a virtual driving environment image corresponding to an operation in which the autonomous vehicle 1 travels straight and a virtual driving environment image corresponding to an operation in which the autonomous vehicle 1 turns left or right.
- the processor 120 may obtain a virtual driving environment image corresponding to the driving operation of the autonomous vehicle 1 from an external network.
- alternatively, the processor 120 may itself generate the virtual driving environment image corresponding to the driving operation of the autonomous vehicle 1. Therefore, when the autonomous vehicle 1 travels straight, the processor 120 may control the display apparatus 110 to reproduce the virtual driving environment image corresponding to straight driving as a moving image.
- similarly, when the autonomous vehicle 1 turns left or right, the processor 120 may control the display apparatus 110 to reproduce the virtual driving environment image corresponding to left-turn driving or right-turn driving as a moving image. Therefore, the processor 120 may display the virtual driving environment image through the display apparatus 110 in association with the movement of the autonomous vehicle 1, and thus the passenger may more realistically experience driving in the virtual driving environment.
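- Selecting the pre-rendered clip that matches the current driving operation, as described above, can be sketched as a simple dispatch on a steering signal. The steering-angle threshold and clip names below are illustrative assumptions, not disclosed values.

```python
def select_clip(steering_deg, clips):
    """Pick the pre-rendered virtual-environment clip matching the current
    driving operation, using a simple steering-angle threshold.
    Positive angles are right turns, negative angles are left turns."""
    if steering_deg > 10:
        return clips["right_turn"]
    if steering_deg < -10:
        return clips["left_turn"]
    return clips["straight"]

clips = {"straight": "straight.mp4",
         "left_turn": "left.mp4",
         "right_turn": "right.mp4"}
```

For example, `select_clip(25, clips)` would pick the right-turn clip, so the displayed moving image matches the vehicle's turning motion.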
- the motion sensing device 238 of FIG. 2 may sense the motion of the autonomous vehicle 1, and the processor 120 may control the display apparatus 110 to display the virtual driving environment image based on the motion of the autonomous vehicle 1 sensed by the motion sensing device 238.
- the movement of the autonomous vehicle 1 may include at least one of the speed, acceleration, deceleration, roll, pitch, and yaw of the autonomous vehicle and the amounts of change thereof, and the motion sensing device 238 may sense at least one of the acceleration, deceleration, roll, pitch, and yaw of the autonomous vehicle 1 and the amounts of change thereof.
- the motion sensing device 238 may sense a traveling speed, a position change, and a direction change of the autonomous vehicle 1.
- the motion sensing device 238 may sense a driving state or a stopped state of the autonomous vehicle 1.
- the windshield of the autonomous vehicle 1 may display a virtual driving environment so as to correspond to the movement control of the autonomous vehicle 1 by the control device 290.
- when the controller 290 controls the movement of the autonomous vehicle 1, the windshield of the autonomous vehicle 1 may display an image representing the virtual driving environment corresponding to the movement of the autonomous vehicle.
- the autonomous vehicle 1 may further include a reproducing apparatus; the reproducing apparatus may reproduce the virtual driving environment according to the movement control of the autonomous vehicle 1 by the controller 290, and the windshield of the autonomous vehicle 1 may display the reproduction result of the reproducing apparatus.
- when the controller 290 controls the movement of the autonomous vehicle 1, the reproducing apparatus may reproduce an image representing the virtual driving environment corresponding to the movement of the autonomous vehicle, and the vehicle window may display the reproduced image.
- the virtual driving environment may be 3D graphic data
- the playback device may be a graphics processing unit (GPU).
- the processor 120 may control the display apparatus 110 to display virtual driving environment images corresponding to each of the points on the driving route of the autonomous vehicle 1 based on the movement of the autonomous vehicle 1. While the display device 110 continuously displays the virtual driving environment images, if the motion sensing device 238 senses the stationary state of the autonomous vehicle 1, the processor 120 may pause the continuous display of the virtual driving environment images.
- the processor 120 may control an image change rate of the virtual driving environment images displayed on the display apparatus 110 based on the movement of the autonomous vehicle 1.
- the image change rate may refer to the temporal change rate of the virtual driving environment images displayed on the display apparatus 110. That is, the image change rate may be the speed at which the virtual driving environment images are developed on the display apparatus 110.
- the processor 120 may control the image change rate between the virtual driving environment images displayed on the display device 110 based on the sensed speed of the autonomous vehicle 1.
- for example, when the speed of the autonomous vehicle 1 increases, the processor 120 may control the development speed of the virtual driving environment images displayed on the display apparatus 110 to be faster than before, and when the speed of the autonomous vehicle 1 decreases, the processor 120 may control the development speed of the virtual driving environment images displayed on the display apparatus 110 to be slower than before.
- for example, when the autonomous vehicle 1 accelerates, the processor 120 may control the development speed of the virtual driving environment images 1020 and 1030 displayed on the display device 110 to be faster. Accordingly, the passenger may be provided with a more realistic driving experience through the rapidly developed virtual driving environment images 1020 and 1030.
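- The speed-linked image change rate described above amounts to scaling the frame-advance rate with the sensed vehicle speed, pausing the sequence when the vehicle is stationary. The sketch below uses an assumed reference speed of 60 km/h at which the images play at their nominal rate; the function and constant are illustrative, not disclosed values.

```python
def playback_rate(vehicle_speed_kmh, base_speed_kmh=60.0):
    """Scale the frame-advance rate of the virtual driving environment images
    with the sensed vehicle speed; a stopped vehicle pauses the sequence."""
    if vehicle_speed_kmh <= 0:
        return 0.0                      # stationary: pause the images
    return vehicle_speed_kmh / base_speed_kmh
```

For example, at 120 km/h the images would develop twice as fast as at the reference speed, and at 30 km/h half as fast, matching the occupant's sense of motion.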
- the processor 120 may control the image change rate between the virtual driving environment images displayed on each of a plurality of display apparatuses 110, based on the movement of the autonomous vehicle 1 sensed by the motion sensing device 238. For example, when the display device 110 is formed in each of the area of the front window, the area of the right side window, and the area of the left side window of the autonomous vehicle 1, and the autonomous vehicle 1 turns right, the processor 120 may, based on the movement of the autonomous vehicle 1, control the image change rate of the virtual driving environment images displayed on the display device 110 formed in the area of the left side window differently from the image change rate of the virtual driving environment images displayed on the display device 110 formed in the area of the right side window.
- for example, when the autonomous vehicle 1 turns right, the processor 120 may control the development speed of the virtual driving environment images displayed on the display device 110 formed in the area of the left side window to be faster than the development speed of the virtual driving environment images displayed on the display device 110 formed in the area of the right side window.
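- Deriving different per-window rates during a turn, as described above, can be sketched from a base rate and the sensed yaw rate: the outer side of the turn (the left window in a right turn) sweeps past faster. The gain constant and function name are illustrative assumptions.

```python
def per_window_rates(base_rate, yaw_rate_deg_s, gain=0.02):
    """During a right turn (positive yaw rate) the scenery on the left,
    outer side sweeps past faster, so its images advance faster than
    the right side's; the front window keeps the base rate."""
    turn = max(yaw_rate_deg_s, 0)      # only model right turns here
    return {
        "left":  base_rate * (1 + gain * turn),
        "right": base_rate * (1 - gain * turn / 2),
        "front": base_rate,
    }

rates = per_window_rates(1.0, yaw_rate_deg_s=25)   # turning right
```

With no yaw the three windows advance at the same rate; as the right turn sharpens, the left-window images develop progressively faster than the right-window images.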
- the processor 120 may determine, from among the areas of the plurality of vehicle windows, the area of the vehicle window on which to display the virtual driving environment image. According to an example, the processor 120 may make this determination based on the passenger's selection.
- the processor 120 may determine an area of the car window corresponding to the line of sight of the passenger among the areas of the plurality of car windows as an area of the car window to display the virtual driving environment image.
- the image sensor 228 of FIG. 2 may detect the occupant's gaze, and the processor 120 may determine the area of the car window corresponding to the detected gaze, from among the areas of the plurality of car windows, as the area of the window on which to display the virtual driving environment image.
- when there are a plurality of occupants, the area of the vehicle window corresponding to the gaze of a predetermined occupant among the plurality of occupants may be determined as the area of the window on which to display the virtual driving environment image. Alternatively, the processor 120 may stop detecting the gaze of the occupants and determine a preset window area as the area of the window on which to display the virtual driving environment image.
- FIG. 16 is a diagram illustrating a UI for selecting an area of a window to display a virtual driving environment according to an exemplary embodiment.
- the processor 120 may provide the passenger with a UI 1610 for selecting, from among the areas of the plurality of vehicle windows, the area of the window on which to display the virtual driving environment image.
- when the display apparatus 110 is formed in the area of the front window 401 of the autonomous vehicle 1, the area of the left side window 403, the area of the right side window 402, the area of the rear window 404, and the area of the ceiling window 405, the processor 120 may provide the passenger with a UI 1610 for selecting in which of these window areas to display the virtual driving environment image. Accordingly, the occupant may select the area of the vehicle window on which to display the virtual driving environment through the UI 1610.
- FIG. 17 is a diagram illustrating an embodiment in which a virtual driving environment image is displayed on an area of a car window corresponding to a line of sight of a passenger.
- the processor 120 may determine the areas 401 and 403 of the vehicle windows corresponding to the line of sight of the occupant 1710, from among the areas 401, 402, 403, and 405 of the plurality of vehicle windows, as the areas of the windows on which to display the virtual driving environment image. More specifically, the image sensor 228 may detect the gaze of the occupant 1710, and the processor 120 may determine the areas 401 and 403 of the windows located within a specific angle range of the gaze direction of the occupant 1710 as the areas of the windows on which to display the virtual driving environment image.
- likewise, when the gaze of the occupant 1710 changes, the processor 120 may determine the areas 401 and 402 of the vehicle windows corresponding to the gaze of the occupant 1710 as the areas of the windows on which to display the virtual driving environment image.
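- The gaze-based selection described above (windows within a specific angle range of the gaze direction) can be sketched with an angular-distance test. The window center angles and the 45° half-field-of-view below are illustrative assumptions in a vehicle-fixed frame, not disclosed values.

```python
def windows_in_gaze(gaze_deg, window_centers, half_fov_deg=45.0):
    """Windows whose center direction lies within a given angular range of
    the occupant's gaze direction (angles in degrees, vehicle frame)."""
    def angular_diff(a, b):
        # shortest signed difference folded to [0, 180]
        return abs((a - b + 180) % 360 - 180)
    return [name for name, center in window_centers.items()
            if angular_diff(gaze_deg, center) <= half_fov_deg]

centers = {"front_401": 0, "right_402": -90, "left_403": 90, "rear_404": 180}
selected = windows_in_gaze(45, centers)   # gazing between front and left
```

A gaze at 45° selects both the front and left windows, mirroring the case where the images are displayed on the areas 401 and 403 together.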
- the processor 120 may control the display apparatus 110 to display content selectable by the occupant.
- the content may be content such as a video or an image provided through the Internet or computer communication, or may be an image provided by the autonomous vehicle 1 itself.
- the processor 120 may provide the passenger with a UI for selecting content, and control the display device 110 to display the content selected by the passenger.
- FIG. 18 is a diagram illustrating a UI for selecting content to display on a display apparatus, according to an exemplary embodiment.
- the processor 120 may provide a passenger with a UI 1810 for selecting content to be displayed on the display apparatus 110.
- the processor 120 may provide the passenger with a UI 1810 capable of selecting YouTube, a movie library, or Netflix.
- the processor 120 may provide the passenger with a UI 1810 capable of selecting an image captured by the image sensor 228 installed in the vehicle window, and may also provide the passenger with a UI 1810 capable of selecting a virtual driving environment image.
- FIG. 19 is a diagram illustrating an embodiment of displaying a movie on a display device.
- the occupant 1910 may select a movie as content to be displayed on the display apparatus 110 through the UI 1810 of FIG. 18. Subsequently, the occupant 1910 may lie in the autonomous vehicle 1 and look at the area of the car window 405 corresponding to the ceiling.
- the processor 120 may control the display device 110 formed in the area of the window 405 corresponding to the ceiling, at which the occupant 1910 is looking, to display the movie.
- the processor 120 may determine whether a preset event occurs. When a preset event occurs, the processor 120 may provide the passenger with information associated with the preset event. According to an example, when a preset event occurs, the processor 120 may control the display apparatus 110 to display an image of the actual driving environment associated with the event. That is, when a preset event occurs, the processor 120 may allow the passenger of the autonomous vehicle 1 to view the actual driving environment corresponding to the event through the display device 110. When the preset event occurs while the display device 110 displays the virtual driving environment image, the processor 120 may control the display device 110 to switch from the virtual driving environment image to an image of the actual driving environment associated with the event.
- alternatively, the processor 120 may control the display apparatus 110 to simultaneously display the virtual driving environment image and the image of the actual driving environment associated with the event. That is, since the autonomous vehicle 1 displays the virtual driving environment image or other content through the display device 110 formed in the vehicle window, the occupant cannot see the actual driving environment around the autonomous vehicle 1; therefore, information on the preset event can be provided to the passenger separately.
- the preset event may be a situation in which the autonomous vehicle 1 is stopped for a preset time. For example, when the autonomous vehicle 1 is stopped for 30 seconds or more due to a traffic jam, the processor 120 may determine that the preset event has occurred. Subsequently, the processor 120 may control the display apparatus 110 to display the traffic congestion situation, which is the actual driving environment associated with the preset event.
- the preset event may be a situation in which the weather around the autonomous vehicle 1 changes. For example, when the weather around the autonomous vehicle 1 changes from clear to rainy, the processor 120 may determine that the preset event has occurred. Subsequently, the processor 120 may control the display apparatus 110 to display a rainy image captured by the image sensor 228, which shows the actual driving environment associated with the preset event.
- the preset event may be a change in the physical state of the occupant of the autonomous vehicle 1. For example, when the occupant changes from a waking state to a sleeping state, the processor 120 may determine that the preset event has occurred. More specifically, the image sensor 228 may photograph the occupant's eyes, and the processor 120 may determine that the occupant is in a sleep state when the occupant's eyes are closed beyond a reference ratio compared to the normal state, or when the occupant's eyes remain closed for longer than a reference time. Subsequently, the processor 120 may stop displaying the virtual driving environment image and turn off the interior lighting 245 of the autonomous vehicle 1 so as not to disturb the occupant's sleep.
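The sleep-state decision described in this paragraph can be sketched as a simple threshold rule. The threshold values and function names below are illustrative assumptions; the patent only specifies "a reference ratio" and "a reference time".

```python
# Illustrative sketch of the sleep-state decision: the occupant is judged
# asleep if the eyes are closed beyond a reference ratio compared to the
# awake baseline, OR have stayed closed longer than a reference time.

CLOSURE_RATIO_THRESHOLD = 0.8    # assumed reference ratio (vs. normal state)
CLOSED_DURATION_THRESHOLD = 5.0  # assumed reference time, in seconds

def is_sleeping(eye_closure_ratio, closed_duration_s):
    """Decide sleep state from eye measurements taken by the image sensor."""
    return (eye_closure_ratio > CLOSURE_RATIO_THRESHOLD
            or closed_duration_s > CLOSED_DURATION_THRESHOLD)

def on_occupant_state(sleeping, display, lighting):
    """When sleep is detected, stop the virtual image and dim the cabin."""
    if sleeping:
        display["virtual_image"] = False   # stop the virtual scene
        lighting["interior_on"] = False    # turn off interior lighting
```

The `or` between the two conditions mirrors the text: either criterion alone suffices to declare the sleep state.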
- FIG. 20 is a diagram illustrating a UI for setting an event, according to an exemplary embodiment.
- the processor 120 may provide a UI 2010 for setting an event for the occupant.
- the occupant may use the UI 2010 to preset which events (e.g., the vehicle suddenly changes speed, the vehicle enters a freeway, the vehicle is located near a landmark, the vehicle arrives at its destination, the weather changes, the road condition becomes dangerous, or an emergency vehicle is nearby) should trigger the provision of information. Accordingly, when an event selected in the UI 2010 occurs, the processor 120 may provide information about the selected event to the occupant.
- FIG. 21 is a diagram illustrating an embodiment of providing a passenger with information about an event when a preset event occurs.
- the processor 120 may control the virtual driving environment image 2110 to be displayed on the display device 110.
- the autonomous vehicle 1 may detect that a wild animal has suddenly appeared while driving, and may suddenly change speed. Subsequently, the processor 120 may determine that the preset event, namely a sudden speed change, has occurred. The processor 120 may then control the display apparatus 110, which is displaying the virtual driving environment image 2110, to display an image 2120 of the wild animal, which shows the actual driving environment associated with the preset event, in a partial region.
- FIG. 22 is a diagram illustrating an embodiment of providing a passenger with information about an event when a preset event occurs.
- the processor 120 may control the virtual driving environment image 2210 to be displayed on the display device 110.
- the autonomous vehicle 1 may recognize, while driving, that its current location is near a landmark, and the processor 120 may determine that the preset event, namely being located near a landmark, has occurred. Subsequently, the processor 120 may control the display apparatus 110 to replace part of the virtual driving environment image 2210 with a captured image 2220 of the landmark, which shows the actual driving environment associated with the preset event.
- alternatively, the processor 120 may process a region of the display device 110 displaying the virtual driving environment image 2210 to be transparent, so that the passenger may view the actual driving environment 2220 through the transparently processed display device 110.
- FIG. 23 is a diagram illustrating an embodiment of providing a passenger with information about an event when a preset event occurs.
- the processor 120 may control the virtual driving environment image 2310 to be displayed on the display device 110.
- the processor 120 may recognize that it is raining in the surroundings while the autonomous vehicle 1 is driving, and determine that a preset event has occurred. Subsequently, the processor 120 may inform the occupant via the sound output unit 282 that it is raining.
- FIG. 24 is a flowchart of a method of operating an autonomous vehicle, according to an exemplary embodiment.
- the method shown in FIG. 24 may be a method performed in time series by the autonomous vehicle 1 described in the drawings.
- the autonomous vehicle 1 may acquire a virtual driving environment image that replaces the actual driving environment of the autonomous vehicle 1.
- the autonomous vehicle 1 may generate a virtual driving environment image based on information about the driving route from its current position to the destination. More specifically, the autonomous vehicle 1 may obtain information about the driving route from its current position to the destination and generate a virtual driving environment image by reflecting the obtained driving route in a preset virtual reality. In addition, the autonomous vehicle 1 may generate virtual driving environment images corresponding to each of the points on its driving route.
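The per-point generation described above can be sketched as a mapping from route points to virtual-environment frames. The route representation, scene name, and frame naming below are hypothetical illustrations, not details from the patent.

```python
# Minimal sketch: one virtual driving environment frame per point on the
# driving route, so the frames can later be displayed consecutively as the
# vehicle passes each point.

def frames_for_route(route_points, scene="seaside"):
    """Return a frame descriptor for each route point in order."""
    return [{"point": p, "frame": f"{scene}_{i:04d}"}
            for i, p in enumerate(route_points)]

# Example route as (latitude, longitude) pairs (illustrative values).
route = [(37.0, 127.0), (37.1, 127.1), (37.2, 127.2)]
frames = frames_for_route(route)
```

Displaying `frames` consecutively, in route order, is what later gives the occupant the impression of driving through the virtual environment rather than the actual one.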
- the autonomous vehicle 1 may generate a virtual driving environment image based on an image of the actual driving environment of the autonomous vehicle 1.
- the autonomous vehicle 1 may acquire an image of its actual driving environment and generate a virtual driving environment image based on the acquired image. More specifically, the autonomous vehicle 1 may recognize the road shape from the image of the actual driving environment and generate a virtual driving environment image by reflecting the recognized road shape in a virtual reality.
- the autonomous vehicle 1 may acquire a virtual driving environment image from an external network.
- the autonomous vehicle 1 may acquire, as moving images, a virtual driving environment image corresponding to the operation of the autonomous vehicle 1 driving straight and a virtual driving environment image corresponding to the operation of the autonomous vehicle 1 turning left or right.
- the autonomous vehicle 1 may acquire a virtual driving environment image through the input device 260. More specifically, the input device 260 of the autonomous vehicle 1 may receive a selection of a virtual driving environment from the user and acquire an image representing the virtual driving environment selected by the user.
- the autonomous vehicle 1 may control the display device formed in the area of the window of the autonomous vehicle 1 to display the virtual driving environment image.
- the autonomous vehicle 1 may control the display device to continuously display virtual driving environment images corresponding to each of the points on the driving route of the autonomous vehicle 1.
- the autonomous vehicle 1 may control the display device to play back, as moving images, a virtual driving environment image corresponding to an operation in which the autonomous vehicle 1 travels straight and a virtual driving environment image corresponding to an operation in which the autonomous vehicle 1 turns left or right.
- the window of the autonomous vehicle 1 may display the virtual driving environment selected by the user in step 2410. That is, the vehicle window of the autonomous vehicle 1 may display an image representing the selected virtual driving environment.
- FIG. 25 is a detailed flowchart embodying step 2420 of FIG. 24.
- the autonomous vehicle 1 may sense a movement of the autonomous vehicle 1.
- the autonomous vehicle 1 may sense a traveling speed, a position change, and a direction change of the autonomous vehicle 1.
- the autonomous vehicle 1 may sense a driving state or a stationary state of the autonomous vehicle 1.
- the autonomous vehicle 1 may control the display device to display the virtual driving environment image based on the sensed movement.
- the autonomous vehicle 1 may control the display device to display virtual driving environment images corresponding to each of the points on the driving route of the autonomous vehicle 1 based on the sensed movement. While the display device continuously displays the virtual driving environment images, when the stationary state of the autonomous vehicle 1 is sensed, the autonomous vehicle 1 may pause the continuous display of the virtual driving environment images.
- the autonomous vehicle 1 may control an image change rate of the virtual driving environment images displayed on the display device based on the sensed movement.
- the image change rate may be the speed at which the virtual driving environment images are advanced on the display device. Therefore, when the driving speed of the autonomous vehicle 1 is sensed, the autonomous vehicle 1 may control the rate of change between the virtual driving environment images displayed on the display device based on the sensed speed.
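The relationship between sensed speed and image change rate described above can be sketched as a simple proportional rule. The base frame rate and cruise speed below are assumed constants for illustration; the patent does not specify a scaling law.

```python
# Sketch of tying the image change rate to the sensed vehicle speed:
# a stationary vehicle pauses the virtual scene, and faster driving
# advances the virtual frames proportionally faster.

BASE_FPS = 30.0          # assumed playback rate at cruise speed
CRUISE_SPEED_KMH = 60.0  # assumed reference speed

def image_change_rate(sensed_speed_kmh):
    """Frames per second at which virtual environment images are advanced."""
    if sensed_speed_kmh <= 0:
        return 0.0       # stationary: pause the continuous display
    return BASE_FPS * sensed_speed_kmh / CRUISE_SPEED_KMH
```

The zero case corresponds to the earlier description of pausing the continuous display of virtual driving environment images when a stationary state is sensed.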
- the autonomous vehicle 1 may control the display apparatus to reproduce a virtual driving environment image corresponding to the straight driving as a moving image.
- the autonomous vehicle 1 may control the display apparatus 110 to reproduce, as a moving image, the virtual driving environment image corresponding to left-turn or right-turn driving when the autonomous vehicle 1 turns left or right.
- FIG. 26 is a detailed flowchart of a method of operating an autonomous vehicle, according to an embodiment.
- the method shown in FIG. 26 may be a method performed in time series by the autonomous vehicle 1 described above in the figures.
- the autonomous vehicle 1 may acquire a virtual driving environment image that replaces the actual driving environment of the autonomous vehicle 1.
- Step 2610 may correspond to step 2410 of FIG. 24.
- the autonomous vehicle 1 may control the display device formed in the area of the window of the autonomous vehicle 1 to display the virtual driving environment image.
- Step 2620 may correspond to step 2420 of FIG. 24.
- the autonomous vehicle 1 may determine whether a preset event occurs.
- the preset event may be at least one of: the vehicle suddenly changing speed, the vehicle entering a highway, the vehicle being located near a landmark, the vehicle arriving at its destination, the weather changing, the surrounding road condition becoming dangerous, and an emergency vehicle being in the vicinity.
- the autonomous vehicle 1 may control the passenger of the autonomous vehicle 1 to view the actual driving environment corresponding to the event through the display device.
- the autonomous vehicle 1 may control the display device to switch to displaying the image of the actual driving environment associated with the event.
- the autonomous vehicle 1 may control the display device to simultaneously display the virtual driving environment image and the image of the actual driving environment associated with the event.
- the device may include a processor, a memory that stores and executes program data, permanent storage such as a disk drive, a communication port for communicating with an external device, and a user interface device such as a touch panel, a key, or a button.
- Methods implemented by software modules or algorithms may be stored on a computer readable recording medium as computer readable codes or program instructions executable on the processor.
- the computer-readable recording medium may include magnetic storage media (e.g., read-only memory (ROM), random-access memory (RAM), floppy disks, hard disks) and optical reading media (e.g., CD-ROMs and DVDs (Digital Versatile Discs)).
- the computer readable recording medium can be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
- the medium is readable by the computer, stored in the memory, and can be executed by the processor.
- the present embodiment can be represented by functional block configurations and various processing steps. Such functional blocks may be implemented in any number of hardware and/or software configurations that perform particular functions.
- an embodiment may employ integrated circuit configurations, such as memory, processing, logic, and look-up tables, that can execute various functions under the control of one or more microprocessors or other control devices.
- the present embodiment may be implemented in a programming or scripting language such as C, C++, Java, or assembler, including various algorithms implemented with data structures, processes, routines, or other combinations of programming constructs.
- the functional aspects may be implemented with an algorithm running on one or more processors.
- the present embodiment may employ the prior art for electronic configuration, signal processing, and / or data processing.
- terms such as "mechanism", "element", "means", and "configuration" can be used broadly and are not limited to mechanical and physical configurations. These terms may encompass a series of software routines in conjunction with a processor or the like.
- the connections or connecting members of the lines between the components shown in the drawings illustrate functional connections and/or physical or circuit connections by way of example; in an actual device, they may be represented by various replaceable or additional functional connections, physical connections, or circuit connections.
Landscapes
- Engineering & Computer Science (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Combustion & Propulsion (AREA)
- Chemical & Material Sciences (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Human Computer Interaction (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Instrument Panels (AREA)
- Traffic Control Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to an autonomous vehicle and an operation method thereof, the autonomous vehicle displaying a virtual driving environment image, which replaces the actual driving environment of the autonomous vehicle, by means of a display device formed on an area of a window of the autonomous vehicle, thereby providing a passenger with a more realistic experience of the virtual driving environment.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP19196268.7A EP3597468A1 (fr) | 2015-07-30 | 2016-07-29 | Véhicule autonome et son procédé de fonctionnement |
| EP16830877.3A EP3330151A4 (fr) | 2015-07-30 | 2016-07-29 | Véhicule autonome et son procédé de fonctionnement |
| US15/744,391 US20180211414A1 (en) | 2015-07-30 | 2016-07-29 | Autonomous vehicle and operation method thereof |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201562199179P | 2015-07-30 | 2015-07-30 | |
| US62/199,179 | 2015-07-30 | ||
| KR10-2016-0054107 | 2016-05-02 | ||
| KR1020160054107A KR20170015112A (ko) | 2015-07-30 | 2016-05-02 | 자율 주행 차량 및 그의 동작 방법 |
| KR10-2016-0095969 | 2016-07-28 | ||
| KR1020160095969A KR102637101B1 (ko) | 2015-07-30 | 2016-07-28 | 자율 주행 차량 및 그의 동작 방법 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017018844A1 true WO2017018844A1 (fr) | 2017-02-02 |
Family
ID=57885208
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2016/008328 Ceased WO2017018844A1 (fr) | 2015-07-30 | 2016-07-29 | Véhicule autonome et son procédé de fonctionnement |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2017018844A1 (fr) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109781431A (zh) * | 2018-12-07 | 2019-05-21 | 山东省科学院自动化研究所 | 基于混合现实的自动驾驶测试方法及系统 |
| EP3536534A1 (fr) | 2018-03-07 | 2019-09-11 | Dr. Ing. h.c. F. Porsche AG | Véhicule autonome |
| CN110803019A (zh) * | 2018-08-06 | 2020-02-18 | 株式会社小糸制作所 | 车辆用显示系统及车辆 |
| CN110849386A (zh) * | 2018-08-21 | 2020-02-28 | 三星电子株式会社 | 用于向车辆提供图像的方法及其电子设备 |
| CN111703301A (zh) * | 2020-06-18 | 2020-09-25 | 北京航迹科技有限公司 | 一种车窗内容显示方法、装置、电子设备及可读存储介质 |
| CN111976742A (zh) * | 2019-05-22 | 2020-11-24 | 丰田自动车株式会社 | 信息处理装置、自动驾驶车辆、信息处理方法和存储介质 |
| US20210350144A1 (en) * | 2020-05-11 | 2021-11-11 | GIST(Gwangju Institute of Science and Technology) | Mixed reality-based experience simulator |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20120072020A (ko) * | 2010-12-23 | 2012-07-03 | 한국전자통신연구원 | 자율주행 시스템의 주행정보 인식 방법 및 장치 |
| US20120256945A1 (en) * | 2008-06-17 | 2012-10-11 | Digigage Ltd. | System for altering virtual views |
| KR20140144919A (ko) * | 2013-06-12 | 2014-12-22 | 국민대학교산학협력단 | 가상현실에서 변동되는 장애물 정보를 반영한 무인 자동차의 자율 주행 시뮬레이션 시스템 |
| US20150100179A1 (en) * | 2013-10-03 | 2015-04-09 | Honda Motor Co., Ltd. | System and method for dynamic in-vehicle virtual reality |
| KR20150083929A (ko) * | 2012-11-30 | 2015-07-20 | 구글 인코포레이티드 | 자율주행으로의 진입 및 자율주행에서의 해제 |
-
2016
- 2016-07-29 WO PCT/KR2016/008328 patent/WO2017018844A1/fr not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120256945A1 (en) * | 2008-06-17 | 2012-10-11 | Digigage Ltd. | System for altering virtual views |
| KR20120072020A (ko) * | 2010-12-23 | 2012-07-03 | 한국전자통신연구원 | 자율주행 시스템의 주행정보 인식 방법 및 장치 |
| KR20150083929A (ko) * | 2012-11-30 | 2015-07-20 | 구글 인코포레이티드 | 자율주행으로의 진입 및 자율주행에서의 해제 |
| KR20140144919A (ko) * | 2013-06-12 | 2014-12-22 | 국민대학교산학협력단 | 가상현실에서 변동되는 장애물 정보를 반영한 무인 자동차의 자율 주행 시뮬레이션 시스템 |
| US20150100179A1 (en) * | 2013-10-03 | 2015-04-09 | Honda Motor Co., Ltd. | System and method for dynamic in-vehicle virtual reality |
Cited By (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3536534A1 (fr) | 2018-03-07 | 2019-09-11 | Dr. Ing. h.c. F. Porsche AG | Véhicule autonome |
| CN110803019A (zh) * | 2018-08-06 | 2020-02-18 | 株式会社小糸制作所 | 车辆用显示系统及车辆 |
| CN110803019B (zh) * | 2018-08-06 | 2023-05-05 | 株式会社小糸制作所 | 车辆用显示系统及车辆 |
| CN110849386A (zh) * | 2018-08-21 | 2020-02-28 | 三星电子株式会社 | 用于向车辆提供图像的方法及其电子设备 |
| CN109781431A (zh) * | 2018-12-07 | 2019-05-21 | 山东省科学院自动化研究所 | 基于混合现实的自动驾驶测试方法及系统 |
| CN109781431B (zh) * | 2018-12-07 | 2019-12-10 | 山东省科学院自动化研究所 | 基于混合现实的自动驾驶测试方法及系统 |
| CN111976742A (zh) * | 2019-05-22 | 2020-11-24 | 丰田自动车株式会社 | 信息处理装置、自动驾驶车辆、信息处理方法和存储介质 |
| CN111976742B (zh) * | 2019-05-22 | 2023-12-01 | 丰田自动车株式会社 | 信息处理装置、自动驾驶车辆、信息处理方法和存储介质 |
| US20210350144A1 (en) * | 2020-05-11 | 2021-11-11 | GIST(Gwangju Institute of Science and Technology) | Mixed reality-based experience simulator |
| US11734934B2 (en) * | 2020-05-11 | 2023-08-22 | GIST(Gwangju Institute of Science and Technology) | Mixed reality-based experience simulator |
| CN111703301A (zh) * | 2020-06-18 | 2020-09-25 | 北京航迹科技有限公司 | 一种车窗内容显示方法、装置、电子设备及可读存储介质 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2017018844A1 (fr) | Véhicule autonome et son procédé de fonctionnement | |
| WO2018070717A1 (fr) | Procédé de fourniture d'une image d'obtention de visée à un véhicule, appareil électronique et support d'enregistrement lisible par ordinateur associé | |
| WO2019209057A1 (fr) | Procédé de détermination de position de véhicule et véhicule l'utilisant | |
| WO2019135537A1 (fr) | Dispositif électronique et procédé de correction d'emplacement de véhicule sur une carte | |
| WO2017217578A1 (fr) | Dispositif de commande de véhicule et procédé de commande associé | |
| WO2017039047A1 (fr) | Véhicule et procédé de commande associé | |
| WO2020145607A1 (fr) | Appareil électronique et procédé d'assistance à la conduite de vehicule | |
| WO2018092989A1 (fr) | Dispositif d'affichage et procédé de fonctionnement correspondant | |
| WO2022154299A1 (fr) | Dispositif de fourniture de plateforme de signalisation numérique, son procédé de fonctionnement, et système comprenant un dispositif de fourniture de plateforme de signalisation numérique | |
| WO2017022879A1 (fr) | Assistance de conduite de véhicule et véhicule la comprenant | |
| WO2020166749A1 (fr) | Procédé et système pour afficher des informations à l'aide d'un véhicule | |
| WO2020122281A1 (fr) | Dispositif d'affichage destiné à un véhicule | |
| WO2020235710A1 (fr) | Procédé de commande de véhicule autonome | |
| WO2020226343A1 (fr) | Appareil électronique et procédé d'aide à la conduite de véhicule | |
| WO2020226210A1 (fr) | Procédé de commande de véhicule autonome | |
| WO2017003013A1 (fr) | Appareil pour assistance à la conduite de véhicule, procédé de fonctionnement associé, et véhicule le comprenant | |
| WO2017171124A1 (fr) | Module externe et véhicule connecté à ce dernier | |
| WO2019208965A1 (fr) | Dispositif électronique et son procédé de fonctionnement | |
| WO2018143589A1 (fr) | Procédé et dispositif d'émission d'informations de voie | |
| WO2020189806A1 (fr) | Dispositif de commande de véhicule | |
| WO2021002487A1 (fr) | Dispositif de commande de véhicule et véhicule comprenant ledit dispositif | |
| WO2016021961A1 (fr) | Appareil de pilotage de lampe de tête de véhicule et véhicule comportant celui-ci | |
| WO2021002519A1 (fr) | Appareil pour fournir une annonce à un véhicule et procédé pour fournir une annonce à un véhicule | |
| WO2020204225A1 (fr) | Procédé de commande de véhicule | |
| WO2020159247A1 (fr) | Dispositif de sortie d'image |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16830877 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 15744391 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2016830877 Country of ref document: EP |