
WO2018168531A1 - Head-up display device - Google Patents

Head-up display device

Info

Publication number
WO2018168531A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
display device
display
head
Prior art date
Application number
PCT/JP2018/008080
Other languages
English (en)
Japanese (ja)
Inventor
望 下田
壮太 佐藤
Original Assignee
マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2017046877A external-priority patent/JP2018149894A/ja
Priority claimed from JP2017063195A external-priority patent/JP2018165098A/ja
Application filed by マクセル株式会社 (Maxell, Ltd.)
Publication of WO2018168531A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • B60K35/29 Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • B60K35/85 Arrangements for transferring vehicle- or driver-related data
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/16 Anti-collision systems

Definitions

  • the present invention relates to a head-up display device suitable for being mounted on a vehicle or the like and displaying various video information.
  • a vehicle video display device, a so-called head-up display device (hereinafter, HUD), that displays various information on the windshield of a vehicle has been put into practical use as one technology for displaying video in real space.
  • by displaying driving-related information in this way, the driving operation of the vehicle can be supported.
  • Patent Document 1 discloses a navigation system that performs a virtual route search by transmitting vehicle position information in a relay format by inter-vehicle communication between a plurality of vehicles.
  • Patent Document 2 discloses a driving support device that estimates, from changes in the relative positions of the host vehicle and surrounding vehicles, whether the driver intends to change the front-rear relationship with a surrounding vehicle, and changes the content displayed on the display accordingly.
  • when displaying information received from another vehicle on the HUD, the received items could simply be displayed individually as received, but it is necessary to display them so that they are communicated to the driver more easily. In particular, a camera image captured by a vehicle ahead, if displayed as-is during driving, may be too detailed and dangerous (the driver may gaze at it). In addition, when many pieces of information are acquired from the own vehicle and other vehicles, the information to be displayed on the HUD must be selected in consideration of its importance, and the display area must likewise be determined in consideration of safety.
  • An object of the present invention is to provide a head-up display device that displays a plurality of pieces of information from the own vehicle and other vehicles so that the driver can easily see and immediately grasp them.
  • to achieve the above object, a head-up display device acquires vehicle information of the own vehicle (own-vehicle information) from devices installed in the vehicle, and acquires information related to the occurrence of events at other vehicles (other-vehicle information) through a wireless transceiver installed in the vehicle that transmits and receives information by communicating wirelessly with other vehicles or with roadside devices installed on the road.
  • a control unit determines the video to be displayed on the video display device based on the own-vehicle information and the other-vehicle information.
  • when the number of other-vehicle information items to be displayed on the video display device exceeds a predetermined upper limit on the number of display items, the control unit determines the other-vehicle information to display in descending order of priority.
  • the head-up display device acquires vehicle information of the vehicle and video information around the vehicle by a device installed in the vehicle.
  • the control unit determines a video to be displayed on the video display device based on the vehicle information and surrounding video information. At this time, when an obstacle is detected in the surrounding video information, an obstacle area map indicating the positional relationship between the obstacle and the vehicle is displayed on the video display device.
  • the head-up display device acquires vehicle information of the vehicle and video information around the vehicle by a device installed in the vehicle.
  • the control unit determines a video to be displayed on the video display device based on the vehicle information and surrounding video information.
  • the vehicle is provided with another display tool different from the head-up display device so that the driver can visually recognize the external situation of the vehicle.
  • the video display device displays a notification area map indicating which of the other display tools the driver should check visually.
  • according to the present invention, it is possible to provide a head-up display device that displays a plurality of pieces of information from the own vehicle and other vehicles so that the driver can easily see and immediately grasp them.
  • FIG. 1 is a schematic diagram explaining the outline of a HUD mounted on a vehicle.
  • FIG. 3 is a block diagram showing the structure of the control system of the HUD.
  • FIG. 12 is a flowchart showing the basic operation of the HUD in the first embodiment.
  • FIG. 13 is a flowchart showing details of the communication process (S213 in FIG. 12).
  • FIG. 14 is a flowchart showing details of the display message determination process (S304 in FIG. 13).
  • FIG. 15 is a flowchart showing details of the display message selection process (S408 in FIG. 14).
  • a diagram showing the obstacle area map.
  • a diagram showing the notification method of an obstacle area.
  • FIG. 20 is a flowchart showing the basic operation of the HUD in the second embodiment.
  • FIG. 21 is a flowchart showing details of the display video change/determination process (S233 in FIG. 20).
  • a diagram showing an example of an instrument panel display operating in cooperation with the HUD.
  • a diagram showing the notification area map.
  • a diagram showing an example of the information classification markers displayed on the notification area map.
  • FIG. 1 is a schematic diagram for explaining an outline of a HUD mounted on a vehicle.
  • the HUD 1 is mounted on the vehicle 2 and projects an image generated by the video display device 30 onto the windshield 3 (front glass) of the vehicle 2 via a mirror 52.
  • the image reflected by the windshield 3 enters the eyes of the driver, and the driver visually recognizes the image from the HUD.
  • the video to be displayed includes information related to driving (information from the own vehicle and other vehicles) and supports driving operation.
  • the HUD 1 includes a vehicle information acquisition unit 10 that acquires various types of vehicle information 4, a control unit 20 that generates the video information to be displayed based on the acquired vehicle information, a mirror drive unit 50 that drives the mirror 52, and a speaker 60 that outputs audio notifications to the driver.
  • the vehicle information 4 includes vehicle-to-vehicle communication information and road-to-vehicle communication information received from other vehicles, in addition to speed information and gear information indicating the driving state of the vehicle.
  • the vehicle 2 also includes an instrument panel 70 that provides the vehicle information 4 to the driver and can operate in cooperation with the HUD 1.
  • the projection target is not limited to the windshield 3 and may be another member, such as a combiner, onto which the image can be projected.
  • the video display device 30 is configured by, for example, a projector having a backlight, an LCD (Liquid Crystal Display), or the like.
  • a self-luminous VFD (Vacuum Fluorescent Display) or the like may be used.
  • FIG. 2 is a schematic diagram showing an image display operation by the HUD.
  • a video for display is emitted from the video display device 30 installed at the lower part of the dashboard of the vehicle 2.
  • the image is reflected by the first mirror 51 and the second mirror 52 (for example, a concave mirror, a free-form surface mirror, a mirror having an optical axis asymmetric shape, etc.) and projected toward the windshield 3.
  • the first mirror 51 is fixed, and the second mirror 52 can be rotated by a mirror driving unit 50.
  • the rotatable second mirror 52 is simply referred to as “mirror 52”.
  • the image converged and projected from the mirror 52 is reflected by the windshield 3 and incident on the driver's eyes 5 to form an image on the retina so that the image can be visually recognized.
  • the driver sees the virtual image 9 present in front of the windshield 3.
  • the mirror drive unit 50 adjusts the display position of the virtual image 9 in accordance with the height of the driver's eyes. That is, by rotating the mirror 52 about its axis with the mirror drive unit 50, the position of the virtual image 9 can be moved vertically so that it is viewed at an easy-to-view position.
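  • the eye-height-dependent mirror adjustment described above can be sketched as follows. This is a minimal illustration only: the linear mapping, reference height, and angle limits are assumptions for the sketch, not values from the patent.

```python
def mirror_angle_for_eye_height(eye_height_mm: float,
                                ref_height_mm: float = 1200.0,
                                deg_per_mm: float = 0.01,
                                base_angle_deg: float = 25.0,
                                limits=(20.0, 30.0)) -> float:
    """Return a rotation angle (degrees) for mirror 52, clamped to
    hypothetical mechanical limits, so the virtual image 9 lands at a
    comfortable height for the measured driver eye height."""
    angle = base_angle_deg + (eye_height_mm - ref_height_mm) * deg_per_mm
    lo, hi = limits
    return max(lo, min(hi, angle))
```

In a real system the eye height would come from analysis of the in-vehicle camera 115 image, and the resulting angle would be passed to the mirror adjustment unit 28.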
  • FIG. 3 is a block diagram showing the configuration of the control system of the HUD 1 in this embodiment.
  • Various types of vehicle information 4 are input to the vehicle information acquisition unit 10 and sent to the control unit 20.
  • An electronic control unit (ECU, Electronic Control Unit) 21 in the control unit 20 generates a video signal (display content) displayed by the HUD 1 and various control signals for the HUD 1 based on the input vehicle information 4.
  • the audio output unit 22 generates an audio signal to the speaker 60.
  • the nonvolatile memory 23 stores a program executed by the ECU 21, and the memory 24 stores video information and control information.
  • the video display device 30 includes a light source 31 such as an LED or laser, an illumination optical system (not shown), and a display element 32 such as a liquid crystal element, and emits the video light generated by the display element 32 toward the mirror 52.
  • the light source adjustment unit 25 in the control unit 20 controls the light source 31 in the video display device 30.
  • the distortion correction unit 26 corrects distortion of the video signal to be displayed, and the display element driving unit 27 drives the display element 32 in the video display device 30 based on the corrected video signal.
  • the mirror adjustment unit 28 outputs a drive signal to the mirror drive unit 50 in order to adjust the position and orientation of the mirror 52.
  • the information processing / generation unit 33 processes the vehicle information 4 acquired from the own vehicle and other vehicles, and generates display video information (also referred to as a display message).
  • the display message determination unit 34 selects video information (display message) to be displayed from the plurality of acquired information, and determines the display position.
  • the road-to-vehicle communication wireless transmitter / receiver 113 performs road-to-vehicle communication between the vehicle 2 and roadside devices (roads, signs, signals, etc.) via the communication control unit 35.
  • the inter-vehicle communication wireless transceiver 114 performs inter-vehicle communication between the vehicle 2 and the surrounding vehicle (other vehicle) 2 ′ via the communication control unit 35.
  • the communication control unit 35 can also determine the timing at which communication processing is performed, and can limit the communication range and the vehicles that are communication partners.
  • the instrument panel 70 is a display that is mounted separately from the HUD 1, but can perform a display operation in cooperation with the HUD 1 under the control of the ECU 21.
  • in FIG. 3, the road-to-vehicle communication wireless transceiver 113 and the vehicle-to-vehicle communication wireless transceiver 114 are included in the control unit 20, but they may instead be mounted on the vehicle 2 outside the control unit 20.
  • FIG. 4 is a diagram illustrating an example of a hardware configuration related to acquisition of the vehicle information 4.
  • the acquisition of the vehicle information 4 is performed by an information acquisition device such as various sensors installed in the vehicle 2 under the control of an electronic control unit (ECU) 21 in the control unit 20, for example.
  • the function of each device will now be described. Note that not all of these devices are necessarily required to execute the operation of this embodiment, and other types of devices may be added as appropriate.
  • the vehicle speed sensor 101 acquires speed information of the vehicle 2.
  • the shift position sensor 102 acquires current gear information of the vehicle 2.
  • the steering wheel angle sensor 103 acquires steering wheel angle information.
  • the headlight sensor 104 acquires lamp lighting information related to On / Off of the headlight.
  • the illuminance sensor 105 and the chromaticity sensor 106 acquire external light information.
  • the distance measuring sensor 107 acquires distance information between the vehicle 2 and an external object.
  • the infrared sensor 108 acquires infrared information related to the presence / absence and distance of an object at a short distance of the vehicle 2.
  • the engine start sensor 109 detects engine On / Off information.
  • the acceleration sensor 110 and the gyro sensor 111 acquire acceleration gyro information including acceleration and angular velocity as information on the posture and behavior of the vehicle 2.
  • the temperature sensor 112 acquires temperature information inside and outside the vehicle.
  • the road-to-vehicle communication wireless transceiver 113 transmits and receives information between the vehicle 2 and roadside devices, and the vehicle-to-vehicle communication wireless transceiver 114 acquires information around the vehicle 2 by transmitting and receiving information with other surrounding vehicles 2'. This information includes event information (accident, obstacle detection, signal information, etc.) detected by the other vehicles 2'.
  • although the communicable distance of the vehicle-to-vehicle communication wireless transceiver 114 is limited, event information from distant points can be acquired via roadside devices installed along the road, using the road-to-vehicle communication wireless transceiver 113.
  • the camera (inside the vehicle) 115 and the camera (outside the vehicle) 116 respectively capture images of the inside and outside of the vehicle to obtain camera video information (inside / outside of the vehicle).
  • the camera (inside the vehicle) 115 captures, for example, the driver's posture, eye position, movement, and the like. By analyzing the obtained image, it is possible to acquire information such as the driver's fatigue status and eye height, for example.
  • the camera (outside the vehicle) 116 captures a situation around the vehicle 2 such as the front or rear. By analyzing the obtained image, for example, it is possible to grasp the presence of moving objects such as other vehicles and pedestrians in the vicinity, buildings and topography, road surface conditions (rain, snow, freezing, unevenness, etc.) It is.
  • a GPS (Global Positioning System) receiver 117 and a VICS (Vehicle Information and Communication System; registered trademark, hereinafter the same) receiver 118 acquire GPS information and VICS information by receiving GPS signals and VICS signals, respectively. They may be implemented as part of a car navigation system that acquires and uses these pieces of information. Hereinafter, embodiments of the present invention will be described as Examples 1 to 3.
  • FIG. 5 is a diagram for explaining an example of information transmission using inter-vehicle communication.
  • (A) shows transmission of event information.
  • when the leading vehicle A detects event information (accident, obstacle detection, road/road-surface/signal information, etc.), it notifies the following vehicle B by inter-vehicle communication.
  • the vehicle B displays the received event information on its own HUD and notifies the following vehicle C.
  • the vehicle C displays the received event information on its own HUD and notifies the subsequent vehicle D.
  • since the preceding vehicle sequentially transmits information to the following vehicles in a relay format, the driver of a following vehicle can grasp an event that has occurred ahead before arriving at the site. The driver can therefore prepare for the event and drive safely and with peace of mind.
  • (B) shows an example of the HUD display in vehicle C, which reads "event occurrence 90m ahead".
  • the meaning of the event-occurrence display is the same for each of the vehicles B to D, but the distance information shown on each vehicle's HUD differs because each vehicle is at a different travel position (distance from the event occurrence site).
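  • the relay behavior above can be sketched as follows: the event message carries the position of the occurrence site, and each receiving vehicle renders its own distance to that site before relaying the message onward. The field names and the one-dimensional road model are illustrative assumptions.

```python
def hud_text(event_site_m: float, own_position_m: float) -> str:
    """Render the event-occurrence message for one vehicle's HUD."""
    distance = event_site_m - own_position_m
    return f"event occurrence {distance:.0f}m ahead"

def relay(event: dict, vehicle_positions_m: list) -> list:
    """Return the HUD text each following vehicle would display for
    the same relayed event; only the displayed distance differs."""
    return [hud_text(event["site_m"], pos) for pos in vehicle_positions_m]
```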
  • FIG. 6 is a table showing the types of event information.
  • Event information can be used for driving prediction and exists in a wide variety. For example, information that is difficult to predict from ordinary road signs, such as an "accident" or "the state of the signal ahead", is particularly useful.
  • Such event information is acquired by cameras and various sensors provided in the preceding vehicle, or from control information (speed, running, stopping, etc.) of the preceding vehicle, and is transmitted to the following vehicle.
  • An ID is assigned to each event type so that event information can be easily managed.
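  • a minimal sketch of the ID assignment: one ID per event type, used as a compact key when registering, displaying, and relaying events. The concrete numbering below is an assumption for illustration; the patent only states that each event type is given an ID (FIG. 6).

```python
# Hypothetical event-ID table; the actual IDs are defined in FIG. 6.
EVENT_IDS = {
    1: "accident",
    2: "obstacle detection",
    3: "signal information",
    4: "road/road-surface information",
}

def event_name(event_id: int) -> str:
    """Look up the event type for a received ID."""
    return EVENT_IDS.get(event_id, "unknown")
```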
  • the received information may be displayed individually as it is, but it is preferable to display information that has been processed and summarized so that it can be more easily communicated to the user.
  • if a camera image is displayed as it is, the information is too detailed and it is dangerous for the driver to watch (gaze at) it while driving.
  • in addition, if transmitted as it is, the amount of information is large and the load on the communication line increases.
  • therefore, the information processing/generation unit 33 in the control unit 20 processes the received information, as far as possible, into a compact form such as a small amount of text or an icon.
  • the processed event information is displayed on the HUD (video display device 30) of the vehicle. If the processed event information is transmitted to another vehicle via the inter-vehicle communication wireless transceiver 114, the load on the communication line can be reduced.
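  • the compaction step performed by the information processing/generation unit 33 can be sketched as follows: instead of relaying a raw camera frame, only a small recognition result (event ID plus a few attributes) is displayed and forwarded. The payload layout, field names, and frame size below are illustrative assumptions.

```python
def compact_message(event_id: int, distance_m: int, attribute: str = "") -> bytes:
    """Encode an event as a short fixed-format payload for inter-vehicle relay."""
    payload = f"{event_id}|{distance_m}|{attribute}"
    return payload.encode("utf-8")

# For comparison: an uncompressed 1280x720 RGB camera frame, which would be
# far too large (and too distracting) to relay and display as-is.
raw_camera_frame_bytes = 1280 * 720 * 3
compact = compact_message(3, 20, "blue")   # e.g. "signal: blue, 20 m ahead"
```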
  • FIG. 7 is a diagram showing a display example of processed information.
  • (A) shows transmission of event information.
  • the vehicle A recognizes the state of the signal a and notifies the subsequent vehicle B, and the vehicle B displays the received information on the HUD.
  • when the preceding vehicle A is a large vehicle, the signal a is hidden from the following vehicle B, so such a HUD display is effective.
  • the information transmitted from the vehicle A is the recognition result of the signal a (signal color information) by the camera.
  • (b) and (c) show examples of the HUD display in vehicle B.
  • in (b), the state of the signal a is processed into character information 41, "20 meters ahead, signal: blue", and displayed together with distance information (the distance from vehicle B to the signal a).
  • in (c), the state is processed into an intuitively understandable traffic-light image (icon) 42 and displayed. While driving, an image such as an icon is easier for the driver to grasp than text information and does not hinder driving. It is therefore also effective to switch, according to the driving state, between the display of (b) when the vehicle is stopped and the display of (c) while traveling.
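  • the display-mode switch suggested above can be sketched in a few lines: character information while the vehicle is stopped, an icon while traveling. The zero-speed threshold and the return values are assumptions for the sketch.

```python
def signal_display_mode(vehicle_speed_kmh: float) -> str:
    """Choose between the text display (b) and the icon display (c)
    according to the driving state reported by the vehicle speed sensor 101."""
    return "text" if vehicle_speed_kmh == 0 else "icon"
```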
  • FIG. 8 is a diagram illustrating an example of division of the HUD display area.
  • the display area is divided into three areas in the vertical direction, and the display areas are 1, 2, 3 from the top.
  • the central display area 2 is an area where the main part of the scenery (foreground) in front of the vehicle can be seen through the windshield 3, and the driver mainly pays attention to this area, and is the highest priority area.
  • in display area 2, virtual images that call attention to objects (oncoming vehicles, pedestrians, etc.) and events (accidents, obstacles) detected by the own vehicle's cameras and sensors are overlaid as necessary and displayed as augmented reality (AR).
  • the upper and lower display areas 1 and 3 are areas where there is little fear of hiding the aforementioned objects in the foreground, and visibility during driving is hardly impaired. They are therefore suitable for displaying information notified from other vehicles, and event information from other vehicles is basically displayed in display area 1 or 3. Since information received from other vehicles usually allows a time margin, it can be displayed early and viewed when the driver has spare attention. When urgent, suddenly occurring event information is received, it may exceptionally be displayed in display area 2.
  • the above display area classification is an example; information notified from other vehicles may be shown in display area 1 alone, and the vertical widths of display areas 1 to 3 may be variable.
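  • an illustrative mapping of information to the three display areas of FIG. 8: own-vehicle detections and urgent events go to the high-priority central area 2, ordinary other-vehicle notifications to area 1. The exact rule is a sketch of the behavior described above, not a definitive implementation.

```python
def choose_display_area(source: str, urgent: bool = False) -> int:
    """Pick a HUD display area (1, 2, or 3) for a piece of information.

    source -- "own" for own-vehicle detections, "other" for relayed info.
    """
    if source == "own" or urgent:
        return 2  # central area, overlaid on the foreground as AR
    return 1      # upper area for relayed other-vehicle information
```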
  • FIG. 9 is a diagram showing an example in which information is displayed in accordance with the region classification.
  • (A) shows transmission of event information.
  • the vehicle A detects the event 1 and notifies the succeeding vehicle B of the event 1 information.
  • the vehicle B not only receives the event 1 information from the vehicle A but also, when it detects the event 2 itself, adds the event 2 information to the event 1 information and notifies the following vehicle C.
  • the vehicle B displays both the event 1 information and the event 2 information on the HUD.
  • (B) shows a display example on the HUD in the vehicle B.
  • the event 1 information received from the other vehicle A is displayed in display area 1 of FIG. 8 (reference numeral 43). Since the event 2 information detected by the own vehicle B has a high priority, it is displayed in display area 2 of FIG. 8 (reference numeral 44). In display area 2, a virtual image is also superimposed on the object (pedestrian) detected by vehicle B and displayed as AR (reference numeral 45). By dividing the display area in this way, information can be displayed at positions that are easy for the driver to see, according to its priority and urgency.
  • in this embodiment, a management table for managing the received event information is created, and the information to be displayed is determined by referring to it. That is, an upper limit is set in advance on the number of display items, display priorities (weights) are set according to the display contents, and items are displayed in order of priority within the upper limit. For example, more urgent events and events occurring nearer are given priority. Similarly, an upper limit and weighting are applied to the number of items transmitted to other vehicles.
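  • the selection under an upper limit can be sketched as follows: events are sorted by weight (rank 1 = most urgent), nearer events preferred among equals, and only the top N are displayed. The field names, ranks, and limit are illustrative assumptions mirroring the description above.

```python
def select_for_display(events: list, max_items: int = 3) -> list:
    """Return the events to show on the HUD, at most max_items,
    ordered by rank (ascending = more urgent) then by distance."""
    ranked = sorted(events, key=lambda e: (e["rank"], e["distance_m"]))
    return ranked[:max_items]
```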
  • FIG. 10 is a diagram showing an example of the reception information management table.
  • in the reception information management table, every time event information is received, the event occurrence time, the event ID (type) (see FIG. 6), the distance from the own vehicle to the event occurrence position, the direction of the event occurrence position (relative to the own vehicle), and the transmission status and display status of the event information are recorded.
  • the reception information management table is stored in the memory 24, and the display message determination unit 34 determines information to be displayed with reference to this.
  • event information described in the management table is deleted from the table when it becomes unnecessary, that is, after the own vehicle has passed the event occurrence point and the information has already been transmitted to another vehicle.
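  • the cleanup rule just described can be sketched by modeling the management table of FIG. 10 as a list of records; an entry is dropped once the vehicle has passed the occurrence point and the entry has been forwarded. The field names are assumptions mirroring the table columns.

```python
def prune_table(table: list) -> list:
    """Drop entries that are both passed and already relayed; keep the rest."""
    return [row for row in table
            if not (row["passed"] and row["transmitted"])]
```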
  • FIG. 11 is a diagram illustrating an example of a priority determination table in event display.
  • (A) is a weighting table for each event, and weighting (here, ranks 1 to 5) is set for each event type.
  • (B) is an event attribute comparison table, and when there are a plurality of pieces of information having the same rank for each event weighting in (a), the priority is determined by comparing other attribute information.
  • This attribute information is described in the reception information management table of FIG.
  • in the comparison order, the distance to the event occurrence position is compared first, and nearer events are given priority. If the distances are the same, the direction of the event occurrence position is compared next. After that, the event occurrence time and the event reception time are compared.
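  • the comparison chain of FIG. 11 maps naturally onto a tuple sort key: per-event weighting first, then distance, direction, occurrence time, and reception time as successive tie-breakers. The direction ordering used here is an illustrative assumption.

```python
# Hypothetical ordering of directions for the tie-break; the patent does
# not specify one.
DIRECTION_ORDER = {"ahead": 0, "left": 1, "right": 2, "behind": 3}

def priority_key(event: dict) -> tuple:
    """Tuple key implementing the comparison order: rank, distance,
    direction, occurrence time, reception time."""
    return (event["rank"],
            event["distance_m"],
            DIRECTION_ORDER.get(event["direction"], 9),
            event["occurred_at"],
            event["received_at"])

def by_priority(events: list) -> list:
    return sorted(events, key=priority_key)
```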
  • the tables (a) and (b) listed here are merely examples, and may be appropriately changed according to the driver's request, the distance traveled, the passage of time, and the like.
  • FIG. 12 is a flowchart showing the basic operation of the HUD in the first embodiment.
  • (A) shows an initial operation
  • (b) shows a normal operation.
  • the normal operation of (b) includes communication processing and display message determination processing.
  • the following processing is controlled by an electronic control unit (ECU) 21 of the control unit 20.
  • the vehicle information acquisition unit 10 acquires the vehicle information 4 (S202).
  • the vehicle-to-vehicle communication wireless transmitter / receiver 114 and the road-to-vehicle communication wireless transmitter / receiver 113 are set to the On state to prepare for communication with other vehicles and roadside devices (S203).
  • An appropriate brightness level is calculated from the external light information obtained by the illuminance sensor 105, and the brightness level of the light source 31 is set by controlling the light source adjustment unit 25 (S204).
  • the information selected by the driver (for example, current vehicle speed information) is extracted from the acquired vehicle information 4 and the video to be displayed is determined (S205).
  • the distortion correction unit 26 corrects the image distortion generated in the projection optical system with respect to the display image (S206).
  • the display element drive unit 27 supplies a drive signal to the display element 32 (S207). It is determined whether or not a HUD display On signal has been received (S208), and the system waits until an On signal is received (S209). When the On signal is received, the light source 31 of the video display device 30 is turned on, and the video projection display, that is, the normal operation of the HUD is started (S210).
  • in the normal operation, the vehicle information 4 is continuously acquired via the vehicle information acquisition unit 10 (S211), and the brightness level of the display image is adjusted according to the current external light information (S212).
  • the vehicle-to-vehicle communication wireless transmitter / receiver 114 and the road-to-vehicle communication wireless transmitter / receiver 113 communicate with other vehicles and roadside devices under the control of the communication control unit 35 to transmit / receive event information (S213).
  • the information processing / generation unit 33 processes and generates information such as character information and icons suitable for HUD display based on information acquired through communication (S214).
  • the display message determination unit 34 determines the priority of them and determines a message to be displayed on the HUD (S215). Details of the communication process (S213) and the display message determination process (S215) will be described later.
  • the display element driving unit 27 updates the display image by controlling the display element (S216).
  • the mirror adjustment unit 28 and the mirror drive unit 50 adjust the mirror 52 (S217). It is determined whether or not a HUD display Off signal has been received (S218), and the processing from S211 is repeated until the Off signal is received.
  • the Off signal is received, the light source 31 of the video display device 30 is turned off, and the video projection display is terminated (S219).
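For illustration, the ambient-light brightness adjustment of steps S204 and S212 above can be sketched as follows. The lux thresholds and the number of brightness levels are assumptions made for this sketch; the embodiment does not specify the mapping from illuminance-sensor output to light-source level.

```python
def brightness_level(ambient_lux):
    """Map illuminance-sensor output (lux) to a discrete light-source
    brightness level. Thresholds are illustrative assumptions."""
    if ambient_lux < 50:        # night
        return 1
    if ambient_lux < 2000:      # dusk / overcast
        return 2
    if ambient_lux < 10000:     # daylight
        return 3
    return 4                    # direct sunlight
```

In the flow above, the returned level would be handed to the light source adjustment unit 25 to drive the light source 31.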
  • FIG. 13 is a flowchart showing details of the communication process (S213 in FIG. 12).
  • the communication process (S300) is controlled by the communication control unit 35; it communicates with other vehicles and roadside devices using the vehicle-to-vehicle communication wireless transceiver 114 and the road-to-vehicle communication wireless transceiver 113 to transmit and receive event information. Here, it is assumed that information is transmitted and received periodically at predetermined time intervals.
  • event information reception processing is performed (S302).
  • information such as an event ID and an occurrence time is registered in the reception information management table (FIG. 10).
  • a process for determining a message to be displayed from the registered information is performed (S304). Details of the display message determination process will be described later.
  • the received event information is also forwarded to other vehicles and roadside devices. For this purpose, the transmission destination and the information to be transmitted are first determined (S305), and the information transmission process is performed (S306). After transmission, the transmission state and display state in the reception information management table are updated (S307), and the communication process ends (S308).
  • In the above, communication is performed periodically at predetermined time intervals. Alternatively, an information acquisition request may be issued manually to another vehicle, or information may be transmitted when an acquisition request is received from another vehicle. If the acquired information is new or has been updated, it is forwarded to other vehicles; otherwise it is not transmitted.
  • For emergency information such as the occurrence of an accident or the presence of an obstacle, communication is started when triggered by detection by a camera or sensor.
  • In road-to-vehicle communication, when an information-holding roadside device installed on the road is found, the vehicle deposits the information it holds and acquires the information held by the roadside device.
  • FIG. 14 is a flowchart showing details of the display message determination process (S304 in FIG. 13).
  • the display message determination process (S400) is executed by the display message determination unit 34.
  • the upper limit (Nmax) of the number of messages that can be displayed on the HUD is acquired from the memory 24 (S401). Referring to the reception information management table (FIG. 10), the number N of received event information items is confirmed (S402). It is determined whether there is information detected by the own vehicle (S403); if so, that information is chosen for display (S404) and the upper limit of the number of displayable messages is recalculated (S405). In the simplest case, the number of items detected by the own vehicle is subtracted from the upper limit (Nmax) acquired in S401, but the calculation also takes into account the display area conditions described in the preceding figures.
  • FIG. 15 is a flowchart showing details of the display message selection process (S408 in FIG. 14).
  • the display message selection process (S500) is executed by the display message determination unit 34 when the number N of received information exceeds the displayable upper limit Nmax.
  • the weighted ranks of the received event information are compared, and the priority is determined from the highest rank (S502). It is determined whether or not a plurality of pieces of event information having the same weight (rank) exist (S503). If a plurality of pieces of event information exist, the attribute information in the reception information management table (FIG. 10) is referred to (S504). Then, the priority is determined according to the rules described in the event attribute comparison table (FIG. 11B).
  • Specifically, the distances to the event occurrence positions are compared first, and priority is given in order from the closest (S505). It is determined whether multiple events lie at the same distance (S506); if so, the directions of event occurrence are compared and priority is given in order of closeness to the traveling direction (S507). When multiple events lie in the same direction (Yes in S508), the occurrence times are compared and priority is given in order from the newest (S509). When multiple events have the same occurrence time (Yes in S510), priority is given to the event with the earlier reception time (S511).
  • the event information with the upper limit Nmax is selected in order of priority and used as the current display message (S512). This completes the display message selection process (S513).
  • In this way, the event information received from other vehicles is displayed on the HUD in a predetermined order of priority, so that the display is easy for the driver to see and high-priority information can be conveyed immediately.
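The priority rules of S502 through S512 above can be sketched as a single multi-key sort. The `Event` fields and function names below are illustrative assumptions; only the ordering rules (weight rank, distance, direction, occurrence time, reception time) come from the flowchart.

```python
from dataclasses import dataclass

@dataclass
class Event:
    event_id: str
    weight: int           # weighted rank; higher = more important (S502)
    distance_m: float     # distance to the event occurrence position (S505)
    direction_deg: float  # angular offset from the traveling direction (S507)
    occurred_at: float    # event occurrence time, epoch seconds (S509)
    received_at: float    # reception time, epoch seconds (S511)

def select_display_messages(events, n_max):
    """Order events by the tie-breaking rules S502-S511 and keep at most
    n_max of them as the current display messages (S512)."""
    ranked = sorted(
        events,
        key=lambda e: (
            -e.weight,             # highest weighted rank first
            e.distance_m,          # closest occurrence position first
            abs(e.direction_deg),  # closest to the traveling direction first
            -e.occurred_at,        # newest occurrence time first
            e.received_at,         # earliest reception time first
        ),
    )
    return ranked[:n_max]
```

Because Python's sort is stable and tuple keys are compared element by element, each later key only breaks ties left by the earlier ones, which matches the cascading decisions of the flowchart.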
  • the present invention is not limited to the above-described embodiments, and various forms as described below are possible.
  • In Embodiment 2, a configuration is described in which the positional relationship between an obstacle and the own vehicle is presented in an easy-to-understand manner when the obstacle is found by the outside-vehicle cameras 116 and their images are displayed on the HUD.
  • an image obtained by performing image processing on the video from the outside camera 116 as necessary and displaying it on the display is referred to as a “camera image”.
  • the camera image including obstacles is displayed on the HUD as a pop-up window.
  • the video from the camera or the like is displayed at a predetermined position on the left and right of the HUD screen, for example, so as not to disturb the actual scene (the scenery in front) that the driver should originally see.
  • However, because only the display position is fixed, the driver cannot immediately grasp in which area around the host vehicle the displayed camera image (including the obstacle) was sensed.
  • an obstacle area map and an obstacle marker indicating the area around the host vehicle where the obstacle is present are displayed.
  • FIG. 16 is a diagram illustrating an obstacle area map employed in the second embodiment.
  • In the figure, each area is indicated by a circled number; in the following description, the corresponding number in parentheses is used instead.
  • (a) shows an example of dividing the area around the host vehicle into sections.
  • When an obstacle (another vehicle, a pedestrian, etc.) is detected in any of the areas, the driver is notified, and the camera image is displayed on the HUD as necessary.
  • the area in front of the host vehicle (indicated by AR) is an area for AR display with a virtual image superimposed on the foreground by HUD.
  • (b) shows an example of the obstacle area map 81 displayed on the HUD screen 80.
  • the obstacle area map 81 abstracts the division of the surrounding area shown in (a), with a design in which the surroundings are divided into nine areas centered on the own vehicle. The position where an obstacle exists is displayed on the obstacle area map 81, so the driver can intuitively grasp where in the surrounding area the obstacle is simply by looking at the map.
  • the display position of the obstacle area map 81 on the HUD screen is preferably at the edge of the screen 80 in normal times, and moved toward the center of the screen 80 in dangerous areas such as intersections or during low-speed driving.
  • the display size of the obstacle area map 81 can be enlarged and reduced, and is preferably reduced during normal times and enlarged when an obstacle is detected.
  • (c) shows specific examples of obstacles in each area.
  • the obstacle area map 81 notifies the position of the obstacle.
  • the driver can grasp the presence of an obstacle in the areas (3) and (4) by looking at the display of the obstacle area map 81, and can perform an operation of confirming the rear vehicle with the side mirror.
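As an illustration, assigning a detected obstacle to one of the nine areas of the obstacle area map 81 can be sketched as a 3x3 grid lookup. The grid boundaries (1 m laterally, 5 m longitudinally) and the (row, col) indexing are assumptions for this sketch; the embodiment defines the areas only pictorially.

```python
def obstacle_area(dx_m, dy_m):
    """Assign an obstacle to a cell of a 3x3 grid centered on the host
    vehicle. dx_m: lateral offset (left negative, right positive);
    dy_m: longitudinal offset (ahead positive, behind negative).
    Returns (row, col), with row 0 = ahead and col 0 = left."""
    col = 0 if dx_m < -1.0 else (2 if dx_m > 1.0 else 1)
    row = 0 if dy_m > 5.0 else (2 if dy_m < -5.0 else 1)
    return row, col
```

The resulting cell would then be highlighted or marked on the obstacle area map 81, as described next.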
  • FIG. 17 is a diagram showing methods of notifying the obstacle area. Two methods of indicating in which area an obstacle exists are described using the obstacle area map 81 described above.
  • (a) is a method of displaying the obstacle area with emphasis relative to the other areas (reference numeral 82); for example, the corresponding area (9) is highlighted or made to blink.
  • (B) is a method of displaying the obstacle marker 83 in the corresponding area. For example, the obstacle marker 83 indicated by a triangle is displayed in the corresponding area (5). Both of the two notification methods shown here may be used, or one of them may be used.
  • As a result, the driver can immediately grasp where in the area around the own vehicle the obstacle is located, and can move on to checking the obstacle visually or by other means.
  • Furthermore, when a camera image including an obstacle is displayed on the HUD, the driver can immediately grasp in which area the displayed image was captured.
  • FIG. 18 is a diagram showing an example (display example 1) of displaying a camera image and an obstacle area map on the HUD.
  • an image 84 such as a camera is displayed on the left side of the HUD screen 80.
  • This image is an image taken by the outside camera 116 (in this case, a wide-angle near-field camera) and shows a crossing person who is an obstacle.
  • An obstacle area map 81 is displayed at the lower center of the screen 80, and an obstacle marker 83 is displayed in the area where the obstacle exists. Thereby, it is possible to notify the driver that the obstacle (crossing person) shown in the video 84 such as the camera exists in the left front area (area (1) in FIG. 16).
  • a plurality of obstacles may be detected by sensing each area outside the vehicle with a plurality of outside cameras 116.
  • displaying all detected obstacles in the obstacle area map 81 unnecessarily increases the movement of the driver's viewpoint. Therefore, it is preferable to limit the sensing area to be displayed according to the driving situation.
  • FIG. 19 is an example of a priority table of sensing areas.
  • This table defines which area should be preferentially sensed and displayed according to the driving situation.
  • This table is stored in the nonvolatile memory 23 or the like. For example, when the driving situation is a left turn, priority is given to the sensing of the left area (3) that should be watched most in the current driving, followed by the sensing of the front (wide area) area (2). That is, if there is an obstacle in the area (3), it is displayed, and if there is no obstacle in the area (3), the obstacle in the area (2) is displayed.
  • the priority area is similarly determined. If the driving situation changes, the priority sensing area, that is, the display target is also switched.
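The priority-table lookup described above can be sketched as follows. Only the left-turn row (area (3) first, then area (2)) is taken from the text; the other rows and all names are assumptions for illustration.

```python
# Sensing-area priority per driving situation (cf. FIG. 19).
# Only the "left_turn" row is stated in the text; the rest is assumed.
PRIORITY_TABLE = {
    "left_turn":  [3, 2],
    "right_turn": [4, 2],   # assumed by symmetry with the left turn
    "reverse":    [9, 8],   # assumed
}

def area_to_display(situation, obstacles_by_area):
    """Return the first priority area that contains an obstacle,
    or None when no priority area has one."""
    for area in PRIORITY_TABLE.get(situation, []):
        if obstacles_by_area.get(area):
            return area
    return None
```

For a left turn with an obstacle only in area (2), the function falls through area (3) and returns area (2), mirroring the behavior described above.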
  • Whether an obstacle becomes a display target is determined not only by its presence or absence but also by taking its danger into account. Specifically, the degree of risk is calculated from the relative distance and relative speed between the vehicle and the obstacle, and when the risk reaches a certain level, the obstacle is determined to be a display target. The position of an obstacle determined to be a display target is displayed on the obstacle area map 81, and a camera image including the obstacle is displayed as necessary. For an obstacle detected in the front (far) area (1) of FIG. 19, AR display is performed by superimposing a virtual image emphasizing the obstacle on the foreground.
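The text states only that the degree of risk is calculated from the relative distance and relative speed; a time-to-collision (TTC) threshold rule is assumed here as one possible sketch of that calculation, with illustrative threshold values.

```python
def risk_level(rel_distance_m, closing_speed_mps,
               ttc_warn_s=4.0, ttc_alarm_s=2.0):
    """Return 0 (no risk), 1 (warn) or 2 (alarm) from an assumed
    time-to-collision rule: TTC = distance / closing speed."""
    if closing_speed_mps <= 0:      # obstacle not approaching
        return 0
    ttc = rel_distance_m / closing_speed_mps
    if ttc < ttc_alarm_s:
        return 2
    if ttc < ttc_warn_s:
        return 1
    return 0
```

An obstacle would become a display target when the returned level reaches the threshold chosen for the obstacle area map 81.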
  • FIG. 20 is a flowchart showing the basic operation of the HUD in the second embodiment.
  • (a) shows the initial operation, and (b) shows the normal operation.
  • These flowcharts are based on the basic operation of the first embodiment (FIG. 12).
  • the difference from FIG. 12 is that display video change determination processing and instrument panel cooperation processing are performed. Only the differences will be described here.
  • the setting for displaying the HUD 1 in cooperation with the instrument panel 70 is performed. Specifically, the instrument panel cooperation function is enabled by the control unit 20 (ECU 21).
  • FIG. 21 is a flowchart showing details of the display video change / decision process (S233 in FIG. 20).
  • the display image change / determination process (S600) is repeatedly executed by the ECU 21 and the display message determination unit 34 at predetermined time intervals.
  • The ECU 21 determines the current driving situation from the vehicle information 4 acquired via the vehicle information acquisition unit 10 (S601). Specifically, the current running state (left turn, right turn, straight ahead, reverse, etc.) is determined from sensor information such as the vehicle speed sensor 101, the shift position sensor 102, and the steering wheel angle sensor 103. Referring to the sensing area priority table of FIG. 19, obstacle detection is performed in the order of the sensing areas prioritized for the current driving situation (S602). When an obstacle is detected in the corresponding area, its level of risk is determined.
  • both the emphasis display 82 and the obstacle marker 83 are displayed on the obstacle area map 81, but only one of them may be displayed depending on the level of danger. Further, the display content when no obstacle is present can be appropriately selected by the driver.
  • FIG. 22 is a diagram showing an example of additional display at the center of the obstacle area map 81.
  • the center of the obstacle area map 81 is an area indicating the position of the own vehicle, and is an empty area where no obstacle is displayed. Therefore, it is possible to display other additional information related to the driving operation using this empty area.
  • As the additional information, items such as an ADAS (Advanced Driver Assistance System) function indicator, a fuel gauge, and a speedometer are displayed in simplified form using the additional information marker 85.
  • FIG. 23 is a diagram showing an example of instrument panel display in cooperation with HUD (display example 2).
  • The area around the driver's seat is provided with various instrument panels.
  • a case in which a camera image 84 ′ is displayed on the center monitor 71 in cooperation with the HUD screen 80 is shown.
  • the camera video 84 and the obstacle area map 81 are displayed on the HUD screen 80.
  • the camera video 84 is switched to the center monitor 71 and displayed.
  • an obstacle marker 83 is displayed on the obstacle area map 81 in the HUD to inform the driver of which area the camera image 84 ′ displayed on the center monitor 71 belongs to.
  • Regarding the merits of instrument panel display: normally the driver rarely looks at the center monitor 71 while driving, whereas the obstacle area map 81 on the HUD can be seen with little viewpoint movement. The obstacle marker 83 on the obstacle area map 81 therefore notifies the driver that an obstacle exists.
  • the role of the HUD is to make the driver aware that there is an obstacle and that the center monitor 71 displays an image of the obstacle. Then, the driver is asked to view the camera image 84 ′ displayed on the center monitor 71.
  • Moreover, the video 84' displayed on the monitor looks clearer than when the same video 84 is displayed on the HUD. In other words, a display suited to the respective characteristics of the HUD and the center monitor 71 is achieved.
  • Displaying the camera image on a part of the instrument panel 70 other than the HUD, such as the center monitor 71, is effective not only when the HUD resolution is too low for the camera image, but also when an object is present ahead and displaying the camera image on the HUD would obstruct the forward view. Even in these cases, the driver can immediately grasp where the obstacle exists.
  • FIG. 24 is a diagram showing an example of switching display contents in instrument panel cooperation processing. It shows how the display content of the instrument panel (for example, the center monitor 71) transitions according to changes in the driving situation by setting the instrument panel cooperation function to be effective.
  • In the normal driving state (going straight, driving straight at low speed, or stopped), an arbitrary screen designated by the driver (one-segment broadcast, music, map, etc.) is displayed.
  • the display is switched to a video display such as a camera including the obstacle.
  • the display is switched to the video display of the camera in the priority area as in FIG. In this way, by making the HUD and the instrument panel cooperate, it is possible to perform a display that is easier to see for the driver.
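The switching behavior of FIG. 24 can be sketched as a small decision function; the state names and returned content labels are assumptions for this sketch, not part of the disclosure.

```python
def panel_content(driving_state, obstacle_detected, user_screen="map"):
    """Decide what the cooperating instrument panel (e.g. center
    monitor 71) should show for the current driving situation."""
    if obstacle_detected:
        return "camera:obstacle"        # switch to video including the obstacle
    if driving_state in ("left_turn", "right_turn"):
        return "camera:priority_area"   # per the priority table of FIG. 19
    return user_screen                  # normal driving: driver-designated screen
```

Checking the obstacle condition first reflects that obstacle video takes precedence over the turn-dependent priority-area video.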
  • As described above, according to the second embodiment, when an obstacle is discovered by the outside camera 116, the driver can immediately grasp the positional relationship between the obstacle and the own vehicle. Further, when a camera image including an obstacle is displayed on the HUD, the driver can immediately grasp where around the host vehicle the image was sensed.
  • Embodiment 3 describes a configuration that makes it easy to understand on which of the various display devices provided in the vehicle (HUD, instrument panel, rearview mirror, side mirrors, etc.) the information that the driver should check is displayed.
  • the digitization of vehicles has progressed, and it has become possible to obtain various information and provide it to the driver through various display devices including the HUD.
  • As a result, the driver must keep watching the various display devices, and the frequency with which the line of sight leaves the foreground increases, which is undesirable for safety. Therefore, a notification area map and an information type marker are displayed on the HUD to tell the driver which display tool should be checked.
  • FIG. 25 is a diagram illustrating a notification area map employed in the third embodiment.
  • In the figure, each area is indicated by a circled number; in the following description, the corresponding number in parentheses is used instead.
  • (a) shows an example of the various display tools with which a vehicle is equipped.
  • These include the windows (1) to (4) through which the area around the host vehicle is viewed directly, as well as the side mirrors (5), (6) and the rearview mirror (7) through which it is viewed via a mirror. In this example, the number of display tools is therefore eight.
  • Through the windows and mirrors the driver views the outside of the vehicle directly, whereas the in-vehicle display (8) shows external video from the outside camera 116. In addition, the HUD 80 presents obstacle information intuitively by AR display.
  • (b) shows an example of the notification area map 91 displayed on the HUD screen 80.
  • The notification area map 91 is designed with the surroundings divided into seven areas and the in-vehicle display (8) placed at the center, indicating which area around the host vehicle can be viewed with each display tool of (a). An information type marker, described later, is displayed in the area to be notified.
  • (c) shows the display tool name corresponding to each area of the notification area map 91; for example, area (1) corresponds to the left front window and area (7) to the rearview mirror.
  • FIG. 26 is a diagram illustrating an example of the information type marker displayed on the notification area map.
  • An information type marker 92 indicating the type of information to be conveyed is superimposed on the corresponding area of the notification area map 91 described above.
  • The types of information include not only approaching obstacles (other vehicles, pedestrians) but also various items such as the remaining fuel level and incoming calls, and easily distinguishable markers are used for each. The information type marker 92 is displayed in the area of the display tool on which the information appears.
  • The left side of FIG. 26 shows two display examples: the vehicle approach marker 92a is displayed in area (5) to prompt confirmation of the left side mirror, and the overspeed marker 92b is displayed in area (8) to prompt confirmation of the in-vehicle display (speedometer).
  • the driver can intuitively understand what type of information is transmitted in the corresponding notification area (display tool).
  • As a modified example of the notification area map 91, a configuration will now be described in which the HUD screen is divided into a plurality of notification areas and the information that the driver should check is displayed in the corresponding notification area.
  • FIG. 27 is a diagram showing a correspondence relationship between the area around the host vehicle and the HUD display area.
  • (a) shows an example of dividing the area around the host vehicle; the central area (5) is the short-distance range in the vehicle's forward direction.
  • (b) divides the HUD screen 80 into nine display areas (notification areas).
  • Information to be conveyed to the driver (for example, the discovery of an obstacle) is displayed in the corresponding notification area. The driver can thus immediately grasp what event is occurring in which direction as seen from the own vehicle, and can confirm it with the corresponding display tool (direct view, mirror, or in-vehicle display).
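Mapping a surrounding-area number of (a) to a notification area of the HUD screen in (b) can be sketched as follows, assuming row-major numbering of the nine areas and an illustrative screen resolution; neither is specified in the embodiment.

```python
def notification_rect(area_no, screen_w=1280, screen_h=480):
    """Return the (x, y, w, h) rectangle of the HUD notification area
    for surrounding-area number area_no (1..9, row-major order)."""
    if not 1 <= area_no <= 9:
        raise ValueError("area number must be 1..9")
    row, col = divmod(area_no - 1, 3)   # 3x3 grid of notification areas
    w, h = screen_w // 3, screen_h // 3
    return col * w, row * h, w, h
```

Event occurrence information for area (5), for instance, would then be drawn in the central rectangle of the screen 80.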
  • FIG. 28 is a diagram showing a display example of the HUD notification area.
  • (A) shows the situation of the surrounding area of the own vehicle, and an event occurs in each of the areas (2), (6), and (7).
  • (b) shows a HUD display example, and event occurrence information is displayed in the corresponding notification areas (2), (6), and (7).
  • The message may be displayed as character information or as an icon.
  • As described above, according to the third embodiment, the driver can immediately grasp on which display device in the vehicle the information to be checked is displayed. This reduces the frequency with which the driver's line of sight leaves the foreground, improving safety.
  • 1 Head-up display device (HUD)
  • 2 Vehicle
  • 3 Windshield
  • 4 Vehicle information
  • 9 Virtual image
  • 10 Vehicle information acquisition unit
  • 20 Control unit
  • 21 Electronic control unit (ECU)
  • 30 Video display device
  • 34 Display message determination unit
  • 35 Communication control unit
  • 50 Mirror drive unit
  • 52 Mirror
  • 60 Speaker
  • 70 Instrument panel (instrument panel)
  • 80 HUD screen
  • 81 Obstacle area map
  • 83 Obstacle marker
  • 84 Camera and other images
  • 85 Additional information marker
  • 91 Notification area map
  • 92 Information type marker
  • 113 Radio transceiver for road-to-vehicle communication
  • 114 Radio transmitter / receiver for inter-vehicle communication
  • 116 Camera (outside the vehicle).

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Instrument Panels (AREA)

Abstract

The present invention provides a head-up display device that displays a plurality of items of information from the own vehicle and from other vehicles so that they are easy to see and immediately understood by the driver. The head-up display device 1 acquires vehicle information (own-vehicle information) via devices installed in a vehicle 2, and acquires information on the occurrence of events from other vehicles (other-vehicle information) via wireless transmitters/receivers 113, 114 mounted in the vehicle. A control unit 20 determines, on the basis of the own-vehicle information and the other-vehicle information, the video to be displayed on a video display device 30. At that time, the display area of the video display device 30 is divided into an own-vehicle information display area and an other-vehicle information display area. Furthermore, a display priority is determined for each event type of the other-vehicle information, and when the number of items of other-vehicle information to be displayed on the video display device 30 exceeds a predetermined upper limit on the number of displays, the other-vehicle information to be displayed is determined in descending order of priority.
PCT/JP2018/008080 2017-03-13 2018-03-02 Dispositif d'affichage tête-haute WO2018168531A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2017046877A JP2018149894A (ja) 2017-03-13 2017-03-13 ヘッドアップディスプレイ装置
JP2017-046877 2017-03-13
JP2017063195A JP2018165098A (ja) 2017-03-28 2017-03-28 ヘッドアップディスプレイ装置
JP2017-063195 2017-03-28

Publications (1)

Publication Number Publication Date
WO2018168531A1 true WO2018168531A1 (fr) 2018-09-20

Family

ID=63523062

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/008080 WO2018168531A1 (fr) 2017-03-13 2018-03-02 Dispositif d'affichage tête-haute

Country Status (1)

Country Link
WO (1) WO2018168531A1 (fr)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1148828A (ja) * 1997-07-31 1999-02-23 Isuzu Motors Ltd 車両周辺監視装置
JP2002092793A (ja) * 2000-09-20 2002-03-29 Denso Corp 車両用情報表示装置
US20060287825A1 (en) * 1999-06-25 2006-12-21 Fujitsu Ten Limited Vehicle drive assist system
US20070182527A1 (en) * 2006-01-31 2007-08-09 Steve Traylor Collision avoidance display system for vehicles
JP2008062762A (ja) * 2006-09-06 2008-03-21 Fujitsu Ten Ltd 運転支援装置および運転支援方法
JP2008213753A (ja) * 2007-03-07 2008-09-18 Toyota Motor Corp 表示制御装置
JP2009009320A (ja) * 2007-06-27 2009-01-15 Toyota Motor Corp 車両用警報表示装置
JP2012030611A (ja) * 2010-07-28 2012-02-16 Denso Corp 車両用表示装置
JP2014099078A (ja) * 2012-11-15 2014-05-29 Toyota Motor Corp 運転支援装置及び運転支援方法
JP2016105580A (ja) * 2014-09-19 2016-06-09 メクラ・ラング・ノース・アメリカ,エルエルシー 車両、特に商用車における表示システムおよび表示方法


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210056774A1 (en) * 2018-10-05 2021-02-25 Panasonic Intellectual Property Corporation Of America Information processing method and information processing system
US11869279B2 (en) * 2018-10-05 2024-01-09 Panasonic Intellectual Property Corporation Of America Information processing method and information processing system
WO2020075826A1 (fr) * 2018-10-11 2020-04-16 日本電信電話株式会社 Appareil, procédé de transmission de données et programme
JPWO2020075826A1 (ja) * 2018-10-11 2021-09-02 日本電信電話株式会社 機器、データ送信方法及びプログラム
JP2021033404A (ja) * 2019-08-19 2021-03-01 東京パワーテクノロジー株式会社 危険予知活動支援システム
JP7288823B2 (ja) 2019-08-19 2023-06-08 東京パワーテクノロジー株式会社 危険予知活動支援システム
CN110782530A (zh) * 2019-08-28 2020-02-11 腾讯科技(深圳)有限公司 自动驾驶仿真系统中车辆信息的展示方法及装置
CN110782530B (zh) * 2019-08-28 2022-07-22 腾讯科技(深圳)有限公司 自动驾驶仿真系统中车辆信息的展示方法及装置
CN111483540A (zh) * 2020-04-07 2020-08-04 合肥工业大学 一种电动驱动式敞篷代步装置
CN112874533A (zh) * 2021-01-21 2021-06-01 江苏大学 一种基于v2x的汽车智能抬头显示系统及其方法
DE112023001204T5 (de) 2022-03-04 2025-01-16 Sony Group Corporation Informationsverarbeitungsvorrichtung, Informationsverarbeitungsverfahren und bewegter Körper


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18767640

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18767640

Country of ref document: EP

Kind code of ref document: A1