
WO2018139147A1 - Control device, head-mounted display, control method for control device, and control program - Google Patents

Control device, head-mounted display, control method for control device, and control program

Info

Publication number
WO2018139147A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
display device
area
image
instruction
Prior art date
Application number
PCT/JP2017/046642
Other languages
English (en)
Japanese (ja)
Inventor
嬋斌 倪
久雄 熊井
Original Assignee
シャープ株式会社
Priority date
Filing date
Publication date
Application filed by シャープ株式会社
Publication of WO2018139147A1

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • One aspect of the present invention relates to a control device or the like that controls the operation of a display device that displays a partial image of a designated display target region in a content image region.
  • Patent Document 1 discloses a technology related to panoramic video distribution.
  • Patent Document 2 discloses a technique related to the display of an omnidirectional image. These documents relate to techniques for causing a display device to display a partial image of a designated display target region of an image, such as an omnidirectional image, whose image region is too large to fit on one screen of the display device.
  • Patent Document 1: Japanese Unexamined Patent Application Publication No. 2015-173424 (published October 1, 2015). Patent Document 2: Japanese Patent Laid-Open No. 2015-18296 (published January 29, 2015).
  • One aspect of the present invention aims to realize a control device that can prevent an image in a predetermined image region from being overlooked when a user uses a display device that displays a partial image of a display target region among the image regions of content.
  • A control device according to one aspect of the present invention is a control device that controls the operation of a first display device that displays a partial image of a designated display target region among the image regions of content. The control device includes: an area specifying unit that specifies a first instruction area, which is an image area that the user of the first display device is instructed to view; and a guidance unit that performs guidance processing prompting the user to display an image of the first instruction area when the first instruction area specified by the area specifying unit is not included in the display target area.
  • A control method of the control device according to one aspect controls the operation of a first display device that displays a partial image of a designated display target region among the image regions of content. The method includes: an area specifying step of specifying a first instruction area, which is an image area that the user of the first display device is instructed to view; and a guidance step of performing guidance processing prompting the user to display an image of the first instruction area when the first instruction area specified in the area specifying step is not included in the display target area.
  • According to one aspect of the present invention, an effect of preventing an image in an instruction area from being overlooked can be achieved.
  • FIG. 1 is a block diagram showing an example of the main configuration of the display devices included in the control system according to Embodiment 1 of the present invention. FIG. 2 is a diagram showing a configuration example of the control system. FIG. 3 is a diagram showing the relationship between an omnidirectional image and a display target area.
  • Embodiment 1: An embodiment of the present invention will be described below with reference to the figures.
  • FIG. 1 is a block diagram illustrating an example of a main configuration of a display device (control device) 1A and a display device (control device) 1B included in a control system 3 according to the present embodiment.
  • the display devices 1A and 1B are head mounted displays (HMD) that are used by being mounted on the user's head.
  • In the present embodiment, an instructor (leader) uses the display device 1A to guide a trainee who uses the display device 1B.
  • the guidance is performed while displaying the same content on both the display device 1A and the display device 1B.
  • the display devices 1A and 1B are not limited to the HMD, and may be a personal computer equipped with a display, a television receiver, a smartphone, a tablet terminal, or the like.
  • the display device 1A is a device that displays content, and includes a control unit 10A, a sensor 18A, an input unit 19A, a communication unit 20A, a display unit 21A, and a storage unit 22A.
  • The control unit 10A integrally controls each unit of the display device 1A, and includes a line-of-sight direction specifying unit 11A, an instruction receiving unit (area specifying unit, designation receiving unit) 12A, an instruction transmitting/receiving unit (guidance control unit) 13A, an instruction display processing unit (guidance unit, instruction area display processing unit) 14A, a synthesis unit 15A, an omnidirectional image drawing unit 16A, and an insertion processing unit 17A.
  • the line-of-sight direction specifying unit 11A determines the line-of-sight direction of the user of the display device 1A from the output value of the sensor 18A.
  • the sensor 18A detects the orientation of the display device 1A, that is, the orientation of the face of the user wearing the display device 1A (front direction).
  • the sensor 18A may be configured by a six-axis sensor that combines at least two of a three-axis gyro sensor, a three-axis acceleration sensor, a three-axis magnetic force sensor, and the like.
  • the line-of-sight direction specifying unit 11A sets the direction of the user's face specified from the output values of these sensors as the line-of-sight direction of the user.
  • the sensor 18A may detect the position of the user's black eye.
  • the line-of-sight direction specifying unit 11A specifies the line-of-sight direction from the position of the user's black eye.
  • the sensor 18A may include a sensor that detects the orientation of the user's face and a sensor that detects the position of the user's black eyes.
  • the identification of the line-of-sight direction can also be realized by a configuration other than the above.
  • A camera installed outside the display device 1A may be used instead of the sensor 18A. In this case, a light emitting device provided on the display device 1A is blinked, the camera captures this, and the position and orientation of the display device 1A can be detected from the captured image.
  • Alternatively, the line-of-sight direction can be determined by back-calculation from the reception times, the angles of the light received at each point, or the time differences at which a light receiver provided in the display device 1A receives a laser emitted from an external light emitting device.
  • The instruction receiving unit 12A accepts an instruction from the instructor to the trainee. More specifically, the instruction receiving unit 12A receives an instruction for the trainee, which the instructor inputs via the input unit 19A.
  • The instruction in the present embodiment designates an instruction area, which is an image area in the content that the trainee is instructed to view, and prompts the trainee to view that area. Therefore, it can be said that the instruction receiving unit 12A specifies the instruction area. As will be described later, the instruction receiving unit 12A may also specify an instruction area designated in advance by referring to the storage unit 22A.
  • the instruction transmitting / receiving unit 13A transmits information related to the instruction (instructor's instruction) received by the instruction receiving unit 12A to the display device 1B via the communication unit 20A.
  • the instruction transmission / reception unit 13A causes the display device 1B to perform the guidance process in the above case.
  • the instruction transmission / reception unit 13A receives information related to the instruction (instructor's instruction) received by the display device 1B via the communication unit 20A.
  • The instruction display processing unit 14A displays information on the instruction received by the instruction receiving unit 12A on the display unit 21A. In addition, the instruction display processing unit 14A displays information on the instruction received by the instruction transmitting/receiving unit 13A on the display unit 21A. Furthermore, the instruction display processing unit 14A determines whether or not the instruction areas (first instruction area and second instruction area) designated in the instruction are included in the display target area, and when it determines that they are not, performs guidance processing that prompts the user to display the image of the instruction area.
  • The synthesis unit 15A causes the display unit 21A to display a partial image of the display target area in the omnidirectional image 23A.
  • The synthesis unit 15A also displays information related to the instruction of the instructor or the trainee superimposed on the partial image.
  • the synthesis unit 15A performs a process of superimposing and displaying an insertion image (related content) candidate to be described later on the partial image.
  • the omnidirectional image drawing unit 16A specifies the display target area in the omnidirectional image 23A from the line-of-sight direction specified by the line-of-sight direction specifying unit 11A. Then, the omnidirectional image drawing unit 16A causes the display unit 21A to display the partial image of the specified display target region in the image region of the omnidirectional image 23A via the synthesis unit 15A.
  • the omnidirectional image 23A may be a moving image or a still image.
  • The insertion processing unit 17A performs processing for displaying an insertion image related to the content displayed on the display device 1A on the display device 1B, via the communication unit 20A of the display device 1A and the communication unit 20B of the display device 1B.
  • the input unit 19A receives a user input operation, and outputs information indicating the content of the received input operation to the control unit 10A.
  • the input unit 19A may be a receiving unit that receives a signal indicating the content of a user input operation on a controller (not shown) from the controller.
  • the input unit 19A only needs to accept a pointing operation for designating a part of the image area of the content.
  • the communication unit 20A is for the display device 1A to communicate with another device (in this example, the display device 1B).
  • the communication unit 20A and the communication unit 20B of the display device 1B may communicate with each other via a full mesh (peer-to-peer) or via Ethernet (registered trademark).
  • the display unit 21A is a device (display device) that displays an image.
  • the display unit 21A may be a non-transmissive type or a transmissive type. When the transmissive display unit 21A is used, it is possible to provide the user with a mixed reality space in which the image displayed by the display unit 21A is superimposed on the visual field outside the display device 1 (real space).
  • the display unit 21A may be a display device externally attached to the display device 1A, or may be a normal flat panel display or the like.
  • the storage unit 22A stores various data used by the display device 1A.
  • the storage unit 22A stores an omnidirectional image 23A, instruction information 24A, and insertion information 25A.
  • the omnidirectional image 23A is an image obtained by imaging all directions from the imaging point.
  • the content displayed by the display device 1A is the omnidirectional image 23A.
  • the instruction information 24A is information indicating the content of the instruction and includes information indicating the instruction area.
  • The insertion information 25A is information used to display an insertion image, that is, an image related to the content displayed on the display device 1A (the omnidirectional image 23A).
  • the omnidirectional image 23A and the insertion image may be a moving image or a still image.
  • the display device 1B is a device that displays content, and includes a control unit 10B, a sensor 18B, an input unit 19B, a communication unit 20B, a display unit 21B, and a storage unit 22B, similar to the display device 1A. Since the sensor 18B, the input unit 19B, the communication unit 20B, and the display unit 21B have the same configurations as the sensor 18A, the input unit 19A, the communication unit 20A, and the display unit 21A, description thereof will be omitted.
  • the display devices 1A and 1B are simply referred to as the display device 1 when it is not necessary to distinguish between them.
  • The control unit 10B has the same configuration as the control unit 10A except that it does not include the insertion processing unit 17A. That is, the line-of-sight direction specifying unit 11B, instruction receiving unit (area specifying unit, designation receiving unit) 12B, instruction transmitting/receiving unit (guidance control unit) 13B, instruction display processing unit (guidance unit, instruction area display processing unit) 14B, synthesis unit 15B, and omnidirectional image drawing unit 16B have the same configurations as the line-of-sight direction specifying unit 11A, instruction receiving unit 12A, instruction transmitting/receiving unit 13A, instruction display processing unit 14A, synthesis unit 15A, and omnidirectional image drawing unit 16A, respectively.
  • The storage unit 22B has the same configuration as the storage unit 22A except that it does not store the insertion information 25A. The omnidirectional image 23B is the same image (content) as the omnidirectional image 23A.
  • the instruction information 24B is information indicating the content of the instruction similarly to the instruction information 24A, and includes information indicating the instruction area.
  • Since the display device 1B does not include the insertion processing unit 17A, the trainee cannot display an insertion image on the display device 1A. That is, in the control system 3, the authority to display insertion images is given only to the instructor-side display device 1A. Since this authority is given to the instructor (leader), it is called the instructor authority.
  • Alternatively, an instructor authority for displaying insertion images may be set for each of the display devices 1A and 1B included in the control system 3, or for each user. In this case, the display device 1B displays an insertion image on the display device 1A when the instructor authority is set for the device itself or its user, and does not display an insertion image on the display device 1A when the instructor authority is set for neither. The same applies to the display device 1A.
  • FIG. 2 is a diagram illustrating a configuration example of the control system 3.
  • In the example of FIG. 2, the instructor's display device 1A is an HMD, and the trainee's display device 1B is also an HMD.
  • the control system 3 shown in FIG. 1 also corresponds to this example.
  • Each of the display devices displays a partial image of a designated display target area among the image areas of the content. That is, the display device 1A displays a partial image of the display target region corresponding to the instructor's gaze direction in the content image region, and the display device 1B displays a partial image of the display target region corresponding to the trainee's gaze direction.
  • The control system 3 may include a plurality of instructor display devices 1A and trainee display devices 1B. Further, the content may be displayed two-dimensionally or three-dimensionally. If the display devices 1A and 1B are configured to communicate with each other via a wide area network such as the Internet, the instructor can instruct a remote trainee.
  • In another configuration example, the instructor's display device 1A is a two-dimensional display device and the trainee's display device 1B is an HMD. In this case, the instructor's display device 1A may display a partial image of a designated display target area in the image area of the content, or may display the entire image area of the content (see FIG. 4 below).
  • When the omnidirectional image is displayed on the two-dimensional display device, the partial image of the display target area of the omnidirectional image may be two-dimensionally mapped according to the screen size of the display device.
  • The display device 1B can also be used for a trainee to learn alone. In this case, the trainee's display device 1B is an HMD, and an image area to be noted during learning is set in advance; this enables learning similar to the case where the instructor points out an image area to be noted.
  • FIG. 3 is a diagram illustrating the relationship between the omnidirectional image and the display target area.
  • the omnidirectional image A1 is shown in a three-dimensional coordinate space defined by x, y, and z axes orthogonal to each other.
  • the omnidirectional image A1 forms an omnidirectional sphere that is a sphere of radius r.
  • The z-axis direction coincides with the vertical direction of the sensor 18A in real space, the y-axis direction coincides with the front direction of the sensor 18A, and the x-axis direction coincides with the left-right direction of the sensor 18A.
  • the line-of-sight direction specifying unit 11A determines which direction the sensor 18A is facing from the output value of the sensor 18A. Since the sensor 18A is mounted on the display device 1A in a predetermined orientation, if the user wears the display device 1A in the correct orientation, the orientation of the sensor 18A can be regarded as the user's line-of-sight direction. Therefore, hereinafter, the direction of the sensor 18A will be described as the user's line-of-sight direction.
  • The line-of-sight direction specifying unit 11A can express the line-of-sight direction as a combination of an azimuth angle (yaw) θ (−180° ≤ θ ≤ 180°), which is the rotation angle around the vertical axis (z-axis), and an elevation angle (pitch) φ (−90° ≤ φ ≤ 90°), which is the rotation angle around the horizontal axis (x-axis).
  • When the line-of-sight direction specifying unit 11A specifies the azimuth angle and elevation angle indicating the line-of-sight direction, the omnidirectional image drawing unit 16A obtains the intersection point P between the omnidirectional image A1 and a straight line extending from the user's viewpoint position Q in the direction indicated by the specified angles. Then, in the omnidirectional image A1, an area of height h and width w centered on the intersection P is specified as the display target area A11, and the omnidirectional image drawing unit 16A causes the display unit 21A to display the portion of the omnidirectional image A1 within the display target area A11.
  • the display target area A11 changes in conjunction with the user's line-of-sight direction, and the image displayed on the display unit 21A also changes accordingly.
  • the display on the display device 1B is the same as described above.
  • Here, the viewpoint position Q in the omnidirectional sphere is assumed to be stationary in order to simplify the description, but the viewpoint position Q may be moved in conjunction with the movement of the user in real space.
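The display target area determination described above (gaze direction given as azimuth and elevation, the intersection P with the omnidirectional sphere, and an area of height h and width w centered on P) can be sketched in Python. This is an illustrative sketch, not code from the patent: the function names, the z-up/y-front axis convention, and the degree-based area bounds are assumptions.

```python
import math

def gaze_to_point(q, r, yaw_deg, pitch_deg):
    """Intersection point P of the gaze ray from viewpoint position Q
    with the omnidirectional sphere of radius r.
    Axis convention (assumed): z = up, y = front, x = right."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    # Unit gaze direction: yaw rotates about the z-axis from the y (front)
    # direction; pitch tilts the ray toward +z.
    d = (math.sin(yaw) * math.cos(pitch),
         math.cos(yaw) * math.cos(pitch),
         math.sin(pitch))
    qx, qy, qz = q
    return (qx + r * d[0], qy + r * d[1], qz + r * d[2])

def display_target_area(yaw_deg, pitch_deg, w_deg, h_deg):
    """Angular bounds of the display target area A11: a region of
    width w_deg and height h_deg centered on the gaze direction."""
    return {
        "azimuth": (yaw_deg - w_deg / 2, yaw_deg + w_deg / 2),
        "elevation": (pitch_deg - h_deg / 2, pitch_deg + h_deg / 2),
    }
```

For example, a gaze straight ahead (yaw 0°, pitch 0°) from a stationary viewpoint at the origin intersects the unit sphere at the front point (0, 1, 0) under this convention.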
  • FIG. 4 is a diagram illustrating content displayed on the display device 1.
  • A display target area A11 of the instructor's display device 1A and a display target area A12 of the trainee's display device 1B are illustrated.
  • A position on the omnidirectional image A1 is expressed by an azimuth angle θ (−180° ≤ θ ≤ 180°) and an elevation angle φ (−90° ≤ φ ≤ 90°), which is the rotation angle around the horizontal axis (x-axis).
  • The azimuth angle at the left end of the omnidirectional image A1 is −180°, the azimuth angle at the right end is 180°, the elevation angle at the upper end is 90°, and the elevation angle at the lower end is −90°.
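Under the angle convention above (azimuth −180° at the left edge to 180° at the right, elevation 90° at the top to −90° at the bottom), a position on the omnidirectional image A1 can be mapped to pixel coordinates of an equirectangular projection. The following sketch is illustrative only; the equirectangular layout and the function name are assumptions, not part of the patent.

```python
def angles_to_pixel(azimuth_deg, elevation_deg, img_w, img_h):
    """Map an (azimuth, elevation) position on the omnidirectional image A1
    to pixel coordinates of an img_w x img_h equirectangular projection.
    Left edge = -180 deg, right edge = 180 deg, top = 90 deg, bottom = -90 deg."""
    # Horizontal: -180 deg maps to column 0, 180 deg to the last column.
    u = (azimuth_deg + 180.0) / 360.0 * (img_w - 1)
    # Vertical: 90 deg maps to row 0 (top), -90 deg to the last row.
    v = (90.0 - elevation_deg) / 180.0 * (img_h - 1)
    return round(u), round(v)
```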
  • the display target areas A11 and A12 are both part of the omnidirectional image A1, but the ranges in the omnidirectional image A1 are different. That is, in this example, the leader and the trainee are looking in different directions.
  • The instructor operates the input unit 19A to designate a position (instruction area) within the display target area A11 that the trainee is instructed to view.
  • The pointer M1 is displayed at the designated position on the display unit 21A of the instructor's display device 1A, and the display unit 21B of the trainee's display device 1B displays an arrow M2 that prompts the trainee to turn his or her line of sight toward the designated position.
  • An arrow M2 indicates the moving direction of the line of sight so that the position of the pointer M1 enters the display target area A12.
  • The trainee moves his or her line of sight to the right according to the arrow M2, and thereby captures the area designated by the pointer M1 in the field of view.
  • The instructor can thus prompt a trainee who has not seen the instruction area to view the image of the instruction area, so the effect of the instruction can be enhanced.
  • Moreover, since the arrow M2 is displayed on the display device 1B when the instructor simply designates the instruction area on the display device 1A, the instructor's operation is simple.
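The arrow M2 logic described above (display an arrow when the pointer M1 lies outside the display target area A12, pointing in the direction the line of sight should move) can be sketched as follows, simplified here to the azimuth axis only. This is an illustrative sketch; the function name and the use of the shortest angular offset are assumptions, not the patent's stated method.

```python
def guidance_arrow(pointer_az, view_az_min, view_az_max):
    """Return the arrow direction ("left"/"right") that guides the trainee's
    line of sight toward the pointer M1, or None when the pointer is already
    inside the display target area. Angles are degrees in [-180, 180]."""
    if view_az_min <= pointer_az <= view_az_max:
        return None  # pointer M1 already in the display target area A12
    center = (view_az_min + view_az_max) / 2
    # Shortest angular offset from the view center to the pointer,
    # wrapped into (-180, 180].
    delta = (pointer_az - center + 180) % 360 - 180
    return "right" if delta > 0 else "left"
```

With a display target area spanning −40° to 40°, a pointer at azimuth 100° yields a rightward arrow; the wrap-around case (e.g. a pointer at −170° seen from a view centered at 160°) also resolves to the shorter direction.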
  • FIG. 5 is a diagram illustrating an example in which the indication area in the display target area A11 is designated by a rectangular area surrounding the object.
  • An area M3 in FIG. 5 is a rectangular area of a size that can accommodate one object in the display target area A11. Such an area M3 may be specified by the instructor designating a range via the input unit 19A, or the display device 1A may automatically detect an object in the display target area A11 and specify the area M3 containing that object.
  • A display screen may also be used as the instruction area. For example, by using the display screen of a device that shows the measured value of a measuring instrument as the instruction area, the trainee can be prompted to confirm the measured value.
  • FIG. 6 is a diagram illustrating an example of the screen transitions of the instructor-side display device 1A and the trainee-side display device 1B when the guidance information is displayed.
  • The instructor-side display device 1A displays the display target area A11, and the trainee-side display device 1B displays the display target area A12.
  • The image displayed in the display target area A11 is different from the image displayed in the display target area A12. That is, the instructor and the trainee are looking at different positions in the omnidirectional image.
  • The instructor points to the tallest building in the display target area A11 by an input operation via the input unit 19A, and the pointer M1 is displayed in the display target area A11. Thereafter, the display content of the instructor-side display device 1A does not change unless the instructor performs an input operation or moves the line of sight.
  • The trainee-side display device 1B displays an arrow M2 to prompt the trainee to move the line of sight to the right.
  • When the trainee moves his or her line of sight according to the arrow M2, the building pointed to by the instructor enters the display target area A12, and the pointer M1 is displayed on the trainee-side display device 1B.
  • the instruction information 24 may be information as shown in FIG. 7, for example.
  • FIG. 7 is a diagram illustrating an example of the instruction information 24.
  • The instruction information 24 in FIG. 7 is table-format data in which information on “field of view”, “instructor”, “instruction area (azimuth angle)”, “instruction area (elevation angle)”, “detected object”, and “time zone” is associated.
  • “Field of view” is information indicating the image displayed on the display device 1 of the instructing user.
  • the “superimposed image” will be described in the second embodiment.
  • “Instructor” is information indicating the user who issued the instruction. Although details will be described later with reference to FIG. 11, an instruction can be given not only by the instructor but also by the trainee.
  • “Instruction area (azimuth angle)” and “instruction area (elevation angle)” are information indicating the instruction area. That is, the area specified by the azimuth range (unit: °) indicated by “instruction area (azimuth angle)” and the elevation range (unit: °) indicated by “instruction area (elevation angle)” is the instruction area.
  • For example, the instruction area for the instruction by instructor X1 is the rectangular area of the omnidirectional image having azimuth angles from −180° to −130° and elevation angles from 20° to 50°.
  • “Detected object” is information indicating an object detected at the pointed position, that is, in the instruction area. Object detection is as described above.
  • “Time zone” is information indicating the time zone (the playback time zone during content playback) in which attention is drawn to the instruction area.
  • the period during which pointing is performed may be used as it is as the “time zone”, or the predetermined time after the pointing is performed may be used as the “time zone”.
  • a period during which the object is displayed in the content may be set as a “time zone”.
  • the instruction display processing unit 14A may move the instruction area following the change in the display position of the object. Then, the instruction display processing unit 14A determines whether or not the instruction area after movement is included in the display target area, and may perform guidance processing when it is not included. Thereby, the moving object can be tracked. This processing may be performed by the instruction display processing unit 14B.
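The determination performed by the instruction display processing unit (whether the instruction area is contained in the display target area during the instruction's time zone, and, if not, whether to run guidance processing) can be sketched as follows, using the azimuth, elevation, and time-zone fields of the FIG. 7 table. The dictionary field names are illustrative assumptions, not the patent's data format.

```python
def needs_guidance(instr, display_area, t):
    """Return True when guidance processing should run: the instruction is
    active at playback time t (seconds) but its area is not fully contained
    in the display target area. Both `instr` and `display_area` hold
    (min, max) degree ranges under the assumed keys below."""
    t0, t1 = instr["time_zone"]
    if not (t0 <= t <= t1):
        return False  # instruction not active at this playback time
    a0, a1 = instr["azimuth"]
    e0, e1 = instr["elevation"]
    da0, da1 = display_area["azimuth"]
    de0, de1 = display_area["elevation"]
    # The instruction area is "seen" only when fully inside the display area.
    inside = da0 <= a0 and a1 <= da1 and de0 <= e0 and e1 <= de1
    return not inside
```

Applied to the FIG. 7 example (instructor X1's area at azimuth −180° to −130°, elevation 20° to 50°), a display target area centered straight ahead does not contain the instruction area, so guidance processing is triggered.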
  • the insertion information 25 may be information as shown in FIG. 8, for example.
  • FIG. 8 is a diagram illustrating an example of the insertion information 25.
  • The insertion information 25 shown in (a) of FIG. 8 is table-format data in which information on “content”, “instruction area (azimuth angle)”, “instruction area (elevation angle)”, “time zone”, and “insertion image” is associated.
  • Content is information indicating the content on which the insertion image is superimposed.
  • the “superimposed image” will be described in the second embodiment.
  • “Instruction area (azimuth angle)”, “instruction area (elevation angle)”, and “time zone” are information indicating the conditions for displaying an insertion image. More specifically, “instruction area (azimuth angle)” and “instruction area (elevation angle)” indicate the position conditions of pointing (designation of the instruction area), and “time zone” indicates the timing condition of pointing. For example, when the omnidirectional image A2 is displayed as the content and an instruction area is designated within the range of azimuth angles −70° to −10° and elevation angles 20° to 45°, the insertion image to be displayed is D2.
  • Similarly, when the conditions associated with the insertion image D3 are satisfied, the insertion image to be displayed is D3.
  • “instruction area (azimuth angle)” and “instruction area (elevation angle)” are associated with the contents of the corresponding inserted image.
  • For example, an image of the region indicated by “instruction area (azimuth angle)” and “instruction area (elevation angle)”, captured by an imaging device different from the one that captured the omnidirectional image, may be used as the insertion image of that region.
  • An insertion image with which none of “instruction area (azimuth angle)”, “instruction area (elevation angle)”, and “time zone” is associated (insertion image D1) becomes a display target regardless of which content area is designated as the instruction area and regardless of the designation timing. Further, an insertion image with which both the instruction area and the “time zone” are associated (object E1) becomes a display target only when both the designated position and the designated time-zone conditions are satisfied.
  • “Inserted image” is information indicating an inserted image to be displayed superimposed on “content”.
  • the insertion image may be stored in advance in the storage unit 22 or the like, or a part of the content may be used as the insertion image.
  • an object displayed in the instruction area may be detected and used as an insertion image. Thereby, this object can be tracked by the inserted image.
  • The information shown in (b) of FIG. 8 is data in a table format in which the “insertion image”, “instruction area (azimuth angle)”, “instruction area (elevation angle)”, and “time zone” information is associated.
  • “Instruction area (azimuth angle)” and “instruction area (elevation angle)” are information indicating an area to be displayed as an insertion image among the image areas of “insertion image”.
  • the insertion image D2 is a display target in the entire image region where the azimuth angle is ⁇ 70 ° to ⁇ 10 ° and the elevation angle is 20 ° to 45 °.
  • “Time zone” is information indicating a time zone to be displayed as an inserted image among the playback time zones of the “inserted image” that is a moving image.
  • For example, the insertion image D3 is a display target in the playback time zone from 00:03:00 to 00:05:00.
  • For an object, the object displayed as part of the omnidirectional image during the playback time zone indicated by “time zone” is the target to be displayed as the insertion image. For example, the object E1, which was displayed in the content (the omnidirectional image) during the time zone from 00:00 to 00:05:00, is a target to be displayed as an insertion image.
  • FIG. 9 is a diagram illustrating an example of screen transitions of the display device 1A on the instructor side and the display device 1B on the trainee side when displaying the insertion image.
  • the display device 1A on the instructor side displays the display target area A11
  • the display device 1B on the trainee side displays the display target area A12.
  • the image displayed in the display target area A11 is the same as the image displayed in the display target area A12. That is, the instructor and the trainee are looking at the same position in the omnidirectional image.
  • the display device 1A on the instructor side and the display device 1B on the trainee side both display the pointer M1. In addition, the display device 1A on the instructor side displays the insertion images D1 and D2 as selection candidates.
  • the insertion image to be displayed is specified from the insertion information 25 (see FIG. 8A). The user can select the insertion image D1 or D2 by an input operation via the input unit 19A. In the illustrated example, the insertion image D1 is selected.
  • the display device 1A on the instructor side and the display device 1B on the trainee side both display the selected insertion image D1.
  • the insertion image D1 is also displayed on the instructor side here, but it may be displayed only on the trainee side. Further, the insertion image D1 may be superimposed on the image of the display target area A11 or A12 and displayed together with that image.
  • in this way, insertion images corresponding to the position of the instruction area (the position where the pointer M1 is displayed) are presented as candidates, and when one of them is selected, the selected insertion image is displayed on the display device 1B on the trainee side. Therefore, the instructor can point to a point of interest with the pointer M1 and then easily and quickly present the insertion image concerning that point to the trainee.
  • FIG. 10 is a diagram illustrating an example of screen transitions of the display device 1A on the instructor side and the display device 1B on the trainee side when the trainee does not follow the guidance.
  • the display device 1A on the instructor side displays the display target area A11
  • the display device 1B on the trainee side displays the display target area A12.
  • the image displayed in the display target area A11 is different from the image displayed in the display target area A12. That is, the instructor and the trainee are looking at different positions in the omnidirectional image.
  • the display device 1A on the instructor side displays a pointer M1 and insertion images D1 and D2.
  • an arrow M2 is displayed on the display device 1B on the trainee side.
  • in addition, the display device 1B on the trainee side displays a reduced version of the image shown in the display target area A11 on the instructor side as an instructor-side image F1. This allows the trainee to recognize what image is displayed on the instructor side and at what position the pointer M1 is displayed.
  • the instructor-side image F1 only needs to include at least an instruction area.
  • for example, the instructor-side image F1 may be an image obtained by cutting out, from the omnidirectional image, the portion where the pointer M1 is displayed, or that portion together with a predetermined surrounding range.
  • when the insertion image D1 is selected on the display device 1A on the instructor side at time T12, the display device 1A displays the insertion image D1 in full screen at time T13.
  • at this time, the display device 1B on the trainee side displays the insertion image D1 instead of the instructor-side image F1.
  • this insertion image D1 may be a reduced version of the insertion image D1 displayed in full screen by the display device 1A, or may be a cut-out portion or a partial scene of the insertion image D1.
  • by not displaying the insertion image D1 in full screen on the trainee side, the trainee can continue viewing the image he or she wants to see while the insertion image D1 that the instructor wants to show is also presented to the trainee.
  • of course, the display device 1B may display the insertion image D1 in full screen as in the example of FIG. 9.
  • the insertion image D1 may be displayed on a part of the display screen, as in the example of FIG.
  • the display device 1A on the instructor side may display the insertion image D1 on a part of the display screen.
  • FIG. 11 is a diagram illustrating an example of screen transitions of the display device 1A on the instructor side and the display device 1B on the trainee side when the trainee asks the instructor a question.
  • the display device 1A on the instructor side displays the display target area A11
  • the display device 1B on the trainee side displays the display target area A12.
  • the image displayed in the display target area A11 is different from the image displayed in the display target area A12. That is, the instructor and the trainee are looking at different positions in the omnidirectional image.
  • the trainee points the position (in this example, the leftmost building in the display target area A12) where he / she wants to ask a question by an input operation via the input unit 19B.
  • the pointer M3 is displayed in the display target area A12.
  • the position of the pointer M3, a predetermined range around that position, or the object (building) displayed at the position of the pointer M3 becomes the instruction area. Thereafter, unless the trainee performs an input operation or moves his or her line of sight, the display content of the display device 1B on the trainee side does not change.
  • the display device 1A on the instructor side displays an arrow M2 pointing toward the instruction area to prompt the instructor to move his or her line of sight to the left.
  • when the instructor moves the line of sight according to the arrow M2, the building pointed to by the trainee enters the display target area A11, and the arrow M2 is no longer displayed on the display device 1A on the instructor side.
  • the instructor-side display device 1A displays the insertion images D1 and D3 as selection candidates.
  • the insertion images D1 and D3 are images whose contents correspond to the position where the pointer M3 is displayed. By selecting the insertion image D1 or D3, the instructor can display it on the display device 1B on the trainee side in the manner shown in FIG.
  • note that the designation of an instruction area by the trainee is not restricted to the purpose of asking a question.
  • FIG. 12 is a flowchart illustrating an example of a process in which the display device 1 displays an image
  • FIG. 13 is a flowchart illustrating an example of the insertion process performed in S6 of FIG.
  • the instructor designates an instruction area during reproduction of content on the display device 1A.
  • in S1, the line-of-sight direction specifying unit 11A specifies the line-of-sight direction of the user wearing the display device 1A and notifies the omnidirectional image drawing unit 16A of the specified line-of-sight direction.
  • in S2, the omnidirectional image drawing unit 16A specifies the display target area in the omnidirectional image 23A from the line-of-sight direction notified by the line-of-sight direction specifying unit 11A, and draws (displays) the image of the specified display target area on the display unit 21A via the synthesis unit 15A.
  • in S3, the instruction receiving unit 12A acquires instruction information. Specifically, the instruction receiving unit 12A receives an instruction from the instructor (an instruction to point at an arbitrary position in the display target area) via the input unit 19A, and acquires the instruction information by generating information indicating the received instruction content. Since the instruction information includes information indicating the instruction area, the instruction area is specified by the process of S3.
  • the instruction receiving unit 12A stores the acquired instruction information as instruction information 24A in the storage unit 22A, and the instruction transmitting / receiving unit 13A transmits the instruction information 24A to the display device 1B of the instructor.
  • in S4, the instruction display processing unit 14A determines whether or not the instruction target (instruction area) of the instruction information acquired in S3 is within the display target area. In other words, the instruction display processing unit 14A determines whether or not the instruction area is included in the display target area. If the instruction area is within the display target area (YES in S4), the process proceeds to S5; if not (NO in S4), the process proceeds to S8. When the designation of the instruction area is accepted only within the display target area, the instruction area is always within the display target area; therefore, the display device 1A on the instructor side may omit S4, S7, and S8 and proceed directly to S5.
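The S4 determination described above can be sketched as an angular containment test. This is a minimal sketch under assumptions: the field-of-view half-angles are illustrative values (the text does not specify them), and the azimuth wrap-around handling is a standard technique rather than something the document prescribes.

```python
def in_display_target(instr_az, instr_el, view_az, view_el,
                      half_fov_az=45.0, half_fov_el=30.0):
    """S4-style check: is the pointed direction inside the region currently
    drawn around the user's line of sight? All angles are in degrees."""
    # Wrap the azimuth difference into (-180, 180] so that, for example,
    # 170 deg and -170 deg count as only 20 deg apart.
    d_az = (instr_az - view_az + 180.0) % 360.0 - 180.0
    d_el = instr_el - view_el
    return abs(d_az) <= half_fov_az and abs(d_el) <= half_fov_el
```

With the illustrative 90°×60° field of view, an instruction area 10° to the right of the line of sight is inside the display target area, while one 100° away is not.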
  • the instruction display processing unit 14A displays the instruction content indicated by the instruction information acquired in S3 via the synthesis unit 15A. Specifically, the pointer M1 is superimposed and displayed on the instruction area in the image of the display target area.
  • in S6, the instruction display processing unit 14A determines whether or not the user has instructor authority. If it is determined that the user has instructor authority (YES in S6), the process proceeds to S7, where the insertion processing unit 17A performs the insertion processing, and then the process returns to S1. On the other hand, if it is determined in S6 that the user does not have instructor authority (NO in S6), the process returns to S1. Since the processing of the display device 1A on the instructor side is described here, the determination in S6 is YES.
  • the instruction display processing unit 14A determines whether or not the instruction area of the instruction information acquired in S3 is outside the display target area. If the designated area is outside the display target area (YES in S8), the process proceeds to S9. If the designated area is not outside the display target area (NO in S8), the process returns to S1.
  • the instruction display processing unit 14A displays guidance information that prompts the user to move his / her line of sight so that the instruction area outside the display target area is within the display target area via the synthesis unit 15A. Thereafter, the process returns to S1.
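The guidance display of S9 (the arrow M2 in FIG. 10) can be sketched as choosing the direction that leads the line of sight toward the instruction area. This is an illustrative reconstruction; the function name, the four-direction output, and the "larger angular gap first" heuristic are assumptions not stated in the text.

```python
def guidance_arrow(instr_az, instr_el, view_az, view_el):
    """S9 sketch: pick the arrow direction that moves the line of sight
    toward the instruction area. All angles are in degrees."""
    # Wrap the azimuth difference into (-180, 180] to take the shorter way around.
    d_az = (instr_az - view_az + 180.0) % 360.0 - 180.0
    d_el = instr_el - view_el
    # Prefer the axis with the larger angular gap (illustrative heuristic).
    if abs(d_az) >= abs(d_el):
        return "right" if d_az > 0 else "left"
    return "up" if d_el > 0 else "down"
```

For example, when the instruction area lies 60° to the left of the current line of sight, the arrow points left, matching the FIG. 11 scenario in which the instructor is prompted to move the line of sight to the left.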
  • the processing performed on the display device 1B on the trainee side when the trainee designates the instruction area with the pointer M3 or the like is the same as described above.
  • the instruction information 24A stored in advance in the storage unit 22A may be acquired.
  • in this case, if NO is determined in S4 and YES is determined in S8, the instruction display processing unit 14A displays the guidance information in S9. If NO is determined in S4 and NO is determined in S8, the process returns to S1. In this way, by storing the instruction information 24A in advance, it is possible to let the trainee advance learning without overlooking important parts of the content, even without real-time instructions from the instructor.
  • in S11, the insertion processing unit 17A refers to the insertion information 25A and determines whether there is an insertion image to be displayed. If there is an insertion image (YES in S11), the process proceeds to S12; if there is no insertion image (NO in S11), the process returns to S1 in FIG. 12.
  • the insertion processing unit 17A causes the insertion image candidate to be displayed superimposed on the image of the display target area via the synthesis unit 15A.
  • the candidate to be displayed may be a reduced version of the insertion image as shown in FIG.
  • in S13, the insertion processing unit 17A determines whether or not an insertion image has been selected. For example, the insertion processing unit 17A may determine that the insertion image corresponding to a candidate has been selected when that candidate is selected by the instructor. On the other hand, when a state in which none of the displayed candidates is selected continues for a predetermined time, or when an input operation by the instructor choosing not to display an insertion image is detected, it may be determined that no insertion image was selected. If it is determined in S13 that an insertion image has been selected (YES in S13), the process proceeds to S14; if it is determined that none has been selected (NO in S13), the process returns to S1 in FIG. 12.
  • in S14, the insertion processing unit 17A causes the display device 1B to display the insertion image by transmitting an instruction to display the selected insertion image to the display device 1B on the trainee side via the communication unit 20A. The insertion processing unit 17A may also display the selected insertion image on the display unit 21A via the synthesis unit 15A. Thereafter, the process returns to S1 in FIG. 12.
  • on the display device 1B on the trainee side, in S1 the line-of-sight direction specifying unit 11B specifies the line-of-sight direction of the user wearing the display device 1B, and in S2 the omnidirectional image drawing unit 16B draws (displays) the image of the display target area on the display unit 21B.
  • the instruction transmission / reception unit 13B acquires instruction information. Specifically, the instruction transmission / reception unit 13B acquires the instruction information transmitted by the instructor-side display device 1A through communication via the communication unit 20B. Since the instruction information includes information indicating the instruction area, the instruction area is specified by the process of S3.
  • the instruction transmitting / receiving unit 13B stores the acquired instruction information as instruction information 24B in the storage unit 22B.
  • the instruction display processing unit 14B determines whether or not the instruction target (instruction area) of the instruction information acquired in S3 is within the display target area. If the designated area is within the display target area (YES in S4), the process proceeds to S5, and if the designated area is not within the display target area (NO in S4), the process proceeds to S8.
  • the instruction display processing unit 14B displays the instruction content indicated by the instruction information acquired in S3 via the synthesis unit 15B. Specifically, the pointer M1 is superimposed and displayed on the designated area in the display target area image.
  • in S6, the instruction display processing unit 14B determines whether or not the user has instructor authority. If it is determined that the user has instructor authority (YES in S6), the process proceeds to S7. On the other hand, if it is determined in S6 that the user does not have instructor authority (NO in S6), the process returns to S1. Since the processing of the display device 1B on the trainee side is described here, the determination in S6 is NO and the process returns to S1.
  • the instruction display processing unit 14B determines whether or not the instruction area of the instruction information acquired in S3 is outside the display target area. If the designated area is outside the display target area (YES in S8), the process proceeds to S9. If the designated area is not outside the display target area (NO in S8), the process returns to S1.
  • the instruction display processing unit 14B displays guidance information that prompts the user to move his / her line of sight so that the instruction area outside the display target area is within the display target area via the synthesis unit 15B. Thereafter, the process returns to S1.
  • the processing performed on the display device 1A on the instructor side when the trainee designates the instruction area with the pointer M3 or the like is the same as described above.
  • the instruction information 24B stored in advance in the storage unit 22B may be acquired.
  • in this case, if NO is determined in S4 and YES is determined in S8, the instruction display processing unit 14B displays the guidance information in S9. If NO is determined in S4 and NO is determined in S8, the process returns to S1.
  • the instructor-side image F1 described with reference to FIG. 10 can be displayed at the initiative of the display device 1B on the trainee side, or at the initiative of the display device 1A on the instructor side.
  • in the former case, the instruction display processing unit 14B determines whether the instruction area is included in the display target area after displaying the guidance information in S9. If it is not determined, within a predetermined period after the guidance information is displayed, that the instruction area is included in the display target area, the instruction display processing unit 14B cuts out an image including the instruction area from the omnidirectional image 23B and displays it on the display unit 21B as the instructor-side image.
  • in the latter case, the instruction transmission/reception unit 13A transmits the instruction information to the display device 1B on the trainee side and then inquires of the display device 1B whether the instruction area is included in the display target area of the display device 1B. If it is not confirmed, within a predetermined period after the instruction information is transmitted, that the instruction area is included in the display target area, the instruction transmission/reception unit 13A instructs the display device 1B to display the instructor-side image. As a result, on the display device 1B on the trainee side, the instruction display processing unit 14B cuts out an image including the instruction area from the omnidirectional image 23B and causes the display unit 21B to display it as the instructor-side image.
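The timeout-based fallback described above can be sketched as a single predicate. This is a hypothetical sketch: the function name and the 5-second timeout are assumptions (the text only says "a predetermined period"), and the predicate applies equally to the trainee-device-initiated and instructor-device-initiated variants.

```python
import time

def should_show_instructor_image(guidance_shown_at, area_in_view,
                                 timeout=5.0, now=None):
    """Return True once guidance (e.g. the arrow M2) has been shown for
    `timeout` seconds without the instruction area entering the display
    target area, i.e. when the cut-out instructor-side image F1 should be
    displayed instead. Timestamps are seconds on a monotonic clock."""
    if now is None:
        now = time.monotonic()
    return (not area_in_view) and (now - guidance_shown_at >= timeout)
```

The display loop would call this each frame; as soon as the trainee's line of sight brings the instruction area into view, the predicate returns False and the fallback image is never shown.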
  • FIG. 14 is a block diagram illustrating an example of a main configuration of the display device 200 that displays content in which a superimposed image is superimposed on an omnidirectional image.
  • the configuration of the display device 200 is the same on the instructor side and the trainee side.
  • the control unit 210 of the display device 200 includes a superimposition processing unit 221, and the storage unit 222 includes a superimposed image 223 and superimposed image management information 224.
  • the superimposition processing unit 221 displays the superimposed image 223 on the omnidirectional image according to the superimposed image management information 224.
  • FIG. 15 is a diagram illustrating content displayed on the display device 200.
  • the illustrated content is obtained by superimposing the superimposed images B1, B2, B3, and B4 on the omnidirectional image A1.
  • the superimposed image B1 is an image obtained by enlarging a clock installed in a building that is an imaging target of the omnidirectional image A1.
  • the time can be easily read by the instructor.
  • a superimposed image obtained by capturing them from the front angle may be displayed. As described above, the superimposed image may be captured at an angle different from that of the omnidirectional image A1.
  • the superimposed image B2 is an image showing an annotation related to the imaging target of the omnidirectional image A1. Specifically, the superimposed image B2 indicates the name of the building that is the imaging target and its tenants. By displaying information about the imaging target as an annotation in this way, the instructor's understanding of the imaging target can be deepened.
  • the superimposed image B3 is an image capturing a person. More specifically, the superimposed image B3 is an image of a person shown in the superimposed image B4, photographed at an angle different from that of the superimposed image B4, and the person appears larger than in the superimposed image B4. In this way, a superimposed image may be an image related to another superimposed image.
  • the superimposed image B4 is an image obtained by capturing a plurality of people.
  • the superimposed image B4 may be an image obtained by enlarging a region where the superimposed image B4 is displayed in the omnidirectional image A1.
  • the superimposed image B4 may be an image captured by an imaging device different from the imaging device that captured the omnidirectional image A1.
  • in this case, the superimposed image B4 can be given a higher resolution than the omnidirectional image A1, and a detailed picture of the imaging target included in the superimposed image B4 can be shown to the instructor and the trainee.
  • the instructor and the instructor can confirm various information related to the omnidirectional image A1 only by moving the line-of-sight direction.
  • the superimposed image may be an image related to the omnidirectional image.
  • the number of superimposed images to be superimposed on one omnidirectional image is not particularly limited, and may be one, for example.
  • the omnidirectional image A1 is an image in which the cityscape is an imaging target, but the imaging target is arbitrary.
  • the omnidirectional image A1 may be an image capturing a scene in which surgery is being performed.
  • the imaging target may include a surgeon, an assistant, a patient, a surgical instrument, various devices, and the like.
  • the display device 200 can be used for medical education.
  • in this case, the user can check the movements of each person while confirming the progress of the entire surgery in the omnidirectional image A1, and can thus learn what to look for during surgery.
  • for example, a superimposed image B3 from the surgeon's viewpoint may be displayed when used for surgeon education, and a superimposed image B3 from the assistant's viewpoint may be displayed when used for assistant education.
  • by using an image of a screen displaying the patient's vital data as the superimposed image B1, it is possible to make the user recognize the relationship between the transition of vital data during surgery and the corresponding actions of each person.
  • a high-resolution image of the operative field may be used as the superimposed image B3, thereby allowing the user to recognize the details of the detailed operation of the surgeon.
  • further, CT (Computerized Tomography) images, images of 3D modeling data generated based on CT images, ultrasonic images, measured values such as illuminance, temperature, humidity, pressure, and speed and their graphs, other information necessary for surgery, device operation information (for example, on/off of a heart-lung machine), and the like may be displayed as the superimposed image B2.
  • the image on which the superimposed image is superimposed is not limited to the omnidirectional image, and may be a planar image.
  • the superimposed image management information 224 may be information as shown in FIG. 16, for example.
  • FIG. 16 is a diagram illustrating an example of the superimposed image management information 224.
  • the superimposed image management information 224 in FIG. 16 is data in a table format in which the information of “superimposition target”, “azimuth angle range”, “elevation angle range”, “display position (depth)”, “presence/absence of perspective”, “transmittance”, and “superimposition decoration method” is associated.
  • “Superimposition target” is information indicating the superimposition target; in this example, the name of the superimposed image that is the superimposition target is described. In the example of FIG. 16, only superimposed images are shown as superimposition targets, but things other than images may also be superimposition targets. For example, an annotation related to the omnidirectional image or a superimposed image, a UI (User Interface) menu for operating the display device 200, or the like may be set as the superimposition target.
  • “Azimuth range” and “elevation range” are information indicating the display area to be superimposed.
  • the superimposed image B1 has an azimuth range of ⁇ 70 ° to ⁇ 10 ° and an elevation angle range of 20 ° to 45 °. Therefore, the superimposed image B1 is displayed in a rectangular area having a left azimuth angle of ⁇ 70 °, a right azimuth angle of ⁇ 10 °, a lower end elevation angle of 20 °, and an upper end elevation angle of 45 °.
  • note that the display area of a superimposition target can also be represented by, for example, a combination of a reference position of the superimposition target (such as the lower-left corner of its rectangle), the height of the superimposition target, and the width of the superimposition target.
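The equivalence between the two representations of a display area can be sketched as a simple conversion. The function name and dictionary keys are illustrative assumptions; the example values are those given for the superimposed image B1.

```python
def range_to_anchor(az_min, az_max, el_min, el_max):
    """Convert the azimuth/elevation-range representation into the alternative
    noted above: a reference position (lower-left corner) plus width and height.
    All values are angles in degrees."""
    return {"anchor": (az_min, el_min),   # lower-left corner (azimuth, elevation)
            "width": az_max - az_min,     # azimuth extent
            "height": el_max - el_min}    # elevation extent

# Superimposed image B1: azimuth -70 deg to -10 deg, elevation 20 deg to 45 deg.
rect = range_to_anchor(-70, -10, 20, 45)
```

Either form carries the same information, so the management table could store whichever is more convenient for the drawing code.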
  • Display position (depth) is information indicating the display position in the depth direction to be superimposed.
  • r is the farthest display position (see FIG. 3).
  • “Perspective presence / absence”, “transmittance”, and “superimposition decoration method” are information indicating the display mode of the superimposition target. More specifically, “with / without perspective” is information indicating whether or not to perform perspective display, that is, display by perspective projection. The superimposition target with perspective is displayed three-dimensionally by perspective projection, and the superimposition target without perspective is displayed without using perspective projection. The “transmittance” is information indicating the transmittance to be superimposed. If the transmittance of the superimposition target is greater than zero, the omnidirectional image in the portion where the superimposition target is superimposed can be visually recognized. The “superimposition decoration method” is information indicating whether or not to perform image processing for blurring the outline to be superimposed.
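The "transmittance" semantics above can be illustrated with a simple per-channel blend: at transmittance 0 only the superimposition target is visible, and at values greater than 0 the omnidirectional image behind it shows through. This is a minimal sketch of standard alpha compositing, not the patented rendering path; the function name is an assumption.

```python
def blend(base, overlay, transmittance):
    """Blend one RGB pixel of the omnidirectional image (`base`) with one pixel
    of the superimposition target (`overlay`). `transmittance` is the fraction
    of the background that remains visible, per the table field described above."""
    t = transmittance
    return tuple(round(b * t + o * (1.0 - t)) for b, o in zip(base, overlay))

# Transmittance 0.0: only the superimposition target is visible.
opaque = blend((200, 200, 200), (0, 0, 0), 0.0)
# Transmittance 0.5: the underlying omnidirectional image shows through.
half = blend((200, 200, 200), (0, 0, 0), 0.5)
```

A full renderer would apply this per pixel inside the superimposition target's display area, after any perspective projection selected by the "presence/absence of perspective" field.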
  • the superimposed image management information 224 may be stored in advance in the storage unit 222 or the like as meta information, or may be generated or edited. For example, the omnidirectional image may be displayed in two dimensions, the superimposition position of a superimposed image may be determined by accepting the user's designation of the position on the two-dimensional omnidirectional image, and the determined content may be reflected in the superimposed image management information 224.
  • FIG. 17 is a flowchart illustrating an example of processing when the display device 200 displays a superimposed image. Note that S21 and S22 in FIG. 17 are the same processes as S1 and S2 in FIG.
  • in S23, the superimposition processing unit 221 specifies the display target area in the omnidirectional image 23 from the line-of-sight direction specified by the line-of-sight direction specifying unit 11 and determines whether or not there is a superimposition target to be displayed in the display target area. Specifically, the superimposition processing unit 221 determines that there is a superimposition target if a superimposition target at least part of which is included in the display target area is included in the superimposed image management information 224, and determines that there is no superimposition target otherwise. If it is determined that there is a superimposition target (YES in S23), the process proceeds to S24; if not (NO in S23), the process returns to S21.
  • in S24, the superimposition processing unit 221 acquires the information indicating the superimposition position and display mode of the superimposition target from the superimposed image management information 224. Specifically, the superimposition processing unit 221 acquires the information indicating the azimuth angle range, the elevation angle range, the display position (depth), the presence/absence of perspective, the transmittance, and the superimposition decoration method.
  • in S25, the synthesis unit 15 synthesizes the superimposition target with the display target area portion of the omnidirectional image drawn in S22 and causes the display unit 21 to display the result. At this time, the synthesis unit 15 synthesizes the superimposed image 223 read from the storage unit 222 with the display target area portion of the omnidirectional image at the position and in the mode indicated by the information acquired by the superimposition processing unit 221 in S24. Thereafter, the process returns to S21.
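The S23 visibility test described above amounts to a partial-overlap check between each row's display area and the current display target area. This is an illustrative reconstruction: the dict-based table, row keys, and function name are assumptions (the actual 224 is stored as meta information), while the example row for B1 uses the ranges given in the text.

```python
def targets_in_view(display_area, management_table):
    """S23 sketch: return the rows of the superimposed image management
    information whose display area is at least partly inside the display
    target area."""
    def overlaps(a, b):
        # 1-D closed-interval overlap, applied to azimuth and elevation separately.
        return a[0] <= b[1] and b[0] <= a[1]
    return [row for row in management_table
            if overlaps(row["azimuth"], display_area["azimuth"])
            and overlaps(row["elevation"], display_area["elevation"])]

table = [
    {"target": "B1", "azimuth": (-70, -10), "elevation": (20, 45)},
    {"target": "B2", "azimuth": (100, 140), "elevation": (0, 30)},
]
# A display target area spanning azimuth -45 deg to 45 deg catches B1 but not B2.
visible = targets_in_view({"azimuth": (-45, 45), "elevation": (-30, 30)}, table)
```

Each returned row would then be passed to S24/S25 to fetch its display mode and composite it onto the display target area.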
  • the processing when an instruction area is designated on the superimposed image 223 displayed in this way is the same as in the first embodiment. That is, when the instruction area is specified on the superimposed image 223 on the display device 200 on the instructor side, the display device 200 on the trainee side displays an arrow, unless the instruction area is already displayed, to prompt the trainee to bring the instruction area into view. The same applies when the trainee designates the instruction area.
  • information indicating the superimposed image 223 is stored as the “field of view” information as the instruction information 24 in FIG.
  • when displaying an insertion image, the insertion information 25 as shown in FIG. 8 may be referred to.
  • in the insertion information 25, the superimposed image and the insertion image are associated with each other; therefore, the insertion image corresponding to the pointed superimposed image 223 can be specified by referring to the insertion information 25.
  • Still another embodiment of the present invention will be described below.
  • in the present embodiment, the instructor controls the reproduction of the content on the trainee side.
  • the apparatus configuration in the present embodiment is substantially the same as that in the first embodiment (see FIG. 1).
  • in addition, the display devices 1A and 1B of the present embodiment each include a reproduction control unit that controls the reproduction of content, in order to reproduce content that is a moving image.
  • furthermore, in the present embodiment, the instruction transmission/reception unit (reproduction instruction unit) 13A performs reproduction control of the content on the display device 1B. More specifically, when a reproduction time region of the content different from that of the display device 1B is designated as a reproduction target on the display device 1A, the instruction transmission/reception unit 13A causes the display device 1B to reproduce the content in that reproduction time region.
  • FIG. 18 is a diagram illustrating an example of a display screen when the instructor controls the reproduction of the content on the trainee side.
  • In FIG. 18, the display device 1A on the instructor side displays the display target area A11, and the display device 1B on the trainee side displays the display target area A12.
  • A seek bar H1, which is a UI for controlling the playback position of the content, is displayed in both display target areas A11 and A12.
  • The seek bar H1 includes a pointer H11 indicating the playback position. Moving the pointer H11 to the right advances the playback position, and moving it to the left moves the playback position back.
  • The playback position of the content is the same between the display device 1A on the instructor side and the display device 1B on the trainee side.
  • the instructor can designate an arbitrary reproduction position on the seek bar H1 by an input operation via the input unit 19A.
  • the playback position is designated by moving the pointer J1 on the seek bar H1 to a desired playback position and performing a predetermined confirmation operation.
  • When the playback position is designated, the playback control unit performs playback control to change the playback position of the content, and the display target area A11 at the designated playback position is displayed on the display unit 21A.
  • the instruction transmission / reception unit 13A instructs the display device 1B to execute the reproduction control by the reproduction control unit through communication via the communication unit 20A.
  • In the display device 1B, the playback control unit changes the playback position of the content, and the display target area A12 at the designated playback position is displayed on the display unit 21B.
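  • The playback-synchronization flow above (designation on the instructor side, a local seek, then an instruction to the trainee side) can be sketched as follows. This is a minimal illustration only; the class and method names are assumptions and do not correspond to the actual implementation:

```python
class DisplayDevice:
    """Minimal model of a display device with a playback control unit."""

    def __init__(self, name):
        self.name = name
        self.playback_position = 0.0  # seconds
        self.peers = []               # devices whose playback follows this one

    def link(self, peer):
        """Register a trainee-side device that should follow seek operations."""
        self.peers.append(peer)

    def seek(self, position):
        """Playback control: change the playback position locally."""
        self.playback_position = position

    def designate_playback_position(self, position):
        """Instructor-side designation: seek locally, then instruct peers
        (the role played by the instruction transmission/reception unit 13A)."""
        self.seek(position)
        for peer in self.peers:
            peer.seek(position)  # stands in for communication between devices


instructor = DisplayDevice("1A")
trainee = DisplayDevice("1B")
instructor.link(trainee)

instructor.designate_playback_position(95.0)
print(instructor.playback_position, trainee.playback_position)  # 95.0 95.0
```

  • The same structure extends to multiple trainees: each trainee device registered via `link` follows the instructor's seek.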
  • The pointer J1 may designate the end point or start point of a playback range. In this case, both display devices 1A and 1B play back the content over the playback range, from the playback position immediately before the range was designated to the playback position of the pointer J1.
  • As described above, the trainee's display device 1B shifts the content playback position to the playback position designated by the instructor.
  • the display device 1B of each trainee similarly changes the playback position.
  • Thus, the instructor can give guidance while showing the trainee the image the instructor intends.
  • Playback control such as starting, pausing, stopping, restarting, reverse playback, and frame advance performed on the instructor-side display device 1A may likewise be applied to the trainee-side display device 1B.
  • The instructor may be able to select whether or not to link the playback control. For example, when playback control is performed while the instructor-side display device 1A is set to a mode in which playback control is linked, the content of the playback control may be reflected on the trainee-side display device 1B. On the other hand, when playback control is performed while it is not linked, the content of the playback control may be reflected only on the instructor-side display device 1A.
  • playback may be started from the beginning of the scene corresponding to the designated playback position (so-called scene cueing).
  • For scene cueing, thumbnail images indicating the content of each scene may be displayed around the seek bar H1 (for example, above it), and playback may be started from the beginning of the scene corresponding to the selected thumbnail image.
  • For example, if the content to be played back is medical education content obtained by filming a medical procedure consisting of a plurality of steps, the playback position may be designated in units of medical steps.
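  • Scene cueing as described here amounts to snapping a designated playback position back to the start of the scene that contains it. A minimal sketch follows; the scene boundary times are made up for illustration and are not from the embodiment:

```python
import bisect

def scene_start(scene_starts, position):
    """Return the start time of the scene containing `position`.

    `scene_starts` is a sorted list of scene start times in seconds.
    """
    i = bisect.bisect_right(scene_starts, position) - 1
    return scene_starts[max(i, 0)]

# Hypothetical medical-education content split into procedure steps.
starts = [0.0, 60.0, 300.0, 540.0]
print(scene_start(starts, 95.0))   # 60.0: playback cues to the scene head
print(scene_start(starts, 300.0))  # 300.0: a boundary maps to its own scene
```

  • Designating the playback position "in units of medical steps" then reduces to exposing only these scene start times as selectable positions on the seek bar.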
  • FIG. 19 is a diagram showing an example of a display screen when the playback position of the content is designated on the trainee's display device 1B.
  • In FIG. 19, the display device 1A on the instructor side displays the display target area A11, and the display device 1B on the trainee side displays the display target area A12.
  • The playback position of the content is the same on the display device 1A on the instructor side and the display device 1B on the trainee side.
  • The trainee can designate an arbitrary playback position on the seek bar H1 by an input operation via the input unit 19B.
  • the playback position is designated by moving the pointer J1 on the seek bar H1 to a desired playback position and performing a predetermined confirmation operation.
  • Upon receiving this designation, in the trainee-side display device 1B, the instruction transmission/reception unit 13B notifies the display device 1A of the playback position by communication via the communication unit 20B.
  • In the display device 1A, the instruction display processing unit 14A displays the pointer J2 at the notified position on the seek bar H1. Thereby, the instructor can recognize the playback position designated by the trainee.
  • the playback control unit of the display device 1B may start playback from the position indicated by the pointer J1.
  • The instructor may perform an input operation such as selecting the pointer J2 to change the playback position to the position indicated by the pointer J2. Thereby, guidance can be performed with the instructor and trainee coordinating the playback position.
  • The designation of the playback position by the pointer J1 may be omitted. For example, when the trainee designates an instruction area for the content being played on the display device 1B while the content is being played at a position different from that on the instructor's display device 1A, the pointer J2 indicating the playback position on the trainee side may simply be displayed.
  • The instruction information 24 designating a content playback time region as the instruction area may be, for example, as shown in FIG. 20. FIG. 20 is a diagram illustrating an example of instruction information designating a content playback time region.
  • The instruction information 24 in FIG. 20 differs from the instruction information 24 in FIG. 7 in that the "instruction area" information is "none" and the "detected object" information is not included.
  • The instruction information 24 in FIG. 20 is generated when the instructor or the trainee designates a content playback time zone (playback time region).
  • The top record of the instruction information 24 in FIG. 20 is generated by the instruction receiving unit 12A when the instructor X1, who is viewing the omnidirectional image A1, designates the time zone of 00:01:00 to 00:05:00 of the content including the omnidirectional image A1 by an input operation via the input unit 19A.
  • In the display device 1B, the instruction transmission/reception unit 13B receives this instruction information 24, and the playback control unit plays back the time zone of 00:01:00 to 00:05:00 of the content being played.
  • Similarly, when the trainee Y2 displays the superimposed image B2 on the display device 1B and designates the scene 001 in the superimposed image B2 by an input operation via the input unit 19B, the instruction receiving unit 12B generates the bottom record of the instruction information 24 in FIG. 20.
  • In the display device 1A, the instruction transmission/reception unit 13A receives this instruction information 24, and the instruction display processing unit 14A displays information indicating the scene 001 on the display unit 21A via the combining unit 15A and the like.
  • the information indicating the scene 001 may be information such as the pointer J2 of FIG. 19 displayed on the seek bar H1, for example.
  • For example, the left end position of the pointer J2 may indicate the start position of the scene 001, and the right end position may indicate its end position.
  • In the above, an example has been described in which, when a playback time region is designated, playback is forcibly started from that playback time region. Instead, however, a guidance process that prompts the instructor or trainee to play back the designated playback time region may be performed.
  • For example, the instruction display processing unit 14B may prompt the trainee to play back the designated time region by displaying the pointer J2 of FIG. 19.
  • Alternatively, the trainee may be prompted to play back the playback time region designated by the instructor by displaying a message such as "Please play back scene 001" or by outputting it as audio.
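  • The records of FIG. 20 pair an instruction with a playback time zone rather than a spatial instruction area. A rough model of generating such a record and choosing between forced playback and the guidance process follows; the field names and HH:MM:SS strings mirror the figure, while everything else (function names, the callback interface) is an assumption for illustration:

```python
def make_time_zone_record(user, content_id, start, end):
    """Build an instruction-information record designating a playback
    time zone (the "instruction area" is none, as in FIG. 20)."""
    return {
        "user": user,
        "content": content_id,
        "instruction_area": None,
        "time_zone": (start, end),
    }

def handle_record(record, force_playback, seek, prompt):
    """Either jump straight to the designated zone, or only prompt the
    user to play it back (the guidance-process variant)."""
    start, _end = record["time_zone"]
    if force_playback:
        seek(start)
    else:
        prompt(f"Please play back from {start}")

rec = make_time_zone_record("X1", "A1", "00:01:00", "00:05:00")
handle_record(rec, force_playback=False,
              seek=lambda t: None,
              prompt=print)  # prints: Please play back from 00:01:00
```

  • Switching `force_playback` to `True` corresponds to the forced-start behavior described first; `False` corresponds to the message-based guidance alternative.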
  • FIG. 21 is a diagram showing an overview of a control system 400 according to an embodiment of the present invention.
  • the control system 400 includes a display control device (control device) 41, display devices 42A and 42B, and a distribution server 43. When there is no need to distinguish between the display devices 42A and 42B, they are simply referred to as the display device 42. Further, the control system 400 may include three or more display devices 42.
  • Each component of the control system 400 is communicably connected via a predetermined communication network (for example, the Internet). Therefore, even if the user (instructor or trainee) of each display device 42 is in a remote place away from the other users, the control system 400 can be used as long as each user is in an environment that can connect to the communication network.
  • the display control device 41 controls the display of the display device 42, and the display device 42 displays an image according to the control of the display control device 41.
  • The display control device 41 includes blocks corresponding to, among the blocks illustrated in FIG. 2, the control unit 10A (instruction transmission/reception unit 13A, instruction display processing unit 14A, insertion processing unit 17A) and the storage unit 22A.
  • In the following, the explanation assumes that the display control device 41 includes an instruction transmission/reception unit 13, an instruction display processing unit 14, and an insertion processing unit 17 as blocks corresponding to the instruction transmission/reception unit 13A, the instruction display processing unit 14A, and the insertion processing unit 17A, respectively.
  • The display device 42A includes blocks corresponding to, among the blocks shown in FIG. 2, the control unit 10A (line-of-sight direction specifying unit 11A, instruction receiving unit 12A, instruction transmission/reception unit 13A, instruction display processing unit 14A, combining unit 15A, omnidirectional image drawing unit 16A), the sensor 18A, the input unit 19A, the communication unit 20A, the display unit 21A, and the storage unit 22A.
  • Similarly, the display device 42B includes blocks corresponding to the control unit 10B (line-of-sight direction specifying unit 11B, instruction receiving unit 12B, instruction transmission/reception unit 13B, instruction display processing unit 14B, combining unit 15B, omnidirectional image drawing unit 16B), the sensor 18B, the input unit 19B, the communication unit 20B, the display unit 21B, and the storage unit 22B.
  • the distribution server 43 stores information necessary for image display control, images to be displayed, and the like, and transmits them to the display control device 41 and the display device 42.
  • the distribution server 43 may store the omnidirectional image 23, the insertion image, the superimposed image, the instruction information 24, the insertion information 25, and the superimposed image management information 224.
  • the distribution server 43 may transmit the information to the display control device 41 or the display device 42 in accordance with an instruction from the display control device 41 or the display device 42.
  • the distribution server 43 may transmit the content by streaming.
  • the control system 400 may include a storage device that stores each piece of information described above, instead of the distribution server 43. In this configuration, the display control device 41 and the display device 42 read out and use necessary information from the storage device.
  • In the control system 400, the instructor performs an input operation for designating an instruction area on the input unit 19A of the display device 42A. Then, the instruction receiving unit 12A generates instruction information including the instruction area, and the instruction transmission/reception unit 13A transmits the instruction information to the display control device 41.
  • A flag indicating whether the instruction is made by the instructor or by the trainee is attached to the instruction information so that the display control device 41 can determine which is the case.
  • The instruction information may instead be transmitted to the distribution server 43. In this case, the display control device 41 acquires the instruction information from the distribution server 43.
  • Upon receiving the instruction information, the instruction transmission/reception unit 13 of the display control device 41 specifies the instruction area from the instruction information.
  • The instruction display processing unit 14 then communicates with the display device 42B to determine whether or not the instruction area is included in the display target area of the display device 42B. If it is determined to be included, the instruction display processing unit 14 instructs the display device 42B to display a pointer in the instruction area. On the other hand, if it is determined not to be included, the instruction display processing unit 14 instructs the display device 42B to execute a guidance process that prompts the trainee to display the instruction area. For example, the instruction display processing unit 14 may cause the display device 42B to display the arrow M2 in FIG. 6.
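  • The decision made here (a pointer if the instruction area is on screen, guidance with a direction otherwise) reduces to a containment test between the display target area and the instruction position in the content's coordinate space. The following is a simplified sketch for a flat image that ignores the wrap-around of omnidirectional coordinates; all names and the coordinate convention are assumptions:

```python
def contains(display_area, point):
    """display_area = (left, top, right, bottom); point = (x, y)."""
    l, t, r, b = display_area
    x, y = point
    return l <= x <= r and t <= y <= b

def decide_action(display_area, instruction_point):
    """Return what the display control device should instruct the display
    device to do: show a pointer, or run guidance with an arrow direction."""
    if contains(display_area, instruction_point):
        return ("pointer", instruction_point)
    l, t, r, b = display_area
    x, y = instruction_point
    dx = "right" if x > r else "left" if x < l else ""
    dy = "down" if y > b else "up" if y < t else ""
    return ("guidance", (dy + "-" + dx).strip("-"))

area = (100, 100, 400, 300)
print(decide_action(area, (250, 200)))  # ('pointer', (250, 200))
print(decide_action(area, (500, 50)))   # ('guidance', 'up-right')
```

  • The guidance direction string would drive the choice of arrow (such as the arrow M2) shown to the trainee.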
  • Further, the insertion processing unit 17 of the display control device 41 determines, based on the above flag, that the display device 42A or its user has instructor authority, and performs the insertion processing described in the above embodiments.
  • That is, the insertion processing unit 17 refers to the insertion information to determine whether there is an insertion image to be displayed, and if there is, displays the candidates on the display device 42A. If the insertion information is stored in the distribution server 43, the insertion information is acquired from the distribution server 43. When an insertion image is selected on the display device 42A, the insertion processing unit 17 instructs the display device 42B to display the insertion image selected on the display device 42A.
  • any processing described in the above embodiments can be executed by the control system 400.
  • FIG. 22 is a diagram illustrating an overview of a control system 500 according to the fifth embodiment of the present invention.
  • the control system 500 includes a distribution server 51, a display device (control device) 52A, and a display device 52B. When there is no need to distinguish between the display devices 52A and 52B, they are simply referred to as the display device 52. Further, the control system 500 may include three or more display devices 52.
  • The distribution server 51 and the display devices 52 are communicably connected via a predetermined communication network (for example, the Internet). Therefore, even if the user (instructor or trainee) of each display device 52 is in a remote place away from the other users, the control system 500 can be used as long as each user is in an environment that can connect to the communication network.
  • the distribution server 51 stores information necessary for image display control, an image to be displayed, and the like, as with the distribution server 43 in FIG.
  • one of the plurality of display devices 52 has the same function as the display control device 41 of the fourth embodiment. That is, one of the plurality of display devices 52 controls the operation of the other display devices 52.
  • In the present embodiment, the display device 52A used by the instructor controls the operation of the display device 52B. The display device 52A includes the same processing blocks as the display device 1A shown in FIG. 2.
  • the instructor performs an input operation for designating an instruction area on the input unit 19A of the display device 52A.
  • the instruction receiving unit 12A specifies an instruction area from the input operation, and generates instruction information including the instruction area.
  • Then, the instruction transmission/reception unit 13A communicates with the display device 52B to determine whether or not the instruction area specified by the instruction information is included in the display target area of the display device 52B. If it is determined to be included, the instruction display processing unit 14A instructs the display device 52B to display a pointer in the instruction area. On the other hand, if it is determined not to be included, the instruction display processing unit 14A instructs the display device 52B to execute a guidance process that prompts the trainee to display the instruction area. For example, the instruction display processing unit 14A may cause the display device 52B to display the arrow M2 in FIG. 6.
  • In the above embodiments, an example has been described in which the guidance process that prompts the user (instructor or trainee) to display the instruction area is a process that displays arrow-shaped guidance information. However, the guidance process may be anything that prompts the user to display the instruction area, and is not limited to this example.
  • For example, a message indicating the direction in which to move the line of sight so that the instruction area falls within the display target area, such as "Move your line of sight diagonally to the right," may be used as the guidance information.
  • the user may be prompted to display the instruction area by changing the display mode of the instruction area and other image areas.
  • the display brightness of the indication area may be higher than that of other areas. In this case, the user can easily find the indication area by moving the viewing direction.
  • the guidance process may be a process for outputting a message prompting the user to display the instruction area.
  • the indication area and the display target area may be shifted in the depth direction of the content.
  • This guidance process may be, for example, a process of displaying or outputting a message such as “Please adjust the viewpoint to the back / front of the screen a little more”.
  • Alternatively, it may be a process of displaying a mark indicating the three-dimensional depth direction.
  • The position in the depth direction may be expressed using the radius r of the omnidirectional sphere corresponding to the omnidirectional image, as in the example described above, or may be expressed in another way.
  • The content only needs to be able to display a partial image of a designated display target region out of its entire image region. Accordingly, the content is not limited to an omnidirectional (spherical) image; it may be a hemispherical image, or a flat image (such as a panoramic photograph) whose display size does not fit on one screen of the display devices 1, 42, 52, 200.
  • the guidance process may be executed in a state where the image is enlarged and displayed.
  • the generated instruction information can be used as a content tag.
  • The position indicated by the instruction information is a position to which the instructor wants the trainee to pay attention, or a position about which the trainee wants to ask the instructor. Therefore, the positions indicated by the instruction information may be aggregated, and the aggregation result may be displayed, for example, as a heat map.
  • the content review can be supported by recording or printing a screen shot including the position indicated by the instruction information.
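  • Aggregating instruction positions into a heat map, as suggested above, can be done by binning the designated points over the image region. A minimal sketch follows; the bin size and sample coordinates are arbitrary assumptions:

```python
from collections import Counter

def heat_map(points, bin_size=100):
    """Count instruction positions per bin of the image region."""
    counts = Counter()
    for x, y in points:
        counts[(x // bin_size, y // bin_size)] += 1
    return counts

# Hypothetical positions taken from accumulated instruction information.
positions = [(120, 80), (130, 90), (450, 300), (140, 85)]
hm = heat_map(positions)
print(hm[(1, 0)])  # 3: a hot spot that was repeatedly pointed at
```

  • The per-bin counts can then be rendered as a color overlay on the content, with hot bins marking areas the instructor emphasized or the trainees asked about most.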
  • Further, the instructor's message during guidance may be displayed on the trainee's display device as an annotation.
  • Such a message may be output by voice.
  • The control blocks of the display devices 1, 42, 52, and 200 and of the display control device 41 may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • In the latter case, the display devices 1, 42, 52, 200 and the display control device 41 include a CPU that executes the instructions of a program, which is software realizing each function, a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU), a RAM (Random Access Memory) into which the program is expanded, and the like. The object of one embodiment of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it.
  • As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • one embodiment of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • A control device (display devices 1, 52, 200; display control device 41) according to aspect 1 of the present invention controls the operation of a first display device (display devices 1, 42, 52, 200) that displays a partial image of a designated display target region out of the image region of content, and includes: an area specifying unit (instruction receiving unit 12) that specifies a first instruction area, which is an image area that the user of the first display device is instructed to view; and a guidance unit (instruction display processing unit 14) that executes a guidance process prompting the user to display an image of the first instruction area when the display target area does not include the first instruction area specified by the area specifying unit.
  • the first instruction area may be a part of the content image area, and may be, for example, one point in the image area. The same applies to the second indication area described later.
  • A control device according to aspect 2 of the present invention is the control device according to aspect 1 above, wherein the first display device may display the content together with a second display device (display devices 1, 42, 52, 200) different from the first display device, and the first instruction area may be an area designated on the content displayed by the second display device.
  • the first indication area is an area designated on the content displayed by the second display device. Therefore, it is possible to prevent the user of the first display device from overlooking the first indication area designated on the content displayed by the second display device.
  • A control device according to aspect 3 of the present invention is the control device according to aspect 2 above, which may include an instruction region display processing unit (instruction display processing unit 14) that, when the first instruction area is not included in the display target area even after the guidance process is performed, displays on the first display device the image of the first instruction area displayed on the second display device.
  • According to the above configuration, when the first instruction area is not included in the display target area even after the guidance process is performed, the image of the first instruction area displayed on the second display device is displayed on the first display device. Therefore, even when the user of the first display device does not follow the guidance, the image of the first instruction area can be reliably shown to the user.
  • A control device according to aspect 4 of the present invention is the control device according to aspect 2 or 3 above, which may include: a designation receiving unit (instruction receiving unit 12) that accepts designation of a second instruction area to be viewed by the user of the second display device; and a guidance control unit (instruction transmission/reception unit 13) that, when the second display device is not displaying the second instruction area, causes the second display device to execute a guidance process prompting the user of the second display device to display the second instruction area.
  • According to the above configuration, the designation of the second instruction area is accepted, and when the second display device is not displaying the second instruction area, the second display device is caused to execute a guidance process prompting its user to display the second instruction area. Therefore, the second instruction area can be designated from the first display device side so that the user of the second display device does not overlook it.
  • A control device according to aspect 5 of the present invention is the control device according to aspect 4 above, which may include an insertion processing unit (17) that, when a predetermined authority (instructor authority) is set for the second display device or the user of the second display device and the second display device designates the second instruction area, displays on the first display device related content related to the content selected by the second display device, and that, when the predetermined authority is not set for the first display device or the user of the first display device, does not display the related content on the second display device when the designation receiving unit receives the designation of the second instruction area.
  • According to the above configuration, when the predetermined authority is set, the related content related to the instruction area can be shown to the user. On the other hand, when the predetermined authority is not set, the related content is not displayed, and therefore the user's viewing of the content is not hindered.
  • A control device according to aspect 6 of the present invention is the control device according to any one of aspects 1 to 5 above, wherein the content is a moving image, and when the first instruction area is a display area of a predetermined object in the content and the display position of the object changes, the guidance unit may move the first instruction area following the change in the display position of the object, and may perform the guidance process when the first instruction area after the movement is not included in the display target area.
  • According to the above configuration, when the first instruction area is a display area of a predetermined object in the content and the display position of the object changes, the first instruction area is moved following the change in the display position of the object. Then, it is determined whether or not the first instruction area after the movement is included in the display target area. Therefore, the user can track the moving object.
  • A control device according to aspect 7 of the present invention is the control device according to any one of aspects 1 to 6 above, wherein the content may be content in which at least one second content (superimposed image) related to a first content (first omnidirectional image) is superimposed on the first content.
  • According to the above configuration, the content is content in which at least one second content is superimposed on the first content. Therefore, the user of the first display device can view the first content by designating the display target area on the first content, and can view the second content by designating the display target area on the second content. That is, the user of the first display device can view both of the mutually related first and second contents by changing the display target area. And whichever content the user is viewing, overlooking of the first instruction area can be prevented as described above.
  • A control device according to aspect 8 of the present invention is the control device according to any one of aspects 1 to 7 above, wherein the content is a moving image, and the control device may include a playback instruction unit (instruction transmission/reception unit 13) that, when a playback time region of the content different from the one being played on the first display device is designated as a playback target on the second display device, causes the first display device to play back the content in that playback time region.
  • the content in the playback time area designated as the playback target on the second display device is also played back on the first display device. Therefore, the user of the first display device and the user of the second display device can view the same playback time region of the same content.
  • A head-mounted display according to aspect 9 of the present invention includes the control device according to any one of aspects 1 to 8 above and the first display device. According to this configuration, effects similar to those of aspects 1 to 8 above are obtained.
  • A control method of a control device according to aspect 10 of the present invention is a method of controlling the operation of a first display device (display devices 1, 42, 52, 200) that displays a partial image of a designated display target region out of the image region of content, the method including: an area specifying step (S3) of specifying a first instruction area, which is an image area that the user of the first display device is instructed to view; and a guidance step (S9) of executing a guidance process prompting the user to display an image of the first instruction area when the display target area does not include the first instruction area specified in the area specifying step.
  • The control device according to each aspect of the present invention may be realized by a computer. In this case, a control program for the control device that realizes the control device by the computer by operating the computer as each unit (software element) included in the control device, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
  • Reference Signs List:
    1A, 1B, 52A, 52B, 200: Display device (control device)
    1A, 1B, 42A, 42B, 52A, 52B, 200: Display device (first display device / second display device)
    12A, 12B: Instruction receiving unit (region specifying unit, designation receiving unit)
    13A, 13B: Instruction transmission/reception unit (guidance control unit, reproduction instruction unit)
    14A, 14B: Instruction display processing unit (guidance unit, instruction area display processing unit)
    17A, 17B: Insertion processing unit
    23A, 23B, A1: Spherical image (content)
    D1, D2, D3: Insertion images (related content)
    41: Display control device (control device)
    223, B1, B2, B3: Superimposed image (second content)
    A11, A12: Display target area
    M3: Area (instruction area)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention makes it possible to prevent an image in a prescribed image region from being overlooked. A display device (1A) includes an instruction accepting unit (12A) for identifying a designated region, which is an image region that the user is instructed to view, and an instruction display processing unit (14A) that prompts the user to bring up an image of the designated region when the designated region is not within the region to be displayed.
PCT/JP2017/046642 2017-01-24 2017-12-26 Control device, head mounted display, control method for control device, and control program WO2018139147A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-010315 2017-01-24
JP2017010315 2017-01-24

Publications (1)

Publication Number Publication Date
WO2018139147A1 true WO2018139147A1 (fr) 2018-08-02

Family

ID=62978250

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/046642 WO2018139147A1 (fr) 2017-01-24 2017-12-26 Control device, head mounted display, control method for control device, and control program

Country Status (1)

Country Link
WO (1) WO2018139147A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016002445A1 (fr) * 2014-07-03 2016-01-07 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2016009864A1 (fr) * 2014-07-18 2016-01-21 ソニー株式会社 Dispositif de traitement d'informations, dispositif d'affichage, procédé de traitement d'informations, programme, et système de traitement d'informations

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016002445A1 (fr) * 2014-07-03 2016-01-07 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2016009864A1 (fr) * 2014-07-18 2016-01-21 ソニー株式会社 Dispositif de traitement d'informations, dispositif d'affichage, procédé de traitement d'informations, programme, et système de traitement d'informations

Similar Documents

Publication Publication Date Title
CN114236837B (zh) Systems, methods, and media for displaying interactive augmented reality presentations
EP3019939B1 (fr) Appareil de commande d'affichage et support d'enregistrement lisible par ordinateur
US20190180509A1 (en) Apparatus and associated methods for presentation of first and second virtual-or-augmented reality content
TWI610097B (zh) 電子系統、可攜式顯示裝置及導引裝置
TWI530157B (zh) 多視角影像之顯示系統、方法及其非揮發性電腦可讀取紀錄媒體
WO2018101227A1 (fr) Display control device, head mounted display, control method for display control device, and control program
JP2005038008A (ja) Image processing method and image processing apparatus
JP6126271B1 (ja) Method, program, and recording medium for providing virtual space
US12106678B2 (en) Procedure guidance and training apparatus, methods and systems
US20220375358A1 (en) Class system, viewing terminal, information processing method, and program
JP2021512402A (ja) Multi-viewing virtual reality user interface
US10732706B2 (en) Provision of virtual reality content
US20200077021A1 (en) Image processing apparatus and method, and program
WO2018139073A1 (fr) Display control device, second display device, control method for display control device, and control program
KR102200115B1 (ko) Multi-view 360-degree VR content providing system
JP2017208808A (ja) Method, program, and recording medium for providing virtual space
US11308670B2 (en) Image processing apparatus and method
JP2017207595A (ja) Method, program, and recording medium for providing virtual space
JP7465737B2 (ja) Class system, viewing terminal, information processing method, and program
WO2018139147A1 (fr) Control device, head mounted display, control method for control device, and control program
JP2023140922A (ja) Display terminal, information processing system, communication system, display method, information processing method, communication method, and program
JP2017208809A (ja) Method, program, and recording medium for providing virtual space
JP5647813B2 (ja) Video presentation system, program, and recording medium
JP5172794B2 (ja) Video communication system, method, and program
US20240321237A1 (en) Display terminal, communication system, and method of displaying

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17893989

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17893989

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP