
WO1997036425A1 - Video image processing - Google Patents

Video image processing

Info

Publication number
WO1997036425A1
Authority
WO
WIPO (PCT)
Prior art keywords
signals
input
composite
data
picture
Prior art date
Application number
PCT/GB1997/000641
Other languages
English (en)
Inventor
Gary Dean Burgess
Original Assignee
British Telecommunications Public Limited Company
Priority date
Filing date
Publication date
Priority claimed from GB patent application GB9606511.5 (GB9606511D0)
Application filed by British Telecommunications Public Limited Company
Priority to AU21022/97A (AU2102297A)
Priority to JP9534101A (JP2000507418A)
Publication of WO1997036425A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems

Definitions

  • Videoconferencing can be regarded as a technological substitute for face-to-face meetings.
  • Current technology allows one set of participants to see the other set of participants.
  • In multipoint videoconferencing, current systems generally provide a view of only one other location at a time, owing to cost and technology constraints.
  • A multipoint videoconference is generally controlled by a multipoint control unit (MCU), which processes the audio and video signals from each location separately.
  • The MCU is usually provided as a separate piece of equipment but may form an integral part of one of the participating terminals.
  • The MCU generally provides an open audio-mixing system, where all participants are able to hear all other participants but not themselves.
  • Each terminal is only able to see one other participating terminal, the MCU switching the video from selected terminals to be seen at the other terminals.
  • Various methods for selecting who is seen at a particular terminal are known. Two of the most popular involve selecting the picture automatically from the terminal where someone is speaking, or having a chairperson control which picture is seen by whom.
  • European Patent Application No. 523629 relates to such a multipoint teleconferencing system.
  • A chairperson is located at one of the terminals to control which pictures are viewed by the participants.
  • Each participant receives the same video signal as the other participants for display.
  • European Patent Application No. 642271 describes videoconferencing apparatus in which a multipoint control unit selects every nth field of the incoming video signals to derive a single output signal which is sent to the participants. Again, all participants receive the same video signal.
  • A more desirable approach to multipoint videoconferencing would be to enable participants to be seen and heard at all times during the conference, making a videoconference closer to a real face-to-face meeting.
  • According to the invention, image processing apparatus comprises input means for receiving input signals from n terminals, where n is an integer greater than or equal to 3, each input signal representing frames of a video signal; processing means for forming n composite signals, each representing a different combination of at least two of the input signals; and means for transmitting the composite signals to the relevant terminals.
  • The processing means comprises means for identifying control data in each input signal, means for redefining the control data for inclusion in the composite signals, and means for inserting video data from the input signals into the composite signals.
  • The frame rate of the composite signals may be equal to the highest frame rate of the input signals or to a predetermined fixed rate.
  • The input signals conform to the quarter Common Intermediate Format (QCIF) and the composite signals conform to the Common Intermediate Format (CIF).
  • According to the invention, a method of processing image data from a plurality of terminals comprises receiving the input signals from n terminals, where n is an integer greater than or equal to 3; processing the input signals to form n composite signals representing combinations of at least two input signals, each composite signal being different; and transmitting the composite signals to the relevant terminals.
  • The composite signals may represent combinations of four input signals, the input signals preferably being selected on the basis of at which terminals the most recent speakers are located.
  • The method preferably includes identifying control data in each input signal, redefining the control data for inclusion in the composite signals, and inserting video data from the input signals into the composite signals.
  • Figure 1 shows schematically a multipoint videoconference;
  • Figure 2 shows an area of a video image divided into blocks;
  • Figure 3a shows a macro block consisting of four luminance and two chrominance blocks;
  • Figure 3b shows a group of blocks (GOB);
  • Figure 3c shows the structure of a whole image consisting of twelve groups of blocks according to the Common Intermediate Format (CIF) and three groups of blocks according to quarter CIF (QCIF);
  • Figure 4 shows the framing structure for an H.261 encoded picture;
  • Figure 5 shows the functional elements of apparatus according to the invention;
  • Figure 6 shows schematically a CIF picture formed from four QCIF pictures, according to the invention;
  • Figure 7 shows an example of a look-up table defining the new GOB numbering of video data for each output; and
  • Figure 8 shows the functional elements of an alternative embodiment of apparatus according to the invention.
  • A multipoint videoconference involves at least three locations, a videoconferencing terminal 12 being provided at each location. The locations might be in the same country or spread over a number of countries.
  • A multipoint control unit (MCU) 14 controls the videoconference and performs all the required audio and video mixing and switching and the control signalling.
  • Each terminal 12 is connected to the MCU 14 via broadband digital links such as Integrated Services Digital Network (ISDN) B-channels.
  • Each terminal 12 conforms to the H.261 standard and is capable of transmitting CIF or QCIF pictures. On commencement of a videoconference, all participating terminals signal their capabilities to the MCU, which then signals to the terminals to request the data in QCIF format.
  • Images are divided into blocks 22, as shown in Figure 2, for subsequent processing.
  • The smallest block size is an 8x8 pixel block, but other sized blocks may be employed.
  • A group of four such luminance (Y) blocks, and the two corresponding chrominance (Cb and Cr) blocks that cover the same area at half the luminance resolution, are collectively called a macro block (MB), as shown in Figure 3a.
  • Thirty-three macro blocks, grouped and numbered as shown in Figure 3b, are known as a group of blocks (GOB).
  • The GOBs, grouped and numbered as shown in Figure 3c, form a full CIF or QCIF picture; the picture dimensions this implies are sketched below.
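As an illustration of the hierarchy just described, the following Python sketch derives the picture dimensions implied by the block, macro block and GOB sizes. The function names are our own; only the numeric structure comes from H.261.

```python
# Illustrative sketch of the H.261 spatial hierarchy described above.
BLOCK = 8            # smallest block: 8x8 luminance pixels
MB_SIDE = 2 * BLOCK  # a macro block covers four Y blocks, i.e. 16x16 pixels
GOB_MBS = (11, 3)    # a GOB is 33 macro blocks: 11 columns x 3 rows

def gob_size_pixels():
    """Luminance area covered by one group of blocks (GOB)."""
    cols, rows = GOB_MBS
    return cols * MB_SIDE, rows * MB_SIDE   # 176 x 48 pixels

def picture_size(gob_rows, gob_cols):
    """Picture size for a grid of GOBs (CIF: 6 rows x 2 cols, QCIF: 3 x 1)."""
    w, h = gob_size_pixels()
    return gob_cols * w, gob_rows * h

print(picture_size(6, 2))  # CIF:  (352, 288) from 12 GOBs
print(picture_size(3, 1))  # QCIF: (176, 144) from 3 GOBs
```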
  • The framing structure for a frame of H.261 encoded data is shown in Figure 4.
  • The structure is organised in a series of layers, each containing information relevant to the succeeding layers.
  • The layers are arranged as follows: Picture layer 401; GOB layer 403; MB layer 405; and Block layer 407.
  • Each of the layers has a header.
  • The picture header 402 includes information relating to the picture number of the encoded picture, the type of picture (e.g. whether the picture is intraframe or interframe coded) and Forward Error Correction (FEC) codes.
  • The GOB header 404 includes information relating to the GOB number within the frame and the quantising step size used to code the GOB.
  • The MB header 406 includes information relating to the MB number and the type of the MB (i.e. intra/inter, forward/backward predicted, luminance/chrominance etc.); a simplified sketch of these headers follows.
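To make the layered structure concrete, here is a minimal Python data model of the header fields named above. The field names and types are our own simplification; the real H.261 headers are variable-length bit fields carrying more information than shown.

```python
from dataclasses import dataclass

@dataclass
class PictureHeader:       # picture layer 401, header 402
    picture_number: int    # temporal reference of the encoded picture
    intraframe: bool       # intraframe- vs interframe-coded
    fec_codes: bytes       # forward error correction codes

@dataclass
class GobHeader:           # GOB layer 403, header 404
    group_number: int      # GOB number within the frame
    quant_step: int        # quantising step size used to code the GOB

@dataclass
class MacroBlockHeader:    # MB layer 405, header 406
    mb_number: int         # MB number within the GOB
    mb_type: str           # e.g. "intra", "inter", "forward-predicted"
```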
  • Figure 5 shows apparatus according to the invention for combining four QCIF signals into CIF signals.
  • Each individual terminal 12 participating in the videoconference supplies QCIF H.261-formatted video data to the MCU 14.
  • The apparatus shown in Figure 5 receives five QCIF pictures from the participating terminals and produces CIF signals, each representing a combination of four QCIF pictures in a 2x2 array of QCIF coded pictures.
  • The resulting CIF signals are then transmitted to the appropriate participating terminals 12 for display on a display capable of displaying CIF resolution pictures.
  • The apparatus shown only operates upon the video signals from the terminals 12: the audio, user data information and signalling are controlled in a conventional manner by the host MCU 14.
  • The apparatus comprises five inputs 51a-e for receiving QCIF format signals from five participating terminals 12.
  • Each input signal is input to a forward error correction (FEC) decoder 52a-e, which decodes the FEC codes contained in the picture header 402 of each signal, error-corrects the video data of the signal in a conventional manner and establishes framing lock on each input signal.
  • When framing lock is established, the FEC decoder 52 signals this to a control means 54.
  • The control means 54 may be provided by a microprocessor.
  • The error-corrected QCIF signals are then input to first-in-first-out (FIFO) input buffers 53a-e.
  • The control means 54 searches each contributing error-corrected QCIF signal to identify header codewords (such as the GOB header 404 and the MB header 406). This is achieved by a device 55 which decodes the attributed data in the FEC-corrected QCIF signals output from the input buffers 53.
  • The device 55 comprises a series of comparators (not shown) and a shift register (not shown) of sufficient length to hold the longest codeword.
  • The comparators compare the data as it enters the shift register and, when a codeword is identified, it is forwarded to the control means 54 via a bus 55a.
  • The shift register then performs a serial-to-parallel conversion to organise the input video data into bytes for output via bus 55b and convenient storage in random access memory (RAM) 56. A software analogue of this search is sketched below.
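The header search can be pictured as a sliding window over the bitstream. This sketch scans for the H.261 GOB start code (fifteen 0 bits followed by a 1); it is a software analogue of the comparator and shift-register arrangement of device 55, not the patent's hardware implementation.

```python
GOB_START = 0x0001   # H.261 GOB start code: fifteen 0 bits then a 1
GSC_BITS = 16

def find_gob_headers(bits):
    """Yield bit offsets of GOB start codes in an iterable of 0/1 ints.

    A 16-bit window is shifted one bit at a time (the shift register)
    and compared against the start-code pattern (the comparators).
    """
    window = 0
    for pos, bit in enumerate(bits):
        window = ((window << 1) | bit) & 0xFFFF
        if pos >= GSC_BITS - 1 and window == GOB_START:
            yield pos - GSC_BITS + 1  # offset of the code's first bit

# Example: a start code embedded at bit offset 3 of a dummy stream.
stream = [1, 0, 1] + [0] * 15 + [1] + [1, 1, 0]
print(list(find_gob_headers(stream)))  # -> [3]
```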
  • A suitable device 55 to perform these operations is a Field Programmable Gate Array (FPGA), such as a Xilinx device.
  • Each GOB will thus be reorganised into a number of words (16-bit or 32-bit) having newly assigned byte boundaries, since H.261 signals are not originally organised in bytes.
  • The first and last bytes of the data allocated to a particular GOB will in general contain some bits not belonging to that GOB; these bytes are therefore marked to state the number of valid bits that they contain, as sketched below.
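A minimal sketch of this repacking, assuming the GOB's position in the stream is known as a bit offset and a length. The function and its return convention are our own; the two valid-bit counts correspond to the VFByte and VLByte codes introduced below.

```python
def pack_gob(bits, start, length):
    """Pack a bit-aligned GOB into whole bytes.

    Returns (data, valid_first, valid_last): a byte-aligned copy of the
    GOB plus the number of valid bits in its first and last bytes.
    `bits` is a sequence of 0/1 ints; `start` and `length` locate the
    GOB within the stream.
    """
    first_byte = start // 8
    last_byte = (start + length - 1) // 8
    valid_first = 8 - (start % 8)               # valid bits, first byte
    valid_last = (start + length - 1) % 8 + 1   # valid bits, last byte
    data = bytearray()
    for b in range(first_byte, last_byte + 1):
        chunk = list(bits[b * 8:(b + 1) * 8])
        chunk += [0] * (8 - len(chunk))         # zero-pad a short tail
        byte = 0
        for bit in chunk:
            byte = (byte << 1) | bit
        data.append(byte)
    return bytes(data), valid_first, valid_last

# A GOB starting at bit 5 with length 20 spans four bytes; only 3 bits
# of the first byte and 1 bit of the last byte belong to it.
data, vf, vl = pack_gob([1] * 40, start=5, length=20)
print(len(data), vf, vl)  # -> 4 3 1
```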
  • The control means 54 monitors the status of the data content of the individual input buffers 53a-e via an input control device 60 (such as an FPGA) to ensure that there is no overflow or underflow of data in the buffers.
  • The video data of each GOB is allocated to a portion of random access memory (RAM) 56. Since intra- or inter-frame coding may be used in H.261, the amount of video data within a GOB may vary significantly.
  • Each GOB is therefore allocated to a portion of RAM of sufficient capacity to hold the largest possible GOB allowed under H.261.
  • The GOBs for a particular QCIF picture (which contains three GOBs) are logically grouped together in the RAM.
  • Various codes associated with each GOB are also stored in the RAM. These codes relate to: the source of the data (i.e. from which terminal 12 the video originated); the picture number (PIC) of the current picture held in RAM from a particular source; the original group number (OGN) (1, 2, 3) of the GOB in a particular PIC; the number of bytes (Nbyte) in the GOB; the valid data content (VFByte) of the first byte in a GOB; and the valid data content (VLByte) of the last byte in a GOB.
  • Also associated with each GOB are a number of pointers locating the positions of headers within the frame. These are used, for example, to locate the OGN codeword position for editing purposes prior to compilation of the video data to form a CIF format signal. One plausible layout for such a record is sketched below.
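Gathering the codes just listed into a single record gives one plausible layout, sketched below. The patent stores these codes in dedicated areas of RAM 56 rather than as language-level objects, so this is an illustration only.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GobRecord:
    """One stored GOB plus the bookkeeping codes listed above."""
    source: int    # terminal 12 from which the video originated
    pic: int       # picture number (PIC) of the current picture
    ogn: int       # original group number (1, 2, 3) within the QCIF picture
    nbyte: int     # number of bytes in the GOB (Nbyte)
    vfbyte: int    # valid bits in the first byte (VFByte)
    vlbyte: int    # valid bits in the last byte (VLByte)
    # Pointers locating header positions (e.g. the OGN codeword) for editing:
    header_offsets: List[int] = field(default_factory=list)
    data: bytes = b""   # the byte-aligned GOB video data itself
```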
  • Each new CIF picture-data sequence is compiled from the original individual constituent QCIF picture data stored in the RAM 56 as follows: assign an appropriate CIF picture header for the output CIF frame, which is output ahead of the GOBs of data; then edit each GOB header code to conform to the new position of each GOB in the CIF structure required for the given output to which the data is to be sent. A sketch of such a renumbering table follows.
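Figure 7's look-up table can be pictured as a map from (output quadrant, original group number) to the new CIF group number. The numbering below assumes the standard H.261 CIF GOB layout, with odd group numbers down the left column and even ones down the right; the exact table of Figure 7 is not reproduced in the text, so this mapping is an assumption.

```python
# Renumbering of QCIF GOBs (OGN 1-3) into the 12-GOB CIF array, for each
# quadrant of the 2x2 composite picture.
NEW_GN = {
    0: {1: 1, 2: 3, 3: 5},    # top-left quadrant
    1: {1: 2, 2: 4, 3: 6},    # top-right quadrant
    2: {1: 7, 2: 9, 3: 11},   # bottom-left quadrant
    3: {1: 8, 2: 10, 3: 12},  # bottom-right quadrant
}

def renumber(quadrant, ogn):
    """New CIF group number for a QCIF GOB placed in `quadrant`."""
    return NEW_GN[quadrant][ogn]

assert renumber(3, 2) == 10  # bottom-right picture, middle GOB
```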
  • Each portion of the RAM is polled by the control means 54 at the highest allowed H.261 picture rate, approximately 30 Hz.
  • When a complete frame of data for an individual QCIF signal from a terminal 12 is available, it is transferred to an output data FIFO 57; where no new frame is available, an empty GOB of data (i.e. just a header) is transferred instead.
  • The control means 54 monitors the status of the individual areas of the RAM to ensure that the above procedure is followed.
  • Each CIF frame output is built up by transferring one GOB at a time to each output buffer 57 in turn, before returning to the first to start again.
  • Data for several outputs will tend to require the same input picture data at any one time in the CIF compilation sequence, allowing a large degree of parallelism to be employed in the data transfer.
  • The RAM 56 is of sufficient capacity to store a sequence of several QCIF frames of data from any single source if required, although in normal operation on average only two QCIF frames of data are required. Once an area of RAM has been transferred to all of the required output buffers 57a-e, the area is made available for storing a new QCIF frame. New MB address stuffing codes are omitted or inserted to control the output data-rate to comply with H.261 for a CIF picture.
  • The output buffers 57 buffer the data being assembled from the original QCIF input signals.
  • The output buffers 57 are of sufficient capacity to allow loading of data to take place without overflow, whilst providing the FEC encoders 58 with data when requested without underflow.
  • The flow of data into the buffers 57, through the FEC encoders 58 and out of the buffers 59 is controlled by an output control 62, which may also be an FPGA device.
  • The forward-error-corrected signals output from the encoders 58 are input to CIF output buffers 59a-e, which buffer the CIF signals for transmission to the relevant participating terminals 12.
  • Each of the individual terminals 12 participating in a conference is autonomous. This means that there will tend to be different and varying amounts of information within each individual QCIF-coded picture; each terminal 12 will be operating at slightly different picture-rate tolerances (±50 ppm); and each terminal 12 can produce a different picture-rate (through picture dropping).
  • The last item potentially creates the biggest problem.
  • The possible options and alternatives available when combining pictures at different frame-rates into a larger picture of one frame-rate are discussed below.
  • The combined CIF picture is compiled from a maximum of four contributing QCIF pictures. If different picture rates are used by the different QCIF picture feeds, then the combined CIF picture may be formed, for instance, either by using the highest QCIF picture rate present or by using a fixed pre-determined rate.
  • In the first case, this rate may vary dynamically with the changing scene content encoded by each participating terminal 12. It is possible to keep track of the highest current picture-rate and to modify the CIF output frame-rate accordingly.
  • In the second case, the highest picture-rate possible (29.97 Hz), or some other pre-determined rate, is used to set the CIF output frame-rate.
  • The individual QCIF picture-rates would then not be used to determine the output rate.
  • This option is slightly more wasteful of data-capacity than the previous option, requiring a larger 'overhead', but simplifies the operation of the apparatus and potentially allows for the use of each individual Temporal Reference (TR) code of an H.261 format signal.
  • The TR code can be used to determine the relative temporal position of each QCIF picture within a sequence of CIF frames, possibly leading to enhanced rendition of motion when displayed. It may well be that one or more of the terminals 12 can only receive pictures at a particular lower rate. In this case that lower rate will set the limit on the maximum allowable pre-determined CIF picture-rate for all participants, the controlling MCU 14 signalling this to all the participating terminals.
  • The MCU can also impose a maximum picture-rate on the contributing incoming feeds if necessary. A sketch of this rate-selection logic follows.
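A compact sketch of the two rate-selection options just described, including the cap imposed by the slowest-receiving terminal. The function and parameter names are assumptions for illustration.

```python
H261_MAX_RATE = 29.97  # Hz, the highest H.261 picture-rate

def select_cif_rate(input_rates, fixed_rate=None, receiver_max=None):
    """Choose the output CIF picture-rate.

    Option 1: track the highest current input rate (fixed_rate is None).
    Option 2: use a pre-determined fixed rate.
    Either way the result is capped by the slowest receiver's limit and
    by the H.261 maximum.
    """
    rate = fixed_rate if fixed_rate is not None else max(input_rates)
    if receiver_max is not None:
        rate = min(rate, receiver_max)
    return min(rate, H261_MAX_RATE)

print(select_cif_rate([7.5, 15.0, 29.97, 10.0]))   # -> 29.97
print(select_cif_rate([7.5, 15.0], fixed_rate=29.97,
                      receiver_max=15.0))          # -> 15.0
```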
  • The newly formed CIF format signals have a mean data-rate that is the sum of the data rates of the constituent QCIF pictures, plus an additional 'overhead' capacity to cater for combining pictures with different picture-rates (as discussed above).
  • Each CIF frame must contain all of the constituent GOB headers, even for any omitted data.
  • A proportionally higher data rate will be required on the output CIF channel, depending upon the picture-rate disparities between the incoming QCIF feeds. The following is an estimation of a 'worst-case' scenario, to determine the overhead required.
  • Each of the terminals 12 may allocate a different channel capacity (R) to video data for transmission to the MCU.
  • The image processor of the invention in the MCU produces a combined CIF coded video signal for transmission at the highest allowed picture-rate for the call. If no constraints are set, this will be 30 Hz (in fact 29.97 Hz ±50 ppm); constraints can be sent from the MCU 14 (using, for example, H.221 format signalling) to lower this to, say, 15, 10 or 7.5 Hz if desired or required.
  • The down-link (from MCU to terminal 12) channel capacities required are therefore the sum of the four QCIF capacities which go to form the new CIF pictures, plus the overhead, audio, data, a frame alignment signal and a bit allocation signal; a worked illustration follows.
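As a worked illustration with hypothetical numbers: the per-feed video rates, the 10% overhead and the audio, data and signalling allowances below are all assumptions for the example, not values from the patent.

```python
# Hypothetical down-link capacity estimate; all input figures are
# illustrative, as the patent does not specify them.
qcif_video_rates = [48_000, 48_000, 64_000, 40_000]  # bit/s per QCIF feed
overhead_fraction = 0.10   # allowance for picture-rate disparities (assumed)
audio = 16_000             # bit/s (assumed)
user_data = 8_000          # bit/s (assumed)
fas_bas = 1_600            # frame alignment + bit allocation signals (assumed)

video = sum(qcif_video_rates)
downlink = video * (1 + overhead_fraction) + audio + user_data + fas_bas
print(f"required down-link capacity ~ {downlink / 1000:.1f} kbit/s")  # 245.6
```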
  • Each incoming H.261 coded QCIF picture is autonomous, with its own unique data structure.
  • The internal structure is edited layer by layer as follows:
  • Picture Layer: The individual constituent QCIF macro blocks are assigned a location in the new CIF array of macro blocks. A new Picture Layer Picture Start Code (PSC) is assigned to conform to the new CIF format, and a flag is set which defines the source format (0: QCIF, 1: CIF) to declare CIF for the coded pictures output. The Temporal Reference (TR) code could be taken as that of one of the contributing QCIF pictures, 'averaged' from all of the contributions, or used to temporally locate each QCIF segment of data in the new CIF frame.
  • GOB Layer: The Group Number (GN) in each GOB header is edited to conform to the new position of the GOB in the CIF structure, as defined by the look-up table of Figure 7.
  • MB Layer: A macro block address stuffing (MBA stuffing) code word is available and may be employed for 'padding out' the data content if desired.
  • Figure 6 shows the resulting CIF pictures for a videoconference including five terminals.
  • Each CIF picture is formed from four QCIF pictures.
  • The last CIF picture in Figure 6 represents a combination of the QCIF signals from terminals 1, 2, 3 and 4 and is transmitted from the MCU to terminal 5.
  • Terminal 5 will therefore display a composite image composed of the images from all four other participating terminals 12.
  • The image processor of the invention can produce a CIF picture from one, two, three or four QCIF pictures. This method could also be used to combine CIF formatted pictures into "Multiple-CIF" formats (e.g. to combine four CIF images into one composite signal) to produce higher resolution pictures. It could similarly be used, with only minor changes, to combine MPEG (H.262) pictures into multiple pictures.
  • The location information contained in the H.261 data headers may be edited to position individual picture segments anywhere within the available display field, as desired. This can be used to produce a subjectively more pleasing arrangement of contributing QCIF pictures when there are fewer than four participants being displayed. For example, if the final CIF picture is compiled from only two contributing QCIF pictures, as would be the case in a three-way conference, then it may be subjectively better to arrange the two pictures, say, side by side in the middle of the screen rather than in any of the corners. This can easily be achieved by re-numbering the constituent GOBs for each QCIF picture to occupy, for example, positions 3, 5, 7 and 4, 6, 8 in the CIF array; a sketch of such a placement map follows. Alternatively, the images may be placed on top of each other, at the top of the display, etc.
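Echoing the three-way example above, a placement map for two pictures centred side by side might look like the sketch below; only the 3, 5, 7 and 4, 6, 8 positions come from the text, the code around them is our own.

```python
# Centred side-by-side layout for two QCIF pictures in a CIF frame,
# using the GOB positions given in the text. GOB positions not listed
# would be sent as empty (header-only) groups.
TWO_WAY_LAYOUT = {
    "left picture":  {1: 3, 2: 5, 3: 7},   # OGN -> new CIF GN
    "right picture": {1: 4, 2: 6, 3: 8},
}

def placed_gns(layout):
    """All CIF group numbers occupied by visible pictures."""
    return sorted(gn for m in layout.values() for gn in m.values())

print(placed_gns(TWO_WAY_LAYOUT))  # -> [3, 4, 5, 6, 7, 8]
```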
  • In an alternative embodiment, a composite signal may be generated which represents more than four QCIF pictures.
  • In this case the resolution of a user's screen is 352 pixels by 288 lines, and each participating terminal transmits a full-resolution (i.e. 352 x 288) picture to a central image processing apparatus according to the invention.
  • A pre-processor 80 (as shown in Figure 8) then pre-processes each incoming signal to reduce its resolution by 50% in each dimension, as sketched below. (In Figure 8, like elements are indicated by the same reference numerals as in Figure 5.)
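The 50% reduction in each dimension could be performed, for example, by averaging 2x2 pixel blocks, as in the sketch below. The patent does not specify the pre-processing filter, so the averaging is an assumption.

```python
def halve_resolution(frame):
    """Reduce a luminance frame by 50% in each dimension.

    `frame` is a list of rows of pixel values (e.g. 288 rows of 352
    columns in, 144 rows of 176 columns out). Simple 2x2 averaging is
    assumed here; the patent leaves the filter unspecified.
    """
    out = []
    for y in range(0, len(frame) - 1, 2):
        row = []
        for x in range(0, len(frame[y]) - 1, 2):
            row.append((frame[y][x] + frame[y][x + 1]
                        + frame[y + 1][x] + frame[y + 1][x + 1]) // 4)
        out.append(row)
    return out

# A 352x288 input yields a 176x144 (QCIF-sized) output.
src = [[128] * 352 for _ in range(288)]
dst = halve_resolution(src)
print(len(dst[0]), len(dst))  # -> 176 144
```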

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention relates to image processing apparatus comprising input means (51) for receiving input signals from n videoconferencing terminals, where n is an integer greater than or equal to 3, each input signal representing frames of a video signal. It also relates to processing means for forming n composite signals, each representing different combinations of at least two of the input signals, and to means for transmitting the composite signals to the relevant videoconferencing terminal.
PCT/GB1997/000641 1996-03-28 1997-03-07 Video image processing WO1997036425A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU21022/97A AU2102297A (en) 1996-03-28 1997-03-07 Video processing
JP9534101A JP2000507418A (ja) 1996-03-28 1997-03-07 Video processing

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GBGB9606511.5A GB9606511D0 (en) 1996-03-28 1996-03-28 Video processing
GB9606511.5 1996-03-28
EP96302148 1996-03-28
EP96302148.0 1996-03-28

Publications (1)

Publication Number Publication Date
WO1997036425A1 (fr)

Family

ID=26143636

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1997/000641 WO1997036425A1 (fr) Video image processing

Country Status (3)

Country Link
JP (1) JP2000507418A (fr)
AU (1) AU2102297A (fr)
WO (1) WO1997036425A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100548383B1 (ko) 2003-07-18 2006-02-02 LG Electronics Inc. Digital video signal processing apparatus and method for a mobile communication system
JP2024120350A (ja) * 2023-02-24 2024-09-05 KDDI Agile Development Center Corporation Data processing apparatus, data processing method and data processing system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0523629A1 * 1991-07-15 1993-01-20 Hitachi, Ltd. Multipoint teleconference system employing H.221 frames
EP0642271A1 * 1993-09-03 1995-03-08 International Business Machines Corporation Video communication apparatus
EP0669765A2 * 1994-02-25 1995-08-30 AT&T Corp. Multipoint digital video communication system

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2828055A1 * 2001-07-27 2003-01-31 Thomson Licensing Sa Method and device for coding a mosaic of images
WO2003013145A1 * 2001-07-27 2003-02-13 Thomson Licensing S.A. Method and device for coding a mosaic of images
US8160159B2 2001-07-27 2012-04-17 Thomson Licensing Method and device for coding a mosaic
WO2003026300A1 * 2001-09-19 2003-03-27 Bellsouth Intellectual Property Corporation Minimal decoding method for spatially multiplexing digital video pictures
US6956600B1 (en) 2001-09-19 2005-10-18 Bellsouth Intellectual Property Corporation Minimal decoding method for spatially multiplexing digital video pictures
US7518630B2 (en) 2001-09-19 2009-04-14 At&T Intellectual Property I, L.P. Minimal decoding method for spatially multiplexing digital video pictures
US8872881B2 (en) 2001-09-19 2014-10-28 At&T Intellectual Property I, L.P. Minimal decoding method for spatially multiplexing digital video pictures
US9554165B2 (en) 2001-09-19 2017-01-24 At&T Intellectual Property I, L.P. Minimal decoding method for spatially multiplexing digital video pictures

Also Published As

Publication number Publication date
JP2000507418A (ja) 2000-06-13
AU2102297A (en) 1997-10-17

Similar Documents

Publication Publication Date Title
US5453780A (en) Continous presence video signal combiner
US5764277A (en) Group-of-block based video signal combining for multipoint continuous presence video conferencing
US7646736B2 (en) Video conferencing system
US5684527A (en) Adaptively controlled multipoint videoconferencing system
US6285661B1 (en) Low delay real time digital video mixing for multipoint video conferencing
US6535240B2 (en) Method and apparatus for continuously receiving frames from a plurality of video channels and for alternately continuously transmitting to each of a plurality of participants in a video conference individual frames containing information concerning each of said video channels
CA2159846C (fr) Adaptation des debits de transmission video dans les systemes de communication multimedia
US5838664A (en) Video teleconferencing system with digital transcoding
US5600646A (en) Video teleconferencing system with digital transcoding
CA2159847C (fr) Composition d'images a domaines codes pour les systemes de communicationmultimedia
EP1683356B1 (fr) Composeur de medias distribues en temps reel
CA2140849C (fr) Systeme de communication video numerique multipoint
US7245660B2 (en) Method and an apparatus for mixing compressed video
AU2002355089A1 (en) Method and apparatus for continuously receiving frames from a pluarlity of video channels and for alternatively continuously transmitting to each of a plurality of participants in a video conference individual frames containing information concerning each of said video channels
US7720157B2 (en) Arrangement and method for generating CP images
WO1997036425A1 (fr) Video image processing
KR100194976B1 (ko) Bitstream editing apparatus
Pao et al. Multipoint Videoconferencing
WO1997003522A1 (fr) Videoconferencing

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN YU AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA