WO2020050016A1 - Decoding device and method, encoding device and method, and program - Google Patents
Decoding device and method, encoding device and method, and program
- Publication number
- WO2020050016A1 PCT/JP2019/032539
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- motion
- frame
- information
- unit
- current frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/105—Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/42—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
- H04N19/436—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation using parallelised computational arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
- H04N19/513—Processing of motion vectors
- H04N19/517—Processing of motion vectors by encoding
- H04N19/52—Processing of motion vectors by encoding by predictive encoding
Definitions
- the generation of the Interpolated MV Field causes an increase in encoding/decoding processing time.
- a group of seven frames on the left indicates the processing of the first current frame.
- the group of seven frames on the right side shows the processing of the second current frame to be processed next to the first current frame.
- FIG. 3 is a flowchart illustrating the decoding process.
- a CU is set by dividing a picture into CTUs (Coding Tree Units) of a fixed size and dividing each CTU in the horizontal and vertical directions an arbitrary number of times.
- the maximum size of a CU is the LCU (Largest Coding Unit), and the minimum size is the SCU (Smallest Coding Unit).
- the CU is divided into PUs (Prediction Units) and TUs (Transform Units).
- the sizes of the CU, PU, and TU may be the same.
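To make the partitioning concrete, here is a minimal, non-normative sketch; the 64x64 LCU, 8x8 SCU, and the split rule are illustrative assumptions, not values taken from this publication.

```python
# Minimal sketch of quadtree partitioning of a CTU into CUs.
# LCU/SCU sizes and the split decision are illustrative assumptions.
LCU_SIZE = 64   # largest coding unit size (assumed)
SCU_SIZE = 8    # smallest coding unit size (assumed)

def split_ctu(x, y, size, should_split):
    """Recursively split a CTU; returns a list of (x, y, size) CUs."""
    if size > SCU_SIZE and should_split(x, y, size):
        half = size // 2
        cus = []
        for dy in (0, half):
            for dx in (0, half):
                cus.extend(split_ctu(x + dx, y + dy, half, should_split))
        return cus
    return [(x, y, size)]

# Example: split any block larger than 32 samples exactly once.
print(split_ctu(0, 0, LCU_SIZE, lambda x, y, s: s > 32))  # four 32x32 CUs
```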
- In step S22, a motion compensation process is performed with reference to the motion information map of the current frame, and a predicted image is generated.
- the reference frame list for motion compensation of the current frame becomes available only after decoding of the slice header for each slice ends.
- the closest one to the current frame in the display order means a frame whose display order is closest to the current frame.
- the plurality of nearest frames for which an upper limit is set in the display order means frames up to a predetermined number counted from the frame closest to the current frame in display order.
- the predetermined number is a number greater than or equal to zero, and includes one.
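A minimal sketch of this selection, assuming the display order is given by POC values and treating the upper-limit count as an illustrative parameter:

```python
# Sketch: choose up to `limit` already-decoded reference frames whose
# display order (POC) is closest to the current frame. Values are assumed.
def nearest_decoded_refs(current_poc, decoded_pocs, limit):
    return sorted(decoded_pocs, key=lambda poc: abs(poc - current_poc))[:limit]

# Current frame at POC 5; frames with POC 0, 2, 4 and 8 are already decoded.
print(nearest_decoded_refs(5, [0, 2, 4, 8], limit=2))  # -> [4, 2]
```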
- the prediction mode information is composed of intra prediction mode information indicating the type of intra prediction, and inter prediction mode information indicating the size of the current block.
- the inter prediction mode includes an inter prediction mode in which differential motion information is encoded and a merge mode in which differential motion information is not encoded.
- the lossless decoding unit 112 outputs the intra prediction mode information or the inter prediction mode information to the switch 125.
- the lossless decoding unit 112 outputs the offset filter information to the filter 116.
- the IMVF generation unit 123 assigns the extended motion vector as a motion candidate to a block at a passing position in the referenced frame through which the motion vector obtained by extending the current motion information passes. Thereby, the Interpolated MV Field of the referenced frame is generated. The IMVF generation unit 123 outputs the Interpolated MV Field of the referenced frame to the motion vector generation unit 122.
- the Interpolated MV Field is used as an example of a motion candidate in the merge mode or the inter prediction mode.
- the generation of the Interpolated MV Field and the motion compensation of the next block are performed in parallel with the start of the data decoding of the current block.
- the data decoding of the next block is performed in parallel with the start of the other data decoding of the current block.
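One way to picture this pipelining is the hedged sketch below; the thread pool, stage functions, and two-stage split are assumptions for illustration, not the publication's implementation.

```python
# Sketch: while the current block's data decoding runs, the IMVF generation
# and motion compensation for the next block are started in parallel.
from concurrent.futures import ThreadPoolExecutor

def motion_stage(block):
    # generate Interpolated MV Field candidates and motion-compensate `block`
    return f"predicted image for block {block}"

def data_decoding_stage(block, predicted):
    # inverse quantization, inverse transform and reconstruction for `block`
    return f"reconstructed block {block} using {predicted}"

def decode_blocks(blocks):
    with ThreadPoolExecutor(max_workers=2) as pool:
        pending = pool.submit(motion_stage, blocks[0])
        for i, block in enumerate(blocks):
            predicted = pending.result()
            if i + 1 < len(blocks):
                # kick off motion work for the next block before decoding this one
                pending = pool.submit(motion_stage, blocks[i + 1])
            yield data_decoding_stage(block, predicted)

for line in decode_blocks([0, 1, 2]):
    print(line)
```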
- In step S121, the motion vector generation unit 122 generates CU motion information with reference to the motion vector information of the decoded peripheral blocks or the Interpolated MV Field of the current frame.
- In step S124, a data decoding process other than the motion information generation process and the motion compensation process is performed using the predicted image.
- the data decoding process will be described later with reference to FIG.
- In step S124, the image obtained by adding the residual information subjected to the inverse quantization and the inverse orthogonal transform to the predicted image is subjected to a filtering process and rearranged into the display order. D/A conversion is performed on the rearranged images, which are output to a display device (not shown) or the like.
- In step S141, the IMVF generation unit 123 determines whether decoding (or encoding) of a Neighbor frame has not been performed.
- Neighbor frames mean frames to be referred to, and have an upper limit set.
- If it is determined in step S141 that the Neighbor frame decoding process has not been performed, the process proceeds to step S142.
- the distance nA is represented as POC_c - POC_r. Further, the distance nB is expressed as POC_n - POC_c.
- the Interpolated MV Field for the referenced frame is generated based on the current motion information.
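Using the distances nA and nB above, the following hedged sketch shows how a current motion vector might be extended and assigned to the block it passes through in the referenced frame; the 4x4 block grid and the linear scaling formula are assumptions rather than the publication's exact rule.

```python
# Sketch: scale the current block's motion vector by nB/nA and assign the
# extended vector to the block it passes through in the referenced frame.
BLOCK = 4  # granularity of the Interpolated MV Field (assumed)

def extend_and_assign(imvf, pos, mv, poc_c, poc_r, poc_n):
    nA = poc_c - poc_r          # distance from current frame to its reference
    nB = poc_n - poc_c          # distance from current frame to the referenced frame
    scale = nB / nA
    ext_mv = (mv[0] * scale, mv[1] * scale)
    # Center of the current block, displaced by the extended vector,
    # identifies the passing position in the referenced frame.
    cx = pos[0] + BLOCK / 2 + ext_mv[0]
    cy = pos[1] + BLOCK / 2 + ext_mv[1]
    passed = (int(cx) // BLOCK, int(cy) // BLOCK)
    imvf.setdefault(passed, ext_mv)   # keep only the first candidate per block
    return imvf

imvf = extend_and_assign({}, pos=(40, 24), mv=(8.0, -4.0), poc_c=4, poc_r=0, poc_n=8)
print(imvf)   # {(12, 5): (8.0, -4.0)}
```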
- In step S162, the inverse orthogonal transform unit 114 performs an inverse orthogonal transform on the orthogonal transform coefficient supplied from the inverse quantization unit 113, and outputs the inverse orthogonal transformed residual information to the adding unit 115.
- In step S167, the screen rearrangement buffer 117 stores the frames supplied from the filter 116.
- the screen rearrangement buffer 117 rearranges the frames, which are stored in the processing order used for encoding, into the display order, and outputs them to the D/A converter 118.
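A minimal sketch of that rearrangement, using the processing-order/picture-type/display-order labels explained in the items below (the nine-frame example mirrors the figures; nothing beyond those labels is taken from the publication):

```python
# Sketch: the rearrangement buffer receives frames in decoding/processing
# order and emits them in display order. A label like "1P8" means
# processing order 1, P picture, display order 8.
decoded_in_processing_order = ["0P0", "1P8", "2B4", "3B2", "4B6", "5B1", "6B3", "7B5", "8B7"]

def display_order(label):
    return int(label[2:])   # trailing digits give the display order

for frame in sorted(decoded_in_processing_order, key=display_order):
    print(frame)   # 0P0, 5B1, 3B2, 6B3, 2B4, 7B5, 4B6, 8B7, 1P8
```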
- the “0P0” frame is a P-picture frame whose processing order is 0 and whose display order is 0.
- the “0P0” frame refers to the left frame.
- the “1P8” frame is a P-picture frame whose processing order is first and whose display order is eighth.
- the “1P8” frame refers to the “0P0” frame.
- the “7B5” frame is a B-picture frame whose processing order is seventh and whose display order is fifth.
- the “7B5” frame refers to the “2B4” frame and the “1P8” frame.
- the “8B7” frame is a frame of a B picture whose processing order is eighth and whose display order is seventh.
- the “8B7” frame refers to the “4B6” frame and the “1P8” frame.
- the IMVF generation unit 123 generates the current motion information of the “3B2” frame by using the generated forward prediction and backward prediction Interpolated MV Fields of the “3B2” frame, as indicated by the dashed arrow.
- the Interpolated MV Field of the forward prediction of the “3B2” frame is information generated after the generation processing of the current motion information of the “0P0” frame described above with reference to FIG.
- the Interpolated MV Field of the backward prediction of the “3B2” frame is information generated after the generation processing of the current motion information of the “2B4” frame described above with reference to FIG.
- after generating the current motion information of the “3B2” frame, the IMVF generation unit 123 extends that motion information toward the backward-prediction “5B1” frame and determines where in the “5B1” frame the extended motion candidate passes.
- the IMVF generation unit 123 generates the Interpolated MV Field of the backward-prediction “5B1” frame by assigning the extended motion candidate to the block it passes through in the “5B1” frame.
- the IMVF generation unit 123 extends the current motion information of the “4B6” frame toward the forward-prediction “8B7” frame and determines where in the “8B7” frame the extended motion candidate passes.
- the IMVF generation unit 123 generates the Interpolated MV Field of the forward-prediction “8B7” frame by assigning the extended motion candidate to the block it passes through in the “8B7” frame.
- the IMVF generation unit 123 generates the current motion information of the “5B1” frame by using the generated forward prediction and backward prediction Interpolated MV Field of the “5B1” frame, as indicated by the dashed arrow.
- the Interpolated MV Field of the forward prediction of the “5B1” frame is information generated after the generation processing of the current motion information of the “0P0” frame described above with reference to FIG.
- the Interpolated MV Field of the backward prediction of the “5B1” frame is information generated after the generation processing of the current motion information of the “3B2” frame described above with reference to FIG.
- the IMVF generation unit 123 extends the current motion information of the “5B1” frame toward the forward-prediction “6B3” frame and determines where in the “6B3” frame the extended motion candidate passes. The IMVF generation unit 123 then generates the Interpolated MV Field of the forward-prediction “6B3” frame by assigning the extended motion candidate to the block it passes through.
- the IMVF generation unit 123 generates the current motion information of the “7B5” frame by using the generated forward prediction and backward prediction Interpolated MV Field of the “7B5” frame, as indicated by the dashed arrow.
- the Interpolated MV Field of the forward prediction of the “7B5” frame is information generated after the current motion information generation processing of the “2B4” frame described in FIG. 12 or the “6B3” frame described above in FIG.
- the Interpolated MV Field of the backward prediction of the “7B5” frame is information generated after the current motion information generation processing of the “4B6” frame described above with reference to FIG.
- Assignment Map: when using an assignment map in generating an Interpolated MV Field, one assignment map is added to one Interpolated MV Field.
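A hedged sketch of such an assignment map follows; the per-block boolean map and the first-come-first-served policy are assumptions used only to illustrate the bookkeeping, not the publication's priority rule.

```python
# Sketch: an assignment map records, per block of the Interpolated MV Field,
# whether a motion candidate has already been assigned.
class InterpolatedMVField:
    def __init__(self, blocks_w, blocks_h):
        self.candidates = {}                                             # (bx, by) -> mv
        self.assigned = [[False] * blocks_w for _ in range(blocks_h)]    # assignment map

    def try_assign(self, bx, by, mv):
        if self.assigned[by][bx]:
            return False          # a candidate is already recorded for this block
        self.assigned[by][bx] = True
        self.candidates[(bx, by)] = mv
        return True

field = InterpolatedMVField(blocks_w=4, blocks_h=4)
print(field.try_assign(1, 2, (8.0, -4.0)))   # True  - first candidate accepted
print(field.try_assign(1, 2, (2.0, 1.0)))    # False - block already assigned
```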
- the setting of the priority order described above may be fixed depending on the operation mode, or a flag indicating the priority order may be included in the encoded data. As a result, it is possible to select a priority ordering method suitable for the processing system from the viewpoint of image quality, memory access, processing time, and the like.
- the encoding device 201 includes an A/D conversion unit 211, a screen rearrangement buffer 212, an operation unit 213, an orthogonal transformation unit 214, a quantization unit 215, a lossless encoding unit 216, a storage buffer 217, an inverse quantization unit 218, an inverse orthogonal transformation unit 219, and an addition unit 220. The encoding device 201 further includes a filter 221, a frame memory 222, a switch 223, an intra prediction unit 224, a motion prediction/compensation unit 225, an IMVF generation unit 226, a predicted image selection unit 227, a rate control unit 228, and a setting unit 229.
- In step S221, the motion prediction/compensation unit 225 performs a motion prediction/compensation process on the current block. Specifically, the motion prediction/compensation unit 225 reads reference frame candidates from the frame memory 222 via the switch 223 for all inter prediction modes, and detects a motion vector (RefIdx, MV).
- a motion vector (RefIdx, MV)
- In step S249, the frame memory 222 stores the current frame supplied from the adding unit 220 in the cache, and stores the coded picture supplied from the filter 221 in the DRAM. Pixels adjacent to the current block in the current frame stored in the cache are output to the intra prediction unit 224 via the switch 223 as peripheral pixels. The current frame stored in the cache and the encoded frame stored in the DRAM are output to the motion prediction/compensation unit 225 via the switch 223 as reference frames.
- In step S254, the accumulation buffer 217 outputs the stored encoded data to the subsequent stage. Then, the process returns to step S224 in FIG. 20, and the encoding process ends.
- the input / output interface 305 is further connected to the bus 304.
- the input / output interface 305 is connected to an input unit 306, an output unit 307, a storage unit 308, a communication unit 309, and a drive 310.
- when one step includes a plurality of processes, the plurality of processes included in the one step can be executed by one device or can be shared and executed by a plurality of devices.
- the decoding device determines a motion candidate used for generating the motion candidate information based on the motion information.
- the decoding device, wherein the motion candidate information generation unit, when a plurality of pieces of the candidate motion information exist in one reference frame, generates the motion candidate information by giving priority to any one of first motion information having a small motion, second motion information whose decoding processing order is earlier than the current frame, and third motion information whose decoding processing order is later than the current frame.
- the motion candidate information generation unit, when the candidate motion information is present in each of a plurality of reference frames, generates the motion candidate information by giving priority to any one of first motion information of a frame whose display order is closer to the current frame, second motion information of a frame whose decoding processing order is earlier than the current frame, and third motion information of a frame whose decoding processing order is closer to the current frame;
- the decoding device according to (4).
- a setting unit that sets a priority order of the first to third motion information according to an improvement in accuracy of the motion candidate or according to a load reduction by referring to the motion candidate;
- the decoding device according to (8), wherein the motion candidate information generation unit generates the motion candidate information by giving priority to any of the first to third motion information based on the set priority order.
- the decoding device, wherein when the candidate motion information is present in each of a plurality of reference frames, the first motion information is that of a frame whose display order is closer to the current frame, the second motion information is that of a frame whose decoding processing order is earlier than the current frame, and the third motion information is that of a frame whose decoding processing order is closer to the current frame.
- the setting unit sets the priority of the first motion information higher when improving the accuracy of the motion candidate is prioritized, and sets the priority of the second motion information or the third motion information higher when reducing the load of referring to the motion candidate is prioritized;
- the decoding device according to (9).
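As a hedged illustration of this priority switch (the exact ordering and the data layout are assumptions inferred from the description above, not the claimed rule):

```python
# Sketch: choose among first/second/third motion information according to a
# priority aimed at accuracy or at load reduction.
def pick_candidate(candidates, goal):
    # "first"  = motion information with small motion,
    # "second" = motion information decoded before the current frame,
    # "third"  = motion information decoded after the current frame.
    if goal == "accuracy":
        order = ["first", "second", "third"]   # small-motion candidate first
    else:  # "load_reduction"
        order = ["second", "third", "first"]   # assumed ordering for low load
    for key in order:
        if key in candidates:
            return candidates[key]
    return None

cands = {"first": (5, 5), "second": (1, 0)}
print(pick_candidate(cands, goal="accuracy"))        # (5, 5)
print(pick_candidate(cands, goal="load_reduction"))  # (1, 0)
```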
- the motion candidate information generation unit holds assignment information indicating whether or not the motion candidate has been determined for a predetermined block in the current frame, and determines the motion candidate of the predetermined block based on the assignment information.
- the decoding device according to any one of (4) to (10).
- the decoding device generates motion candidate information of the current frame based on the motion information of the decoded reference frame among the reference frames up to a predetermined number counted from the reference frame whose display order is closest to the current frame; a decoding method.
- (13) a program that causes a computer to function as a motion candidate information generation unit that generates motion candidate information of the current frame based on the motion information of the decoded reference frame among the reference frames up to a predetermined number counted from the reference frame whose display order is closest to the current frame.
- the motion candidate information of the current frame is generated based on the motion information of the encoded reference frame among the reference frames up to a predetermined number counted from the reference frame whose display order is closest to the current frame.
- 101 decoding device, 112 lossless decoding unit, 122 motion vector generation unit, 123 IMVF generation unit, 124 motion compensation unit, 201 encoding device, 216 lossless encoding unit, 225 motion prediction/compensation unit, 226 IMVF generation unit
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
The present technology relates to a decoding device and method, an encoding device and method, and a program that enable pipelined parallel operation in units of encoding or decoding processing blocks. An IMVF generation unit generates motion candidate information of a current frame on the basis of motion information of an already-decoded reference frame, from among up to a prescribed number of reference frames counted from the reference frame closest in display order to the current frame. The present technology can be applied to an encoding device or a decoding device.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018165042A JP2021192468A (ja) | 2018-09-04 | 2018-09-04 | Decoding device and method, encoding device and method, and program |
| JP2018-165042 | 2018-09-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020050016A1 (fr) | 2020-03-12 |
Family
ID=69722557
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/032539 Ceased WO2020050016A1 (fr) | 2019-08-21 | Decoding device and method, encoding device and method, and program |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2021192468A (fr) |
| WO (1) | WO2020050016A1 (fr) |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018105582A1 (fr) * | 2016-12-09 | 2018-06-14 | Panasonic Intellectual Property Corporation of America | Encoding device, decoding device, encoding method, and decoding method |
-
2018
- 2018-09-04 JP JP2018165042A patent/JP2021192468A/ja active Pending
-
2019
- 2019-08-21 WO PCT/JP2019/032539 patent/WO2020050016A1/fr not_active Ceased
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018105582A1 (fr) * | 2016-12-09 | 2018-06-14 | Panasonic Intellectual Property Corporation of America | Encoding device, decoding device, encoding method, and decoding method |
Non-Patent Citations (2)
| Title |
|---|
| CHEN, JIANLE ET AL.: "Algorithm Description of Joint Exploration Test Model 5 (JEM 5)", JOINT VIDEO EXPLORATION TEAM (JVET) OF ITU-T SG 16 WP 3 AND ISO/IEC JTC 1/SC 29/WG11 5TH MEETING, no. JVET-E1001-v2, 11 February 2017 (2017-02-11), Geneva, CH, pages 15 - 27, XP030150648 * |
| LI JINGYA ET AL.: "CE9: Bilateral matching (Test 9.2.3)", JOINT VIDEO EXPLORATION TEAM (JVET) OF ITU-T SG 16 WP 3 AND ISO/IEC JTC 1/SC 29/WG11 11TH MEETING, no. JVET-K0177, 3 July 2018 (2018-07-03), Ljubljana, SI, XP030198811 * |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2021192468A (ja) | 2021-12-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN113259661B (zh) | Method and apparatus for video decoding | |
| JP2024157018A (ja) | Video encoding method, video decoding method, and recording medium | |
| US9066104B2 | Spatial block merge mode | |
| KR102257187B1 (ko) | Technique for conditionally signaling reference picture list change information | |
| US10652570B2 | Moving image encoding device, moving image encoding method, and recording medium for recording moving image encoding program | |
| CN110809887A (zh) | Motion vector refinement for multi-reference prediction | |
| US11102494B2 | Method for scanning transform coefficient and device therefor | |
| JP2017184266A (ja) | Image decoding device using enhanced CABAC decoding | |
| US20130182768A1 | Method and apparatus for encoding / decoding video using error compensation | |
| US10171809B2 | Video encoding apparatus and video encoding method | |
| CN114071158B (zh) | Motion information list construction method, apparatus, and device in video coding and decoding | |
| JP7637790B2 (ja) | Adaptive application of generalized sample offset | |
| JP2024003175A (ja) | Image decoding device, image decoding method, and program | |
| JP2025100668A (ja) | Method and apparatus for video coding | |
| WO2024147277A1 (fr) | Mesh decoding device, mesh decoding method, and program | |
| CN107071406B (zh) | Moving image decoding method and encoding method | |
| JPWO2016194380A1 (ja) | Moving image encoding device, moving image encoding method, and recording medium storing a moving image encoding program | |
| JP6962193B2 (ja) | Moving image encoding device, moving image encoding method, and recording medium storing a moving image encoding program | |
| CN116805969A (zh) | Video encoding/decoding method and apparatus, computer-readable medium, and electronic device | |
| WO2020050016A1 (fr) | Decoding device and method, encoding device and method, and program | |
| KR101841352B1 (ko) | Reference frame selection method and apparatus | |
| KR101688085B1 (ko) | Video encoding method and apparatus for fast intra prediction | |
| CN112313957B (zh) | Video encoding or decoding device, video encoding or decoding method, program, and recording medium | |
| JPWO2010001832A1 (ja) | Moving image predictive encoding device and moving image predictive decoding device | |
| JP2018037936A (ja) | Image encoding device and image decoding device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19856529 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19856529 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |