GB2383506A - Trellis decoding in parallel where extra trellis sections are appended - Google Patents

Trellis decoding in parallel where extra trellis sections are appended

Info

Publication number
GB2383506A
Authority
GB
United Kingdom
Prior art keywords
trellis
metrics
section
decoding
length
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0130752A
Other versions
GB0130752D0 (en)
Inventor
Timothy Fisher-Jeffes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aeroflex Cambridge Ltd
Original Assignee
Ubinetics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubinetics Ltd
Priority to GB0130752A
Publication of GB0130752D0
Priority to AU2002356288A
Priority to PCT/GB2002/005622
Publication of GB2383506A
Status: Withdrawn

Classifications

    • H03M13/3905 Maximum a posteriori probability [MAP] decoding or approximations thereof based on trellis or lattice decoding, e.g. forward-backward algorithm, log-MAP decoding, max-log-MAP decoding
    • H03M13/3723 Decoding methods or techniques, not specific to the particular type of coding provided for in groups H03M13/03 - H03M13/35, using means or methods for the initialisation of the decoder
    • H03M13/3972 Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes, using sliding window techniques or parallel windows
    • H03M13/41 Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes, using the Viterbi algorithm or Viterbi processors
    • H03M13/4169 Sequence estimation using the Viterbi algorithm or Viterbi processors, implementing path management using traceback
    • H04L1/0054 Maximum-likelihood or sequential decoding, e.g. Viterbi, Fano, ZJ algorithms
    • H04L1/0055 MAP-decoding

Landscapes

  • Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Error Detection And Correction (AREA)

Abstract

A trellis 30 representing a received signal sequence to be decoded using a Viterbi technique is divided into lengths 32-38 for processing in parallel. Where reliable metrics do not exist for the start (or end) of a length, the length is prefixed (or appended) with an extra section w of the trellis. As metrics are calculated (or trace-back is processed) through an extended length, the results become reliable by the end of the extra section w. A similar technique (Figure 7) is used to parallelise Log-MAP decoding.

Description

TRELLIS DECODING

The invention relates to methods of, and apparatus for, performing trellis decoding.
Trellis decoders are used to decode received data sequences in telecommunication systems. Trellis decoders come in two distinct algorithmic forms, i.e. Viterbi algorithms (SOVA and Viterbi decoders) and Log MAP algorithms (Log MAP and Max Log MAP decoders). The invention is, at the least, applicable to trellis decoders of the aforementioned types.
As is well known, a trellis decoder processes recursively through a received bit sequence and generates the best estimates of the actual bits that were encoded by the transmitter.
The decoding process starts at one end of the received sequence using the a-priori information that the encoder started in the all zero state at the beginning of the received sequence. Similarly the decoding process ends at the other end of the received sequence using the a-priori information that the encoder was brought back to the all zero state again at the end of the received sequence.
In a trellis decoder, probability information, usually called a metric, is calculated for each of the possible paths leading to the states at each stage of the trellis. Hence these probabilities tend to be associated with the state rather than the path that leads to this state.
The labels"state"and"path"are often used interchangeably for these metrics. As is well known, the metrics are developed recursively using add-compare-select (ACS) operations.
The recursion is initialised at the either end of the received sequence using the a-priori information that encoding started or ended at the all zero state. This involves setting the zero-th state metric to zero and all other metrics to-oo. The metrics are used in a well known manner to decode the received sequence. The way in which the metrics are calculated and the way in which they are then used in the decoding process depends upon whether the decoding process uses a Viterbi or Log MAP algorithm.
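By way of illustration only (the following sketch is not part of the patent disclosure), the initialisation just described can be expressed in a few lines of Python; the helper name and the constraint-length-4 example are assumptions.

```python
# Hypothetical sketch of the metric initialisation described above: the encoder
# is known to start (or end) in the all zero state, so the zero-th state metric
# is set to zero and every other state metric to minus infinity.
NEG_INF = float("-inf")

def init_state_metrics(num_states: int) -> list[float]:
    """Return initial metrics in which only the all zero state is reachable."""
    metrics = [NEG_INF] * num_states
    metrics[0] = 0.0
    return metrics

# Example: a constraint length K = 4 code has 2 ** (K - 1) = 8 states.
print(init_state_metrics(8))
```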
Figure 1 illustrates a portion of a trellis in Viterbi decoding. Figure 1 shows two consecutive stages of the trellis, k and k+1. The metrics for each of the states 000 to 111 of stage k are used to calculate the metrics for the states in stage k+1 of the trellis. The
calculation of the metric for state 001 of stage k+1 will now be discussed briefly. The metric to be calculated is m_{x,k+1}, denoting that the metric belongs to state x (= 001) at stage k+1. From Figure 1, it will be noted that metric m_{x,k+1} is calculated using an add compare select operation involving the metrics for states y and z in stage k. Figure 1 also illustrates how add compare select operations can be paired in a butterfly calculation (shown in bold). Metric m_{x,k+1} is calculated using metrics m_{y,k} and m_{z,k} and incremental reliability information γ_{1,y,k} and γ_{1,z,k}, i.e.:

m_{x,k+1} = max(m_{y,k} + γ_{1,y,k}, m_{z,k} + γ_{1,z,k}) + f(γ_{1,y,k}, γ_{1,z,k})

where v_{s,i,l} is the l-th hypothetical encoded bit given the hypothetical state s and the hypothetical input bit i, and R is the rate of the code, i.e. how many output bits are generated by the encoder for each input bit. Incremental metrics γ are effectively correlations between the hypothetical encoded sequence given state s and input i and the actual received encoded sequence.

Trace-back information is recorded for each state of stage k+1, allowing the outcomes of the add compare select operations to be recorded. For example, if the ACS operation producing metric m_{x,k+1} bases metric m_{x,k+1} on metric m_{y,k}, then the trace-back information for state x of stage k+1 of the trellis is a zero. If, on the other hand, metric m_{x,k+1} is based on metric m_{z,k}, then the trace-back information stored for state x of stage k+1 is a 1. The trace-back information is the least significant bit of the binary number identifying the state at stage k which wins the ACS operation.
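A minimal sketch of a single add-compare-select step with trace-back recording, assuming the radix-2 butterfly of Figure 1; the function name, its argument order and the use of a plain max with no correction term are illustrative assumptions rather than details taken from the patent.

```python
def acs_step(m_y: float, gamma_y: float, m_z: float, gamma_z: float):
    """One add-compare-select operation producing the metric of state x at stage k+1.

    Returns the new metric and the trace-back bit: 0 if the survivor came from
    state y, 1 if it came from state z (i.e. the least significant bit of the
    winning predecessor state, as described above).
    """
    candidate_y = m_y + gamma_y          # add
    candidate_z = m_z + gamma_z
    if candidate_y >= candidate_z:       # compare
        return candidate_y, 0            # select: survivor comes from state y
    return candidate_z, 1                # select: survivor comes from state z

# Illustrative numbers only.
new_metric, traceback_bit = acs_step(1.2, 0.5, 0.9, 0.4)
```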
After the metrics for stage k+1 have been calculated, the metrics for stage k are discarded, and the metrics for stage k+1 are then used to generate the metrics for stage k+2 and the
associated trace-back information, and so on. In this way, trace-back information can be calculated for each state at each stage of the trellis. Upon reaching the end of the trellis for the received sequence, the trace back information is followed backwards from the all zero state of the last stage of the trellis to decode the received sequence.
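A sketch of the trace-back pass just described, under the assumed shift-register convention in which the newest input bit enters the state register at its most significant end, so that the stored trace-back bit is the least significant bit of the winning predecessor state; the function and the array layout are hypothetical, not taken from the patent.

```python
def trace_back(tb: list[list[int]], final_state: int, constraint_length: int) -> list[int]:
    """Follow stored trace-back bits from final_state at the last stage back to
    stage 0 and return the decoded bits in transmission order.

    tb[k][s] is the trace-back bit recorded for state s at stage k+1, i.e. the
    least significant bit of the winning predecessor state at stage k.
    """
    state_bits = constraint_length - 1
    mask = (1 << state_bits) - 1
    state = final_state
    decoded = []
    for k in range(len(tb) - 1, -1, -1):
        decoded.append(state >> (state_bits - 1))      # newest input bit at this stage
        state = ((state << 1) & mask) | tb[k][state]   # reconstruct the predecessor state
    decoded.reverse()
    return decoded
```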
Log MAP algorithms work in a similar way, as will now be discussed with reference to Figures 2 and 3. In Log MAP decoding, metrics of a first kind are calculated for each stage of the trellis starting with the beginning of the received sequence and ending at the end of the received sequence, and for argument's sake these metrics will be called forward metrics since they recurse in the direction of the received sequence. These metrics are calculated in the same way as the metrics m in the Viterbi decoding, but are assigned the symbol α to distinguish them from other metrics that are calculated in Log MAP decoding (as will shortly be described). The α metrics are known as forward metrics. For example, Figure 2 illustrates the calculation of metric α_{x,k+1} by performing an ACS operation on metrics α_{y,k} and α_{z,k}. Whereas in Viterbi decoding m metrics are discarded subsequent to forming the metrics of the next stage, α metrics are stored for each stage of the trellis and no trace-back information is recorded.
As mentioned above, an additional set of metrics is determined in Log MAP decoding. These are the β metrics, and for argument's sake these will be called reverse metrics. To generate the β metrics, the trellis is processed from its end to its beginning, i.e. in the reverse direction to the processing performed to generate the α metrics. The β metrics are generated using add compare select operations in much the same way as the α and m metrics. Figure 3 illustrates the generation of the metric β_{y,k} from the metrics β_{x,k+1} and β_{w,k+1} in an ACS operation. Again, β metrics are stored for each stage of the trellis. Once α and β metrics have been determined for each stage of the trellis, the α and β metrics are used to calculate log-likelihood ratios in the process of decoding the received sequence represented by the trellis.
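For illustration, the sketch below combines α, γ and β values into log-likelihood ratios for a generic trellis; the transition-list format, the helper names and the use of the full Jacobian (max*) correction are assumptions rather than details taken from the patent.

```python
import math

NEG_INF = float("-inf")

def max_star(a: float, b: float) -> float:
    """max* operation used in Log MAP; Max Log MAP would drop the correction term."""
    if a == NEG_INF or b == NEG_INF:
        return max(a, b)
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def log_map_llrs(transitions, num_states):
    """transitions[k] lists (s_from, s_to, input_bit, gamma) for stage k -> k+1.

    Runs the forward (alpha) and reverse (beta) recursions, both initialised at
    the all zero state, and returns one log-likelihood ratio per trellis stage.
    """
    n = len(transitions)
    alpha = [[NEG_INF] * num_states for _ in range(n + 1)]
    alpha[0][0] = 0.0
    for k, stage in enumerate(transitions):                 # forward recursion
        for s_from, s_to, _, g in stage:
            alpha[k + 1][s_to] = max_star(alpha[k + 1][s_to], alpha[k][s_from] + g)
    beta = [[NEG_INF] * num_states for _ in range(n + 1)]
    beta[n][0] = 0.0
    for k in range(n - 1, -1, -1):                          # reverse recursion
        for s_from, s_to, _, g in transitions[k]:
            beta[k][s_from] = max_star(beta[k][s_from], beta[k + 1][s_to] + g)
    llrs = []
    for k, stage in enumerate(transitions):                 # combine into LLRs
        num, den = NEG_INF, NEG_INF
        for s_from, s_to, bit, g in stage:
            metric = alpha[k][s_from] + g + beta[k + 1][s_to]
            if bit == 1:
                num = max_star(num, metric)
            else:
                den = max_star(den, metric)
        llrs.append(num - den)
    return llrs
```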
In a real implementation of a trellis decoder, the amount of memory to store either trace-back information (in Viterbi decoding) or forward and reverse metric values (in Log MAP decoding) is a linear function of the trellis length and in almost all practical cases it
becomes infeasible to decode the whole trellis in one decode operation since the memory requirement is too onerous. The memory requirement can be reduced by storing trace-back information or forward and reverse metrics over a window length L of a total trellis of length B, using the stored information to decode the portion of the received sequence corresponding to the window length and then making the memory available for reuse. The process of performing trellis decoding on window lengths will now be described for each of Viterbi and Log MAP decoding.
Figure 4 illustrates the Viterbi decoding of a trellis 10 of length B over four consecutive window lengths 12, 14, 16 and 18. The decoding process begins normally at the start of the trellis, i.e. at the start of window length 12. Metric values m and trace-back data are calculated for each stage of window length 12 in the normal way. However, the process does not stop after a length L of the trellis but continues until length L+δ. In other words, metrics and trace-back data are calculated through an extra length δ of the trellis at the end of the window length 12. δ is known as the path truncation length.
The metric values produced for the stage at length L+δ of the trellis are evaluated and the largest metric is selected. The trace-back data is followed back from the selected metric to stage 0 of the window length to decode the portion of the received signal sequence corresponding to the first window length 12.
In order to decode window length 14, the memory is reused, although the trace-back data for stages L to L+δ are retained in the memory since they belong to window length 14. Using the state metrics calculated for stage L+δ, state metrics are calculated for the stages of window length 14 and the resulting trace-back data is placed in the memory. Again, a path truncation length δ is appended to the window length and processing continues up to stage 2L+δ. The trace-back data is again followed back from the largest metric at stage 2L+δ to decode window length 14 of the trellis. Window length 16 of the trellis is processed in like manner to window length 14, and final window length 18 is decoded using the a-priori knowledge that the state occupied at the final stage is the all zero state. It is apparent that even though the trellis 10 has been divided into lengths 12 to 18, the window
lengths must be processed in the order 12, 14, 16 and 18, i.e. the window lengths must be processed in series.
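The serial window schedule of Figure 4 can be summarised by the control-flow sketch below. The callables init_metrics, advance, best_state and trace_back are hypothetical stand-ins for the operations described above (ACS with storage of trace-back data, selection of the largest metric, and trace-back over one window); none of them is defined by the patent.

```python
def serial_windowed_viterbi(B, L, delta, init_metrics, advance, best_state, trace_back):
    """Decode a trellis of length B in consecutive windows of length L.

    Each window is extended by the path truncation length delta; the largest
    metric at the extended end seeds the trace-back for that window, and the
    metrics and trace-back data already computed over the extension are reused
    by the next window, so the windows must be processed in series.
    """
    metrics = init_metrics()                       # all zero state known at stage 0
    decoded = []
    computed = 0                                   # stages whose metrics/trace-back already exist
    for start in range(0, B, L):
        end = min(start + L + delta, B)            # no truncation length at the trellis end
        for stage in range(computed, end):
            metrics = advance(metrics, stage)      # ACS step that also stores trace-back data
        computed = end
        # Trace back from the best metric at the extended end; keep this window's bits.
        decoded.extend(trace_back(best_state(metrics), end, start))
    return decoded
```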
Figure 5 illustrates the Log MAP decoding of a trellis 20 in separate window lengths 22, 24, 26 and 28. The process of generating the forward and reverse metrics begins with window length 22. In the normal way, the forward metrics (α) are generated for each stage of window length 22 from the beginning of the trellis up to the stage at position L. The reverse metrics are generated by beginning at position L+δ rather than position L of window length 22, i.e. a path truncation length δ is appended to the end of window length 22 and the β metrics are calculated beginning from the L+δ end of the extended window length 22. The β metrics for the states at the stage lying at position L+δ of the trellis are all set to the same value. As the β metrics of successive stages of the window length 22 are generated by recursively processing from position L+δ, it has been shown that, if δ is more than 5K (K being the constraint length of the code), then by the time position L is reached the β metrics have become reliable, in the sense that in good signal-to-noise conditions the largest β metric is more likely to be the one associated with the correct state at this stage in the trellis. Secondly, the differences between β metrics will be close to those that would have been calculated had one started at the far end (stage 4L) of the trellis.
Once α and β metrics have been recorded for window length 22, they are used to decode this portion of the received sequence and are then discarded, with the exception of the α metrics for position L of window length 22, which are used as the α metrics of the beginning of window length 24. The α metrics for window length 24 are calculated forward in the same manner as for window length 22. The β metrics for window length 24 are again calculated by appending a path truncation length δ to the end of window length 24 and recursively processing backward from the 2L+δ position at the end of the extended window length 24. Thus, the window lengths are processed sequentially until the end of the trellis is reached. Note that a path truncation length does not need to be added to the end of window length 28, since the metrics for the 4L end of window length 28 are known because the trellis terminates in the all zero state.
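A sketch of the reverse-metric warm-up for one interior window of Figure 5: the β metrics are set to a common value at position L+δ and recursed back to position L, at which point they are treated as reliable. The transition-list format and the max* helper are illustrative assumptions.

```python
import math

NEG_INF = float("-inf")

def max_star(a: float, b: float) -> float:
    if a == NEG_INF or b == NEG_INF:
        return max(a, b)
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def warmed_up_betas(truncation_transitions, num_states):
    """Run the beta recursion backwards over an appended path truncation length.

    truncation_transitions[k] lists (s_from, s_to, gamma) for stage k -> k+1 of
    the truncation length (positions L .. L+delta). All betas start equal at its
    far end; the values returned for position L are treated as reliable.
    """
    beta = [0.0] * num_states                      # equal values at position L + delta
    for stage in reversed(truncation_transitions):
        new_beta = [NEG_INF] * num_states
        for s_from, s_to, g in stage:
            new_beta[s_from] = max_star(new_beta[s_from], beta[s_to] + g)
        beta = new_beta
    return beta                                    # beta metrics at position L
```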
One aim of the invention is to provide an improved trellis decoding technique.
According to one aspect the invention provides a method of decoding a section of a trellis representing a received signal sequence, comprising modifying an end of the section by attaching an extra trellis portion to it, calculating metrics through the section beginning at the modified end, calculating trace-back data for the section from the metrics and using the trace-back data to decode the section.
The invention also consists in apparatus for decoding a section of a trellis representing a received signal sequence, comprising means for modifying an end of the section by attaching an extra trellis portion to it, means for calculating metrics through the section beginning at the modified end, means for calculating trace-back data for the section from the metrics and means for using the trace-back data to decode the section.
According to another aspect, the invention provides a method of decoding a section of a trellis representing a received signal sequence, comprising modifying an end of the section by attaching an extra trellis portion to it, calculating first metrics through the section beginning at the modified end, calculating second metrics through the section in the opposite direction and decoding the section using the first and second metrics.
The invention also consists in apparatus for decoding a section of a trellis representing a received signal sequence, comprising means for modifying an end of the section by attaching an extra trellis portion to it, means for calculating first metrics through the section beginning at the modified end and means for calculating second metrics through the section in the opposite direction and means for decoding the section using the first and second metrics.
The invention allows the metrics for a length of trellis to be calculated in the absence of metric values being provided for the start of the length. This facilitates the processing of several lengths of a trellis in parallel which means that the entire trellis can be decoded in a shorter time.
In one embodiment, the trellis decoding scheme employed is a Viterbi technique and the metrics are m metrics. In another embodiment, the trellis decoding technique is a Log MAP scheme and the metrics are α metrics.
The invention also extends to a computer program for performing a trellis decoding method according to the invention. Such programs can be held on any suitable storage device or in any suitable storage medium.
The trellis decoding scheme according to the invention operates on a received signal sequence, and the received signal sequence may be, for example, a received telecommunications signal or a high speed output from a storage device such as a hard disk drive.
By way of example only, certain embodiments of the invention will now be described with reference to the accompanying figures, in which:
Figure 1 illustrates the calculation of metrics in a Viterbi decoding scheme;
Figure 2 illustrates the calculation of forward metrics in a Log MAP decoding scheme;
Figure 3 illustrates the calculation of reverse metrics in a Log MAP decoding scheme;
Figure 4 illustrates Viterbi decoding via a serial window technique;
Figure 5 illustrates Log MAP decoding via a serial window technique;
Figure 6 illustrates Viterbi decoding via a parallel window technique;
Figure 7 illustrates Log MAP decoding via a parallel window technique; and
Figure 8 illustrates an architecture for performing parallelised Viterbi or Log MAP decoding.
Figure 6 illustrates the partitioning of a trellis 30 into window lengths 32, 34, 36 and 38 for Viterbi processing in parallel. The processing of window length 32 is the same as that for window length 12 in Figure 4. Since the other window lengths 34, 36 and 38 are processed in parallel with window length 32, trace-back information provided by the path truncation length δ at the end of window length 32 cannot be used to provide reliable initial metrics m for the start of window length 34.
To overcome this problem, a path truncation length δ is prefixed to window length 34.
Thus, the metrics m are computed forward from position L-δ, which is effectively the initial stage of window length 34. In the initial stage, at L-δ, the metrics for all of the states are set to the same value. By the time the metrics have been calculated as far forward as position L, i.e. at the end of the prefixed path truncation length, the metrics have become reliable, and trace-back data is stored for each of the metrics from position L onwards. Note that, as with window length 32, a path truncation length δ is appended to window length 34. The metrics are computed up to the end of the appended path truncation length at position 2L+δ, with the trace-back data being stored along the way. The largest metric at position 2L+δ is then selected and used to trace a path back through window length 34 in order to decode the received data associated with this window length.
It will be apparent that window length 36 is processed in a similar way, with two path truncation lengths being added to the window length 36, one at either end. Note, however, that the processing of window length 38 is slightly different: a path truncation length is not appended to the end of window length 38, since the end of window length 38 is the end of the trellis.
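For illustration, the decoding of a single window of Figure 6 might be sketched as follows; the callables advance, best_state and trace_back are hypothetical, as before, and the special handling of the first and last windows follows the description above.

```python
def decode_window_in_parallel(window_index, L, delta, B, num_states,
                              advance, best_state, trace_back):
    """Decode one window of the trellis independently of the other windows.

    Interior windows are prefixed with a path truncation length delta over which
    all state metrics start equal; the first window instead starts from the known
    all zero state. A truncation length is also appended, except at the very end
    of the trellis, where the all zero termination is known.
    """
    win_start = window_index * L
    win_end = min(win_start + L, B)

    if window_index == 0:
        metrics = [float("-inf")] * num_states
        metrics[0] = 0.0                           # encoder started in the all zero state
        run_start = 0
    else:
        metrics = [0.0] * num_states               # equal metrics at position win_start - delta
        run_start = win_start - delta

    run_end = min(win_end + delta, B)              # appended truncation length, unless at the trellis end
    for stage in range(run_start, run_end):
        metrics = advance(metrics, stage)          # trace-back data need only be kept from win_start on

    # Trace back from the best metric at the extended end (at the trellis end the
    # all zero state is known and could be selected directly); keep this window's bits.
    return trace_back(best_state(metrics), run_end, win_start)
```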
Figure 7 illustrates the division of a trellis 40 into window lengths 42, 44, 46 and 48 for Log MAP decoding in parallel. Here, the reverse metrics are calculated in the same way as in Figure 5, by appending path truncation lengths to the end of each of the window lengths 42, 44 and 46 to allow the β metrics to be processed recursively backwards and so provide reliable β metrics by the time the original ends (positions L, 2L and 3L) of the window lengths are reached. Because the window lengths 42, 44, 46 and 48 are processed in parallel,
reliable α metrics do not exist for passing to the beginnings of window lengths 44, 46 and 48. To overcome this, each of window lengths 44, 46 and 48 is prefixed with a path truncation length δ. In window length 44, the α metrics are calculated by setting all of the α metrics at position L-δ to the same value and recursively processing the α metrics forward to provide reliable α metrics by the time the true beginning of the window length 44 is encountered at position L. The α metrics are calculated in a similar manner for window lengths 46 and 48.
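The forward counterpart of the earlier β warm-up sketch: for an interior window of Figure 7 the α metrics start from a common value at position L-δ and are recursed forward over the prefixed truncation length, so that reliable values are available at the true window start. The same illustrative transition-list format is assumed.

```python
import math

NEG_INF = float("-inf")

def max_star(a: float, b: float) -> float:
    if a == NEG_INF or b == NEG_INF:
        return max(a, b)
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def warmed_up_alphas(prefix_transitions, num_states):
    """Run the alpha recursion forwards over a prefixed path truncation length.

    prefix_transitions[k] lists (s_from, s_to, gamma) for stage k -> k+1 of the
    prefix (positions L-delta .. L). All alphas start equal at L-delta; the values
    returned for position L are treated as reliable.
    """
    alpha = [0.0] * num_states                     # equal values at position L - delta
    for stage in prefix_transitions:
        new_alpha = [NEG_INF] * num_states
        for s_from, s_to, g in stage:
            new_alpha[s_to] = max_star(new_alpha[s_to], alpha[s_from] + g)
        alpha = new_alpha
    return alpha                                   # alpha metrics at the true window start
```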
Above, with reference to Figures 6 and 7, schemes for processing window lengths in parallel have been described for both Viterbi and Log MAP decoding. Figure 8 schematically illustrates an architecture for performing the parallel decoding. Essentially, the received sequence R is operated on using a number of processors P1 to Pn. The processors P1 to Pn each operate on a respective window length of the trellis of the received sequence R and each produce a corresponding decoded datastream d1 to dn. The outputs of the processors P1 to Pn are then combined to produce the decoded data sequence D corresponding to the received sequence R. Each of the processors P1 to Pn has a respective memory M1 to Mn. The memories M1 to Mn are used by the respective processors P1 to Pn during the decode operation. For example, where the architecture is used to perform parallel Viterbi decoding, the memories store inter alia trace-back data, and where the architecture is used to perform Log MAP decoding the memories are used to store inter alia α and β metrics. In the foregoing embodiments, the number of trellis stages in δ is preferably at least equal to 5 times the constraint length K of the encoding scheme used to generate the received signal, although the number of trellis stages in δ may of course be varied.
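The arrangement of Figure 8, with n processors P1 to Pn each holding its own memory M1 to Mn, maps naturally onto a pool of worker processes. The sketch below is a hypothetical illustration using Python's standard concurrent.futures module; decode_window stands for a per-window decoder such as the one sketched after Figure 6, and its signature is an assumption.

```python
from concurrent.futures import ProcessPoolExecutor

def parallel_trellis_decode(received, L, delta, num_windows, decode_window):
    """Decode every window of the trellis concurrently and concatenate the
    resulting streams d1..dn into the decoded sequence D, as in Figure 8.
    Each worker process plays the role of one processor P_i with its memory M_i.
    """
    with ProcessPoolExecutor(max_workers=num_windows) as pool:
        futures = [pool.submit(decode_window, received, i, L, delta)
                   for i in range(num_windows)]
        # Gather in window order so the decoded bits concatenate correctly.
        return [bit for future in futures for bit in future.result()]
```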

Claims (13)

  1. A method of decoding a section of a trellis representing a received signal sequence, comprising modifying an end of the section by attaching an extra trellis portion to it, calculating metrics through the section beginning at the modified end, calculating trace-back data for the section from the metrics and using the trace-back data to decode the section.
  2. A method of decoding a section of a trellis representing a received signal sequence, comprising modifying an end of the section by attaching an extra trellis portion to it, calculating first metrics through the section beginning at the modified end, calculating second metrics through the section in the opposite direction and decoding the section using the first and second metrics.
  3. A method according to claim 1 or 2, further comprising, before calculating metrics, modifying the other end of the section by attaching a second extra trellis portion to it.
  4. A method of decoding a trellis comprising decoding sections of the trellis in parallel, wherein a section is decoded by a method according to claim 1, 2 or 3 and another section is decoded by a method according to claim 1, 2 or 3.
  5. A method of decoding a trellis in sections, comprising attaching an extra trellis portion to a section and decoding said section in parallel with at least one other section.
  6. Apparatus for decoding a section of a trellis representing a received signal sequence, comprising means for modifying an end of the section by attaching an extra trellis portion to it, means for calculating metrics through the section beginning at the modified end, means for calculating trace-back data for the section from the metrics and means for using the trace-back data to decode the section.
  7. Apparatus for decoding a section of a trellis representing a received signal sequence, comprising means for modifying an end of the section by attaching an extra trellis portion to it, means for calculating first metrics through the section beginning at the modified end and means for calculating second metrics through the section in the opposite direction and means for decoding the section using the first and second metrics.
  8. Apparatus according to claim 6 or 7, wherein the modifying means is arranged to attach a second extra trellis portion to the section before metrics are calculated.
  9. Apparatus for decoding a trellis comprising a first apparatus according to claim 6, 7 or 8 and a second apparatus according to claim 6, 7 or 8, wherein the first and second apparatus are arranged to operate on trellis sections in parallel.
  10. Apparatus for decoding a trellis in sections comprising means for attaching an extra trellis portion to a section and means for decoding trellis sections in parallel.
  11. A programme for performing a method according to any one of claims 1 to 5.
  12. A method of decoding a trellis, substantially as hereinbefore described with reference to Figure 6 or 7.
  13. Apparatus for decoding a trellis, substantially as hereinbefore described with reference to Figure 6 or 7.
GB0130752A 2001-12-21 2001-12-21 Trellis decoding in parallel where extra trellis sections are appended Withdrawn GB2383506A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB0130752A GB2383506A (en) 2001-12-21 2001-12-21 Trellis decoding in parallel where extra trellis sections are appended
AU2002356288A AU2002356288A1 (en) 2001-12-21 2002-12-11 Trellis decoding with forward and backward iterations
PCT/GB2002/005622 WO2003056708A1 (en) 2001-12-21 2002-12-11 Trellis decoding with forward and backward iterations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0130752A GB2383506A (en) 2001-12-21 2001-12-21 Trellis decoding in parallel where extra trellis sections are appended

Publications (2)

Publication Number Publication Date
GB0130752D0 GB0130752D0 (en) 2002-02-06
GB2383506A true GB2383506A (en) 2003-06-25

Family

ID=9928250

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0130752A Withdrawn GB2383506A (en) 2001-12-21 2001-12-21 Trellis decoding in parallel where extra trellis sections are appended

Country Status (3)

Country Link
AU (1) AU2002356288A1 (en)
GB (1) GB2383506A (en)
WO (1) WO2003056708A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2403111A (en) * 2003-06-12 2004-12-22 Arithmatica Ltd Maximum likelihood detector which splits the trellis into sections in time and hierarchically processes it, combining adjacent sections at each level
US7822138B2 (en) 2003-11-04 2010-10-26 Forte Design Systems Limited Calculating apparatus and method for use in a maximum likelihood detector and/or decoder

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100542050C (en) * 2004-02-06 2009-09-16 中国科学院沈阳自动化研究所 A Design Method with Adaptive and High Speed Turbo Decoder
CN100542053C (en) * 2004-03-03 2009-09-16 中国科学院沈阳自动化研究所 A Design Method with Adaptive and High Speed Viterbi Decoder
US10425108B2 (en) * 2016-06-13 2019-09-24 Qualcomm Incorporated Tailless convolutional codes

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6209116B1 (en) * 1997-10-09 2001-03-27 Hughes Electronics Corporation Adaptable overlays for forward error correction schemes based on trellis codes
US6304612B1 (en) * 1998-05-26 2001-10-16 U.S. Philips Corporation Transmission system having a simplified channel decoder

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5933462A (en) * 1996-11-06 1999-08-03 Qualcomm Incorporated Soft decision output decoder for decoding convolutionally encoded codewords
US6754290B1 (en) * 1999-03-31 2004-06-22 Qualcomm Incorporated Highly parallel map decoder

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6209116B1 (en) * 1997-10-09 2001-03-27 Hughes Electronics Corporation Adaptable overlays for forward error correction schemes based on trellis codes
US6304612B1 (en) * 1998-05-26 2001-10-16 U.S. Philips Corporation Transmission system having a simplified channel decoder

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2403111A (en) * 2003-06-12 2004-12-22 Arithmatica Ltd Maximum likelihood detector which splits the trellis into sections in time and hierarchically processes it, combining adjacent sections at each level
GB2403111B (en) * 2003-06-12 2005-12-28 Arithmatica Ltd A maximum likelihood detector and/or decoder
US7263652B2 (en) 2003-06-12 2007-08-28 Arithmatica Limited Maximum likelihood detector and/or decoder
US7822138B2 (en) 2003-11-04 2010-10-26 Forte Design Systems Limited Calculating apparatus and method for use in a maximum likelihood detector and/or decoder

Also Published As

Publication number Publication date
AU2002356288A1 (en) 2003-07-15
WO2003056708A1 (en) 2003-07-10
GB0130752D0 (en) 2002-02-06

Similar Documents

Publication Publication Date Title
JP3801211B2 (en) Optimal soft output decoder for tail biting lattice codes
US8122327B2 (en) Symbol-level soft output viterbi algorithm (SOVA) and a simplification on SOVA
US5802116A (en) Soft decision Viterbi decoding with large constraint lengths
US7765459B2 (en) Viterbi decoder and viterbi decoding method
JPH08149018A (en) Error correcting device
EP1122890A2 (en) MAP decoding with parallelized sliding window processing
EP0647036A1 (en) Method and means for detecting partial response waveforms using a modified dynamic programming Heuristic
US6333954B1 (en) High-speed ACS for Viterbi decoder implementations
US6088405A (en) Optimal decoder for tall-biting convolutional codes
JP2000068861A (en) Viterbi decoder and viterbi decoding method
CN101425869A (en) Decoding method and apparatus
KR100336246B1 (en) Integrated circuit with digital processor and co-processor
US7590928B2 (en) Apparatus and method for Viterbi decoding
JP3233847B2 (en) Viterbi decoding method and Viterbi decoding circuit
GB2383506A (en) Trellis decoding in parallel where extra trellis sections are appended
US7266757B1 (en) Pipelined architecture implementing recursion processes for forward error correction
JP2005109771A (en) Maximum posterior probability decoding method and apparatus
US7173985B1 (en) Method and apparatus for implementing a Viterbi decoder
EP1463207A1 (en) Viterbi decoding for convolutional encoded signals with scattering of known bits
WO1995001008A1 (en) Bit error counting method and counter
EP1024603A2 (en) Method and apparatus to increase the speed of Viterbi decoding
JP2001352256A (en) Decoder and decoding method
EP1755228A1 (en) Viterbi decoding apparatus and viterbi decoding method
KR100564757B1 (en) Low Power Viterbi Decoder and Traceback Method
US7032165B2 (en) ACS unit in a decoder

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)