Locally testable code

A locally testable code is a type of error-correcting code for which membership of a string in the code can be determined by examining a small (frequently constant) number of bits of the string. In some situations, it is useful to know whether data is corrupted without decoding all of it, so that appropriate action can be taken in response. For example, in communication, if the receiver detects a corrupted codeword, it can request that the data be re-sent, improving the reliability of the received data. Similarly, in data storage, these codes allow damaged data to be detected, recovered, and rewritten properly.

In contrast, locally decodable codes use a small number of bits of the codeword to probabilistically recover the original information. The fraction of errors determines how likely it is that the decoder correctly recovers the original bit; however, not all locally decodable codes are locally testable.[1]

Clearly, any valid codeword should be accepted, but a string that is not a codeword may differ from one in only a single bit, and detecting this reliably would require many (certainly more than a constant number of) probes. To account for this, the tester is only required to reject strings that differ from every codeword in at least a set fraction of their bits. This implies that codewords must be longer than the original messages, i.e., the code must add some redundancy.

Definition

To measure the distance between two strings, the Hamming distance is used

\Delta(x,y)=|\{i:x_i \neq y_i\}|

The distance of a string w from a code C:\{0,1\}^k\to\{0,1\}^n is computed by

\Delta(w,C)=\min_{x\in\{0,1\}^k}\Delta(w,C(x))

Relative distances are computed as a fraction of the number of bits

\delta(x,y)=\Delta(x,y)/n \text{ and } \delta(w,C)=\Delta(w,C)/n
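
As an illustration, these quantities can be computed directly by brute force for small codes. The following sketch (our own, with illustrative helper names) assumes the code is given as an encoding function on k-bit messages represented as tuples of bits:

    from itertools import product

    def hamming_distance(x, y):
        # Delta(x, y): number of positions in which the two strings differ
        return sum(a != b for a, b in zip(x, y))

    def distance_to_code(w, encode, k):
        # Delta(w, C): minimum Hamming distance from w to any codeword C(x), x in {0,1}^k
        return min(hamming_distance(w, encode(x)) for x in product((0, 1), repeat=k))

    def relative_distance_to_code(w, encode, k):
        # delta(w, C) = Delta(w, C) / n, where n = len(w)
        return distance_to_code(w, encode, k) / len(w)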

A code C:\{0,1\}^k\to\{0,1\}^n is called q-local \delta-testable if there exists a Turing machine M that, given random access to an input w, makes at most q non-adaptive queries to w and satisfies the following:[2]

  • For any x\in\{0,1\}^k and w=C(x), Pr[M^w(1^k)=1]=1. In other words, M accepts given access to any codeword of C.
  • For w\in\{0,1\}^n such that \delta(w,C)>\delta, Pr[M^w(1^k)=1]\leq1/2. M must reject strings \delta-far from C at least half the time.
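
As a toy illustration of this definition (our own example, not taken from the sources cited here), consider the code that repeats a single message bit n times. A tester that compares a few random pairs of positions accepts every codeword with probability 1 and rejects \delta-far strings most of the time:

    import random

    def repetition_encode(bit, n):
        # C(bit): the single message bit repeated n times
        return [bit] * n

    def repetition_test(w, pairs=4):
        # Non-adaptive tester: query `pairs` random pairs of positions of w
        # (2 * pairs queries in total) and accept only if every pair agrees.
        n = len(w)
        return all(w[random.randrange(n)] == w[random.randrange(n)] for _ in range(pairs))

    # Codewords are always accepted.
    assert repetition_test(repetition_encode(1, 100))

    # A string that is 1/4-far from both codewords is rejected well over half the time:
    # each pair disagrees with probability 2 * (1/4) * (3/4) = 3/8, so four pairs
    # all agree with probability only (5/8)**4, roughly 0.15.
    far_word = [1] * 75 + [0] * 25
    print(sum(not repetition_test(far_word) for _ in range(10000)) / 10000)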

Limits

It remains an open question whether there are any locally testable codes of linear size, but there are several constructions that are considered "nearly linear":[3]

  1. Polynomial arbitrarily close to linear; for any \epsilon>0, n=k^{1+\epsilon}.
  2. Functions of the form n=k^{1+\epsilon(k)}, where \epsilon(k) is a function tending toward 0. This makes n closer to linear as k increases. For example:
    • 1/\log \log k
    • 1/(\log k)^c for some c\in (0,1)
    • \exp((\log \log \log k)^c)/\log k for c\in (0,1)

Both of these have been achieved, even with constant query complexity and a binary alphabet, for example with n=k^{1+1/(\log k)^c} for any c\in (0,1). The next nearly linear goal is length linear up to a polylogarithmic factor, n=k\cdot\text{poly}(\log k); no locally testable code satisfying this constraint has yet been constructed.[3]
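
To get a feel for how mild these blowups are, one can tabulate the ratio n/k for the construction n=k^{1+1/(\log k)^c}. A quick sketch (our own, using natural logarithms and c = 1/2 purely for concreteness):

    import math

    def blowup(k, c=0.5):
        # n / k for n = k ** (1 + 1 / (log k) ** c)
        return k ** (1 / math.log(k) ** c)

    for exp in (3, 6, 9, 12):
        k = 10 ** exp
        print(f"k = 10^{exp}: n/k is roughly {blowup(k):.0f}")

The ratio grows without bound, but far more slowly than k^{\epsilon} for any fixed \epsilon>0, which is the sense in which such codes are "nearly linear".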

Parallels

Locally testable codes have a lot in common with probabilistically checkable proofs (PCPs). This should be apparent from the similarities of their construction. In both, a verifier makes q non-adaptive random queries into a large string; valid objects must be accepted with probability 1, and invalid objects must be accepted with probability at most half. The major difference is that a PCP verifier accepts an input x if there exists some proof w such that M^w(x)=1. A locally testable code's tester, on the other hand, accepts w if it is a codeword. Many things can go wrong in assuming a PCP proof encodes a locally testable code. For example, the PCP definition says nothing about invalid proofs, only invalid inputs.

Despite this difference, locally testable codes and PCPs are similar enough that frequently, in constructing one, the other is constructed along the way.[4]

Examples

Hadamard code

One of the most famous error-correcting codes, the Hadamard code is a locally testable code. A message x is encoded in the Hadamard code as the linear function f(y)={\sum_i {x_i y_i}} (mod 2). This requires listing the value of this function for every possible y, so the codeword is exponentially longer than the message. To test whether a string w is a codeword of the Hadamard code, it suffices to test whether the function it encodes is linear. This means simply checking whether w(x)\oplus w(y)=w(x\oplus y) for uniformly random vectors x and y (where \oplus denotes bitwise XOR).
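
The encoding and a single round of this linearity test are short enough to sketch directly (our own illustration; the codeword is stored as a table mapping each vector y to f(y)):

    import random
    from itertools import product

    def hadamard_encode(msg):
        # Codeword: the value of f(y) = sum_i msg_i * y_i (mod 2) for every y in {0,1}^k
        return {y: sum(m & b for m, b in zip(msg, y)) % 2
                for y in product((0, 1), repeat=len(msg))}

    def blr_test(w, k):
        # One round of the linearity test: w(x) xor w(y) must equal w(x xor y)
        # for uniformly random x and y.
        x = tuple(random.randint(0, 1) for _ in range(k))
        y = tuple(random.randint(0, 1) for _ in range(k))
        return w[x] ^ w[y] == w[tuple(a ^ b for a, b in zip(x, y))]

    w = hadamard_encode((1, 0, 1))
    assert all(blr_test(w, 3) for _ in range(1000))  # valid codewords always pass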

It is easy to see that this equation holds for any valid encoding w, as that is the definition of a linear function. Somewhat harder, however, is showing that a string \delta-far from C is rejected with probability bounded below in terms of \delta. One bound is found by the direct approach of estimating the probability that exactly one of the three probes yields an incorrect result. Let A, B, and C be the events of w(x), w(y), and w(x\oplus y) being incorrect, and let E be the event that exactly one of these occurs. Note that x, y, and x\oplus y are pairwise independent, uniformly distributed vectors, so the events A, B, and C are pairwise independent and each occurs with probability at least \delta. This comes out to

\begin{align}P(E) &\geq P(A\cup B\cup C)-3P(A\cap B)\\
&\geq 3P(A)-3P(A\cap B)-3P(A\cap B)\\
&\geq 3\delta-6\delta^2\end{align}

This bound is useful for 0<\delta\leq5/16, but beyond that point 3\delta-6\delta^2 decreases and eventually drops below \delta. With additional work, it can be shown that the probability of rejection is bounded below by

f(\delta) = \begin{cases}
  3\delta-6\delta^2 & : 0\leq\delta\leq 5/16 \\
  45/128 & : 5/16\leq \delta\leq45/128 \\
  \delta & : 45/128 \leq \delta\leq 1/2
\end{cases}

For any given \delta, a single round of the test therefore accepts a \delta-far word with probability bounded away from 1, so the test can simply be repeated a constant number of times to bring the acceptance probability below 1/2.[3]
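
Concretely, since each round rejects a \delta-far word with probability at least f(\delta), about 1/f(\delta) rounds suffice, because (1-p)^{1/p}\leq 1/e<1/2. A sketch of this repetition (our own, reusing the blr_test helper from the earlier Hadamard sketch):

    import math

    def rejection_lower_bound(delta):
        # The piecewise lower bound f(delta) above on one round's rejection probability
        if delta <= 5 / 16:
            return 3 * delta - 6 * delta ** 2
        if delta <= 45 / 128:
            return 45 / 128
        return delta

    def repeated_blr_test(w, k, delta):
        # Repeat the single-round linearity test enough times that a word
        # delta-far from the Hadamard code is accepted with probability < 1/2.
        p = rejection_lower_bound(delta)
        rounds = math.ceil(1 / p)
        return all(blr_test(w, k) for _ in range(rounds))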

Long code

The Long code is another code with a very large blowup that is close to locally testable. Given an input i with 0\leq i<2^k (note that this takes k bits to represent), the function f_i(x)=x_i, which returns the i^{th} bit of its input, is evaluated on all possible 2^k-bit inputs x, and the codeword is the concatenation of these values (of length n=2^{2^k}). The way to locally test this, with some error, is to pick a uniformly random input x and set y=x, but with a small chance \mu>0 of flipping each bit. Accept a function f as a codeword if f(x)=f(y). If f is a codeword f_i, this will accept it as long as the bit x_i was unchanged, which happens with probability 1-\mu. This violates the requirement that codewords are always accepted, but may be good enough for some needs.[5]
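
A sketch of the encoding and of this noisy test (our own illustration; the parameters are kept tiny because the codeword length 2^{2^k} grows very quickly):

    import random

    def long_code_encode(i, k):
        # Codeword of i (0 <= i < 2**k): the truth table of the dictator function
        # f_i(x) = x_i, listed over all 2**k-bit inputs x.
        num_bits = 2 ** k
        return {x: (x >> i) & 1 for x in range(2 ** num_bits)}

    def noisy_dictator_test(f, k, mu=0.05):
        # Pick a uniformly random x, flip each of its bits independently with
        # probability mu to obtain y, and accept iff f(x) == f(y).
        num_bits = 2 ** k
        x = random.randrange(2 ** num_bits)
        y = x ^ sum(1 << j for j in range(num_bits) if random.random() < mu)
        return f[x] == f[y]

    f = long_code_encode(3, 2)                      # k = 2, codeword length 2**4 = 16
    accepts = sum(noisy_dictator_test(f, 2) for _ in range(10000))
    print(accepts / 10000)                          # roughly 1 - mu = 0.95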

Other locally testable codes include Reed-Muller codes (see locally decodable codes for a decoding algorithm), Reed-Solomon codes, and the short code.

References
