# Complexity Class - RP

## What Is an Error?

In the previous post, we discussed the fact that when we add randomisation, there is a chance that the algorithm outputs true when it should have output false, and vice versa.

This is an error. It may be the case that the Turing Machine can only err in the acceptance case, while in the rejection case it is always certain.

## One-Sided Error Example

An example of a one-sided error is determining whether a number is prime.

Let's say we want to find out whether 20 is prime.

The algorithm outputs YES if our number is prime and NO if it is not.

We generate a random number $r$ and compute $20 / r$:

- The first value for $r$ I get is 9.
- I check $20 / 9$ and notice that it is not an integer.
- output: YES

Does this mean that 20 is prime? No, it just means that we may not have found one of its factors.

- The second value for $r$ I get is 10.
- I check $20 / 10$ and notice that it is an integer.
- output: NO

This means that 20 is certainly composite.
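
Here is a minimal sketch of that toy test in Python (the function name and the exact range for $r$ are my own choices; real randomised primality tests such as Miller-Rabin are much cleverer):

```python
import random

def is_probably_prime(n: int) -> bool:
    """Toy one-sided test: NO (False) is always correct, YES (True) may be wrong."""
    r = random.randrange(2, n)  # pick a candidate divisor uniformly from 2..n-1
    if n % r == 0:              # r divides n, so n is certainly composite
        return False            # output: NO, with certainty
    return True                 # output: YES, but we may have missed a factor
```

Running `is_probably_prime(20)` a few times reproduces the walkthrough: sometimes `True` (a YES that may be an error), sometimes `False` (a NO that is certain).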

## Where Was the Error?

The error in this case was in the YES case: there was a probability that our YES was an error. But in the NO case, we were certain.

## Complexity Class - RP - One-Sided in the YES

We categorise problems where the error is one-sided in the YES case as the complexity class RP: a YES instance may be misclassified, but a NO instance never is. (To make the example above fit this shape, read it as a test for compositeness, where finding a factor means YES: a composite, a YES instance, may slip through undetected, while a prime, a NO instance, is never misclassified.)

## Complexity Class - CoRP - One-Sided in the NO

Similarly, we categorise problems where the error is one-sided in the NO case as the complexity class CoRP.

## Complexity Class - RP - Formally

Formally, we define RP as the class of all languages $L$ for which there exists a probabilistic polynomial-time Turing Machine $M$ such that:

$$x \in L \implies Prob_r[M(x) = \text{YES}] \geq 1/2$$

$$x \notin L \implies Prob_r[M(x) = \text{NO}] = 1$$

## Breakdown

Where did the 1/2 come from?

It came from a place where the sun doesn't shine.

Without repeating the previous post: this number does not matter. We care about classifying the problem in RP, not about the exact probability. For this reason, you may see $1/3$ in other papers. Moreover, notice that running the algorithm multiple times only increases the confidence you have in the result; the problem still stays in RP regardless of how many times you run it.

If, like me, you cannot sleep at night due to the $1/2$, you can replace it with $1 - \gamma$. In the case above, $\gamma = 1/2$. Running the algorithm a second time makes $\gamma = 1/4$, and our confidence increases to $1 - 1/4 = 3/4$.
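
Here is that arithmetic as a short sketch, assuming each independent run errs with probability at most $\gamma = 1/2$:

```python
gamma = 1 / 2  # error bound for a single run
for runs in range(1, 6):
    # A wrong answer must fool every run, so the error shrinks to gamma**runs.
    print(f"{runs} run(s): confidence >= {1 - gamma ** runs}")
```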

What does the subscript $r$ mean?

Notice the $Prob_r$. This is important because if the randomness were biased, then our probability would be biased too. The $Prob_r$ says that when picking the random value, every value has an equal chance of being chosen, i.e. the distribution is uniform. If you find my explanation lacking, there is a treasure trove of information on uniform distributions.
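
If you want to see uniformity rather than take it on faith, a quick empirical check helps; this sketch uses Python's `random.randrange`, which draws uniformly from the given range:

```python
import random
from collections import Counter

# Draw r uniformly from {2, ..., 19} and count the outcomes; each of the
# 18 possible values should show up roughly 100000 / 18 ~ 5556 times.
counts = Counter(random.randrange(2, 20) for _ in range(100_000))
print(sorted(counts.items()))
```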

## Summary

Reading the first line in full: if $x$ is in $L$, then the probability, taken over the random choices $r$, that the Turing Machine $M$ outputs YES is at least $1/2$.

Reading the second line: if $x$ is not in $L$, then the probability that the Turing Machine $M$ outputs NO is $1$ (certain).
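
For a quick sanity check of both lines, here is a sketch that estimates the two probabilities for the toy test, read as an algorithm for COMPOSITES as discussed earlier (so YES means "a factor was found"; `is_probably_prime` is the toy function from before, and 6 is picked because the naive test happens to reach the $1/2$ bound there):

```python
trials = 100_000

# x = 6 is in L (composite): Pr[YES] should be >= 1/2.
# For n = 6, r is drawn from {2, 3, 4, 5} and two of the four values divide 6.
yes_rate = sum(not is_probably_prime(6) for _ in range(trials)) / trials
print(f"Pr[M(6) = YES] ~ {yes_rate:.3f}")  # close to 0.5

# x = 19 is not in L (prime): Pr[NO] must be exactly 1, since no r divides 19.
no_rate = sum(is_probably_prime(19) for _ in range(trials)) / trials
print(f"Pr[M(19) = NO] ~ {no_rate:.3f}")   # exactly 1.0
```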

## Complexity Class - CoRP - Formally

$$x \in L \implies Prob_r[M(x) = \text{YES}] = 1$$

$$x \notin L \implies Prob_r[M(x) = \text{NO}] \geq 1/2$$

Without wanting to sound like a party-pooper: I just switched the probabilities around :)

I won't break down the definition since it mirrors the one above. CoRP is the class of complements of RP languages: a language $L$ is in CoRP exactly when its complement $\bar{L}$ is in RP. Note that a language can be in both classes; every language in P, for example, is in both RP and CoRP. In this light, we can define CoRP using RP!
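
That last observation fits in one line of code: given an RP-style machine for the complement of $L$, flipping its answer yields a CoRP-style machine for $L$ (a sketch with hypothetical names):

```python
def corp_machine(x, rp_machine_for_complement) -> bool:
    """CoRP-style decider for L, built from an RP-style decider for L's complement.

    The RP machine's YES is always correct, so our NO is always correct;
    its occasional wrong NO becomes our occasional wrong YES.
    """
    return not rp_machine_for_complement(x)
```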