Perplexity is roughly the average number of choices a model faces when guessing the next item in a sequence. The perplexity of a prediction ranges from $1$ (100% certainty) to $V$, the vocabulary size, when the model has no information and must guess uniformly.
Perplexity is also $2$ raised to the power of the entropy:
$PP(T) = 2^{H(T)}$
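The formula above can be sketched in a few lines of Python. Here $H(T)$ is estimated as the average negative $\log_2$ probability the model assigned to each observed token; the helper name and inputs are illustrative, not from the text.

```python
import math

def perplexity(probs):
    """Perplexity of a sequence, given the probability the model
    assigned to each observed token (hypothetical helper)."""
    # Cross-entropy H in bits: average negative log2 probability.
    h = -sum(math.log2(p) for p in probs) / len(probs)
    # PP = 2^H
    return 2 ** h

# Uniform guessing over a 4-item vocabulary gives perplexity 4,
# matching the "V choices with no information" upper bound.
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # 4.0
```

A perfectly certain model (every probability $1$) gives $H = 0$ and perplexity $1$, matching the lower bound stated above.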