# How hard is NP-hard?

Assuming P${\neq}$ NP, an NP-hard problem cannot be solved in polynomial time. That is, no algorithm can compute the correct output in polynomial time for *all* possible inputs. However, NP-hardness does not rule out an efficient algorithm that works on only a subset of the inputs. For example, for any problem there is a constant-time algorithm that solves it on any fixed, finite set of instances: the algorithm simply stores the answers for those inputs in a lookup table.
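As a toy illustration of the lookup-table idea (the instance names and their stored answers below are hand-picked for the example, not produced by any solver):

```python
# Toy "constant-time algorithm" for an NP-hard problem (Hamiltonicity),
# valid only on a fixed finite set of instances: every answer is
# precomputed and stored in a table, so each query is one dict lookup.

# Hypothetical instance names mapped to precomputed answers.
PRECOMPUTED = {
    "triangle": True,             # K3 has a Hamiltonian cycle
    "path_on_3_vertices": False,  # P3 does not
    "complete_on_4": True,        # K4 does
}

def is_hamiltonian_constant_time(instance_name: str) -> bool:
    """O(1) lookup; undefined outside the finitely many stored inputs."""
    return PRECOMPUTED[instance_name]
```

The "algorithm" is trivially constant time, but it says nothing about inputs outside its table, which is exactly why the question of how *large* a set of inputs can be handled efficiently is the interesting one.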

But this is an extreme case. Can an algorithm for an NP-hard problem run in polynomial time on a significantly larger set of inputs? More precisely, does there exist an algorithm that, for every ${n}$, solves the problem in polynomial time on ${f(n)}$ of the inputs of size ${n}$, where ${f(n)}$ is some fast-growing function of ${n}$?

How fast-growing should ${f(n)}$ be? One interesting choice is an ${f(n)}$ that makes the average-case complexity of the algorithm polynomial, i.e., if inputs of size ${n}$ are chosen uniformly at random from the set of all inputs, then the algorithm takes polynomial time in expectation. Of course, ${f(n)}$ will have to be fairly large for this to happen.
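To see the kind of arithmetic involved (the numbers here are illustrative, not taken from any particular result): suppose an algorithm runs in time ${n^c}$ on all but a ${4^{-n}}$ fraction of the inputs of size ${n}$, and falls back to a ${2^n n^c}$-time exhaustive search on the rest. Then, over a uniformly random input,

$$\mathbb{E}[T(n)] \;\le\; n^c \;+\; 4^{-n}\cdot 2^n n^c \;=\; n^c\left(1 + 2^{-n}\right) \;=\; O(n^c),$$

so the expected running time is polynomial even though the worst case is exponential: the rare slow inputs are rare enough that they contribute almost nothing to the expectation.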

The interesting fact is that this is possible. For example, if you pick a graph on ${n}$ vertices uniformly at random from all graphs on ${n}$ vertices, then there exists an algorithm that decides whether the graph is Hamiltonian in expected polynomial time.
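This is a deep result (algorithms of this kind are due to Gurevich and Shelah, and to Bollobás, Fenner, and Frieze), and the actual algorithms are involved. The general strategy, though, follows the pattern above: run a polynomial-time heuristic that succeeds on all but an exponentially small fraction of random graphs, and fall back to exhaustive search on the rare failures. Below is a minimal sketch of that strategy, not the real algorithm; the rotation heuristic, the step budget, and the brute-force fallback are all simplified stand-ins:

```python
import itertools
import random

def hamiltonian_expected_poly(adj):
    """Decide Hamiltonicity of a graph given as {vertex: set_of_neighbours}.
    Sketch of the heuristic-plus-fallback strategy: a polynomial-time
    attempt first, an exponential exhaustive search only on failure."""
    if _rotation_heuristic(adj) is not None:
        return True
    # Fallback: exponential search, needed only when the heuristic fails
    # (for random graphs, an exponentially unlikely event).
    return _brute_force(adj)

def _rotation_heuristic(adj, max_steps=None):
    """Grow a path; when stuck, 'rotate' it to expose a new endpoint.
    Runs within a polynomial step budget (the budget is a heuristic choice)."""
    n = len(adj)
    if n == 0:
        return None
    if max_steps is None:
        max_steps = 4 * n * n
    rng = random.Random(0)          # seeded for reproducibility
    verts = list(adj)
    path, on_path = [verts[0]], {verts[0]}
    for _ in range(max_steps):
        end = path[-1]
        fresh = [u for u in adj[end] if u not in on_path]
        if fresh:                   # extend the path by a new vertex
            u = rng.choice(fresh)
            path.append(u)
            on_path.add(u)
        else:                       # rotate: reverse the tail after a
            nbrs = [i for i, v in enumerate(path[:-1]) if v in adj[end]]
            if not nbrs:            # neighbour of the current endpoint
                return None
            i = rng.choice(nbrs)
            path[i + 1:] = reversed(path[i + 1:])
        if len(path) == n and path[0] in adj[path[-1]]:
            return path             # Hamiltonian cycle found
    return None

def _brute_force(adj):
    """Exhaustive check over all vertex orderings (exponential time)."""
    verts = list(adj)
    first, rest = verts[0], verts[1:]
    for perm in itertools.permutations(rest):
        cyc = [first, *perm]
        if (all(cyc[i + 1] in adj[cyc[i]] for i in range(len(cyc) - 1))
                and cyc[0] in adj[cyc[-1]]):
            return True
    return False
```

On a uniformly random graph the heuristic phase succeeds with overwhelming probability, so the exponential fallback contributes negligibly to the expected running time; making that trade-off rigorous is exactly what the real expected-polynomial-time algorithms do.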