Let $\lambda$ be the Lebesgue measure on $\mathbb{R}^d$ and let $B$ be a family of Lebesgue-measurable subsets of $[0,1)^d$. Let $P = \{x_1, \ldots, x_N\}$ be a finite set of points in $[0,1)^d$. The discrepancy of $P$ with respect to $B$ is defined by
$$D_N(B, P) = \sup_{b \in B} \left| \frac{\#(P \cap b)}{N} - \lambda(b) \right|,$$
where $\#(P \cap b)$ counts the points of $P$ that fall in $b$.
A low-discrepancy sequence is a sequence $(x_n)$ in $[0,1)^d$ whose initial segments $\{x_1, \ldots, x_N\}$ have a low discrepancy for all $N$.
Most often, the domain is $[0,1)^d$ and the sets in $B$ are restricted to the form $\prod_{k=1}^{d} [0, u_k)$, i.e. axis-aligned boxes anchored at the origin. The resulting discrepancy $D_N^*$ is called the star discrepancy.
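As a concrete illustration, here is a minimal brute-force computation of the star discrepancy (the function name `star_discrepancy` and the candidate-corner enumeration are illustrative choices, not from the text; enumerating corners costs $O(N^d)$ box evaluations, so this is only practical for small point sets):

```python
import itertools

def star_discrepancy(points):
    """Star discrepancy of a point set in [0,1)^d by brute force.

    Evaluates the local discrepancy at every corner u built from the
    point coordinates (plus 1.0), counting points both strictly inside
    and on the boundary of the box [0, u), which captures the supremum
    over anchored boxes for these small examples.
    """
    n = len(points)
    d = len(points[0])
    # Candidate corner coordinates per axis: the point coords and 1.0.
    axes = [sorted({p[k] for p in points} | {1.0}) for k in range(d)]
    worst = 0.0
    for u in itertools.product(*axes):
        vol = 1.0
        for uk in u:
            vol *= uk                      # volume of the box [0, u)
        inside = sum(all(p[k] < u[k] for k in range(d)) for p in points)
        closed = sum(all(p[k] <= u[k] for k in range(d)) for p in points)
        worst = max(worst, abs(inside / n - vol), abs(closed / n - vol))
    return worst

# The 1-D grid {0, 1/4, 2/4, 3/4} has star discrepancy 1/4.
grid = [(i / 4,) for i in range(4)]
print(star_discrepancy(grid))  # → 0.25
```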
It is conjectured that for any finite sequence of length $N$ the star discrepancy has a lower bound $D_N^* \geq C_d \, (\ln N)^{d-1} / N$, with the constant $C_d$ depending only on the dimension $d$ of the space. This conjecture has been proved for $d \leq 2$, but is an open problem for $d \geq 3$.
It can be proved (the Koksma–Hlawka inequality) that the error of integrating a function of bounded variation by averaging its values at the sample points is bounded by the product of the function's variation (in the sense of Hardy and Krause) and the star discrepancy of the samples. This makes low-discrepancy sequences an important tool in applications requiring Monte-Carlo integration.
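A small sketch of this effect, using the classic Halton sequence (bases 2 and 3) as the low-discrepancy sequence and the test integrand $f(x,y) = xy$, whose exact integral over the unit square is $1/4$ (the helper names and sample counts are illustrative choices):

```python
import random

def van_der_corput(n, base):
    """n-th element of the van der Corput sequence in the given base:
    the base-b digits of n mirrored about the radix point."""
    x, denom = 0.0, 1.0
    while n:
        n, digit = divmod(n, base)
        denom *= base
        x += digit / denom
    return x

def halton(n, bases=(2, 3)):
    """n-th point of the 2-D Halton sequence, a classic low-discrepancy sequence."""
    return tuple(van_der_corput(n, b) for b in bases)

f = lambda x, y: x * y              # exact integral over [0,1]^2 is 1/4
N = 4096

# Quasi-Monte Carlo: average f over the first N Halton points.
qmc = sum(f(*halton(i)) for i in range(1, N + 1)) / N

# Plain Monte Carlo with pseudo-random points, for comparison.
random.seed(0)
mc = sum(f(random.random(), random.random()) for _ in range(N)) / N

print("QMC error:", abs(qmc - 0.25))
print("MC  error:", abs(mc - 0.25))
```

With the same number of samples, the quasi-Monte Carlo estimate is typically far closer to the true value than the pseudo-random one, in line with the discrepancy-based error bound.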
The role of low-discrepancy sequences is to replace random numbers in certain tasks. Any finite prefix of a low-discrepancy sequence has the desirable property of not forming clusters of points (unlike random numbers): it fills a subset (most often a box) with evenly distributed points. Apart from reducing errors in Monte-Carlo integration, low-discrepancy sequences are also useful in other situations where one needs to cover an area evenly with points.
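The absence of clusters can be observed by comparing the minimum pairwise distance among Halton points with that among uniformly random points (a sketch; the helper names and the choice of 64 points are illustrative):

```python
import itertools
import random

def van_der_corput(n, base):
    """n-th element of the van der Corput sequence in the given base."""
    x, denom = 0.0, 1.0
    while n:
        n, digit = divmod(n, base)
        denom *= base
        x += digit / denom
    return x

def halton(n):
    """n-th point of the 2-D Halton sequence (bases 2 and 3)."""
    return (van_der_corput(n, 2), van_der_corput(n, 3))

def min_pairwise_dist(pts):
    """Smallest Euclidean distance between any two points of the set."""
    return min(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
               for a, b in itertools.combinations(pts, 2))

N = 64
low_disc = [halton(i) for i in range(1, N + 1)]
random.seed(1)
rand = [(random.random(), random.random()) for _ in range(N)]

# Halton points keep their distance (no clusters); random points
# typically contain at least one much closer pair.
print("Halton min distance:", min_pairwise_dist(low_disc))
print("random min distance:", min_pairwise_dist(rand))
```

For the first 64 Halton points the base-2 coordinates are distinct multiples of 1/64, so no two points can be closer than 1/64, whereas random points routinely land almost on top of each other.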
One should note that in one dimension a uniform grid of points attains the lowest possible star discrepancy for a given number of samples. However, there are several problems with using uniform grids. First, the number of samples needed to form a grid grows exponentially with the dimension: for example, with 100 points per side, a grid in 4-dimensional space requires 100^4 = 10^8 points. This is one instance of the curse of dimensionality. Second, the points cannot be generated incrementally; refining the grid requires regenerating the whole set. Third, the total number of points must be a product of the per-side sample counts, so it cannot be chosen freely.
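The grid-size arithmetic above is easy to check:

```python
# Number of grid samples explodes with dimension at 100 points per side.
for d in range(1, 5):
    print(d, 100 ** d)
# dimension 4 already needs 100_000_000 samples
```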