Title | AN IMPROVED BOUND FOR LEARNING LIPSCHITZ FUNCTIONS |
Publication Type | Journal Article |
Year of Publication | 2006 |
Authors | Cooper, DA |
Volume | 10 |
Issue | 1 |
Start Page | 19 |
Pagination | 10 |
Date Published | 2006 |
ISSN | 1083-2564 |
AMS | 26A16, 68T05 |
Abstract | Revisited here is the problem of learning a nonlinear mapping with uncountable domain and range. The learning model used is that of piecewise linear interpolation on random samples from the domain. More specifically, a network learns a function by approximating its value, typically within some small $\epsilon$, when presented an arbitrary element of the domain. For reliable learning, the network should accurately return the function's value with high probability, typically higher than $1 - \delta$ for some small $\delta$. With new analytic results, we show that, given $\epsilon$ and $\delta$ and an arbitrary Lipschitz function $f : [0, 1]^k \to \mathbb{R}$, $$m \ge (2M\sqrt{k})^k \cdot \left(\frac{1}{\epsilon}\right)^k \cdot \left(k \ln 2M\sqrt{k} + k \ln \frac{1}{\epsilon} + \ln \frac{1}{\delta}\right)$$ samples from the uniform distribution on $[0, 1]^k$ are sufficient to reliably learn $f$. The Delaunay triangulation technique is applied in order to keep simplex sizes small. The sufficient condition is an improvement of approximately $\left(\frac{2}{3}\right)^k$ over the previously known upper bound.
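The sample-complexity bound stated in the abstract can be evaluated numerically. Below is a minimal sketch (not from the paper itself) that computes the sufficient sample size $m$ for given $\epsilon$, $\delta$, dimension $k$, and a constant $M$ (presumably the Lipschitz bound on $f$); the function name and interface are illustrative assumptions.

```python
import math

def sufficient_samples(M: float, k: int, eps: float, delta: float) -> int:
    """Evaluate the abstract's bound:
    m >= (2M*sqrt(k))^k * (1/eps)^k * (k ln(2M*sqrt(k)) + k ln(1/eps) + ln(1/delta)).
    Returns the smallest integer m satisfying the bound.
    """
    scale = (2 * M * math.sqrt(k)) ** k * (1 / eps) ** k
    log_term = (k * math.log(2 * M * math.sqrt(k))
                + k * math.log(1 / eps)
                + math.log(1 / delta))
    return math.ceil(scale * log_term)

# Example: a 1-Lipschitz function on [0, 1], learned to within 0.1
# with failure probability at most 0.1.
print(sufficient_samples(M=1, k=1, eps=0.1, delta=0.1))
```

Note the $(1/\epsilon)^k$ factor: as with the previously known bound, the required sample size still grows exponentially in the dimension $k$; the paper's contribution is the roughly $(2/3)^k$ improvement in the constant.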
URL | http://www.acadsol.eu/en/articles/10/1/4.pdf |
Refereed Designation | Refereed |