Polynomial Bounds for VC Dimension of Sigmoidal and General Pfaffian Neural Networks

Report ID: TR-526-96
Date: September 1996
Pages: 16

Abstract:

We introduce a new method for proving explicit upper bounds on the VC Dimension of
general functional basis networks, and prove as an application, for the first time,
that the VC Dimension of analog neural networks with the sigmoidal activation
function $\sigma(y)=1/(1+e^{-y})$ is bounded by a quadratic polynomial $O((lm)^2)$
in both the number $l$ of programmable parameters and the number $m$ of nodes.
The proof method of this paper generalizes to a much wider class of Pfaffian
activation functions and formulas, and also gives, for the first time, polynomial
bounds on their VC Dimension. We also present some other applications of our method.
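
As a minimal illustrative sketch (not taken from the report), the sigmoidal activation
and the growth of the stated quadratic bound $O((lm)^2)$ can be written in Python as
follows; the names sigmoid, vc_bound_scale, l, and m are our own illustrative choices,
not notation or code from the paper.

    import math

    def sigmoid(y: float) -> float:
        """Standard sigmoidal activation sigma(y) = 1 / (1 + e^{-y})."""
        return 1.0 / (1.0 + math.exp(-y))

    def vc_bound_scale(l: int, m: int) -> int:
        """Growth of the quadratic bound O((l*m)^2), up to a constant factor,
        for a hypothetical network with l programmable parameters and m nodes."""
        return (l * m) ** 2

    if __name__ == "__main__":
        print(sigmoid(0.0))             # 0.5
        print(vc_bound_scale(100, 10))  # (100 * 10)^2 = 1000000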
