Probabilistic neural network with reflected kernels
1 September 1993
George W. Rogers, Carey E. Priebe, Jeffrey L. Solka
Abstract
Probabilistic neural networks (PNNs) build internal density representations based on the kernel (Parzen) estimator and use Bayesian decision theory to build up arbitrarily complex decision boundaries. As with the classical kernel estimator, training is performed in a single pass over the data and asymptotic convergence is guaranteed. Asymptotic convergence, while necessary, says little about finite-sample estimation errors, which can be quite large. One problem with either the kernel estimator or the PNN arises when one or more of the densities being estimated has a discontinuity. This commonly leads to an expected L∞ error in the estimated pdf on the order of the size of the discontinuity, which can in turn lead to significant classification errors. By using the method of reflected kernels, we have developed a PNN model that does not suffer from this problem. The theory of reflected-kernel PNNs, along with their relation to reflected-kernel Parzen estimators, is presented together with finite-sample examples.
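The boundary problem and the reflection fix described in the abstract are easy to see in one dimension. Below is a minimal sketch in Python/NumPy, assuming a density supported on [0, ∞) with a jump at the origin (a unit exponential), a Gaussian kernel, and reflection about the known support boundary; the function names, bandwidth, and sample size are illustrative and not taken from the paper.

```python
import numpy as np

def parzen(x, samples, sigma):
    """Standard Parzen (kernel) density estimate with Gaussian kernels."""
    z = (x[:, None] - samples[None, :]) / sigma
    norm = len(samples) * sigma * np.sqrt(2.0 * np.pi)
    return np.exp(-0.5 * z**2).sum(axis=1) / norm

def parzen_reflected(x, samples, sigma, boundary=0.0):
    """Reflected-kernel Parzen estimate for a density on [boundary, inf).

    Each kernel centered at a sample x_i is paired with a mirror kernel
    centered at 2*boundary - x_i, so the probability mass that a plain
    kernel would spill past the boundary is folded back into the support.
    """
    mirrored = 2.0 * boundary - samples
    z1 = (x[:, None] - samples[None, :]) / sigma
    z2 = (x[:, None] - mirrored[None, :]) / sigma
    norm = len(samples) * sigma * np.sqrt(2.0 * np.pi)
    est = (np.exp(-0.5 * z1**2) + np.exp(-0.5 * z2**2)).sum(axis=1) / norm
    return np.where(x >= boundary, est, 0.0)

# A unit exponential has pdf value 1 at x = 0, i.e. a discontinuity of
# height 1 at the support boundary.
rng = np.random.default_rng(0)
samples = rng.exponential(1.0, size=500)
x = np.array([0.0])
print(parzen(x, samples, sigma=0.2))            # roughly 0.5: half the mass leaks out
print(parzen_reflected(x, samples, sigma=0.2))  # roughly 1.0: mass folded back
```

At the boundary the plain estimate converges to about half the true density value, an L∞ error on the order of the jump height, exactly the effect the abstract describes, while the reflected estimate recovers the full value. In a PNN, each class-conditional density would be estimated this way and the classification would follow from the Bayes rule over the class estimates.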
© 1993 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
George W. Rogers, Carey E. Priebe, and Jeffrey L. Solka "Probabilistic neural network with reflected kernels", Proc. SPIE 1962, Adaptive and Learning Systems II, (1 September 1993); https://doi.org/10.1117/12.150591
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Error analysis
Process modeling
Neural networks
Statistical analysis
Systems modeling
Resistors
Reverse modeling
