Paper
Differential theory of learning for efficient neural network pattern recognition
19 August 1993
John B. Hampshire II, Bhagavatula Vijaya Kumar
Abstract
We describe a new theory of differential learning by which a broad family of pattern classifiers (including many well-known neural network paradigms) can learn stochastic concepts efficiently. We describe the relationship between a classifier's ability to generalize well to unseen test examples and the efficiency of the strategy by which it learns. We list a series of proofs that differential learning is efficient in its information and computational resource requirements, whereas traditional probabilistic learning strategies are not. The proofs are illustrated by a simple example that lends itself to closed-form analysis. We conclude with an optical character recognition task for which three different types of differentially generated classifiers generalize significantly better than their probabilistically generated counterparts.
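The abstract itself does not spell out either objective, but the contrast it draws can be made concrete. The sketch below, in Python, compares a conventional probabilistic objective (mean-squared error against one-hot targets) with a differential objective modeled on Hampshire and Waibel's classification figure of merit: a smooth monotonic function of the margin between the correct-class output and its strongest competitor. The specific sigmoid form and the steepness parameter `alpha` are illustrative assumptions, not details taken from this paper.

```python
import numpy as np

def probabilistic_loss(outputs, correct):
    """MSE against a one-hot target: penalizes every output that is
    not driven toward 0 or 1, regardless of the classification margin."""
    target = np.zeros_like(outputs)
    target[correct] = 1.0
    return np.mean((outputs - target) ** 2)

def differential_objective(outputs, correct, alpha=10.0):
    """CFM-style objective (hypothetical form): a sigmoid of the margin
    between the correct-class output and the largest competing output.
    Only the *difference* matters, not the absolute output values."""
    competitors = np.delete(outputs, correct)
    margin = outputs[correct] - np.max(competitors)
    return 1.0 / (1.0 + np.exp(-alpha * margin))

# Hypothetical classifier outputs: class 0 is correct, and it already
# wins by a small margin over class 1.
outputs = np.array([0.55, 0.50, 0.05])
print(probabilistic_loss(outputs, correct=0))      # still large: outputs far from {0, 1}
print(differential_objective(outputs, correct=0))  # near 1: the margin is positive
```

The example shows why the differential strategy can be the more resource-efficient of the two: it declares this example solved as soon as the margin is positive, whereas the probabilistic loss keeps demanding extreme output values that are unnecessary for correct classification.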
© (1993) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
John B. Hampshire II and Bhagavatula Vijaya Kumar "Differential theory of learning for efficient neural network pattern recognition", Proc. SPIE 1966, Science of Artificial Neural Networks II, (19 August 1993); https://doi.org/10.1117/12.152617
CITATIONS
Cited by 7 scholarly publications.
KEYWORDS
Error analysis, Silicon, Information operations, Artificial neural networks, Neural networks, Pattern recognition, Optical character recognition
