Paper
Model for selectively increasing learning sample number in character recognition
Norihiro Hagita, Minako Sawaki, Ken'ichiro Ishii
Proceedings Volume 2660, Document Recognition III (1996); https://doi.org/10.1117/12.234705
Event: Electronic Imaging: Science and Technology, 1996, San Jose, CA, United States
Published: 7 March 1996
Abstract
Increasing the sample size plays an important role in improving recognition accuracy. When it is difficult to collect additional character data written by new writers, distorted characters artificially generated from the original characters by a distortion model can serve as the additional data. This paper proposes a model for selecting those distorted characters that improve recognition accuracy. Binary images are used as the feature vector. In the experiments, recognition based on the k-nearest-neighbor rule is performed on the handwritten ZIP code database IPTP CD-ROM1. Distorted characters are generated using a new model of nonlinear geometrical distortion. New learning samples consisting of the original characters and the distorted ones are generated iteratively. Within this model, the range of the distortion parameter that yields improved recognition accuracy is investigated. The results show that the iterative addition of slightly distorted characters improves recognition accuracy.
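The augmentation idea in the abstract can be sketched in code. The paper's actual nonlinear geometrical distortion model is not specified here, so the sketch below substitutes a simple low-frequency sinusoidal coordinate warp as a stand-in, applies it to binary character images, and classifies with a k-nearest-neighbor rule using Hamming distance on the binarized pixel vectors. All function names and parameters (`distort`, `knn_predict`, the amplitude `amp`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def distort(img, amp, rng):
    """Apply a smooth nonlinear geometric distortion to a binary image.

    Stand-in for the paper's distortion model: each output pixel (y, x)
    is sampled from (y + dy, x + dx), where (dy, dx) follow a
    low-frequency sinusoidal displacement field with amplitude `amp`
    (in pixels) and random phases, so each call yields a new variant.
    """
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    py, px = rng.uniform(0.0, 2.0 * np.pi, size=2)  # random phases
    dy = amp * np.sin(2.0 * np.pi * xs / w + py)
    dx = amp * np.sin(2.0 * np.pi * ys / h + px)
    sy = np.clip(np.round(ys + dy).astype(int), 0, h - 1)
    sx = np.clip(np.round(xs + dx).astype(int), 0, w - 1)
    return img[sy, sx]

def knn_predict(x, train_X, train_y, k=1):
    """k-nearest-neighbor vote using Hamming distance on binary vectors."""
    dists = np.count_nonzero(train_X != x.ravel(), axis=1)
    votes = train_y[np.argsort(dists)[:k]]
    return int(np.bincount(votes).argmax())

def augment(train_X, train_y, shape, amp, copies, rng):
    """Iteratively grow the learning set with slightly distorted copies,
    mirroring the paper's idea of adding distorted samples to the
    original ones (here without the selection step)."""
    new_X, new_y = [train_X], [train_y]
    for img, label in zip(train_X, train_y):
        for _ in range(copies):
            d = distort(img.reshape(shape), amp, rng)
            new_X.append(d.ravel()[None, :])
            new_y.append(np.array([label]))
    return np.vstack(new_X), np.concatenate(new_y)
```

With `amp = 0` the warp is the identity, so small amplitudes correspond to the "slightly distorted" characters the paper finds helpful; large amplitudes would push samples away from the true class distribution, which is why the paper investigates the usable parameter range.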
© (1996) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Norihiro Hagita, Minako Sawaki, and Ken'ichiro Ishii "Model for selectively increasing learning sample number in character recognition", Proc. SPIE 2660, Document Recognition III, (7 March 1996); https://doi.org/10.1117/12.234705
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS: Distortion, Binary data, Error analysis, Statistical modeling, Data modeling, Databases, Optical character recognition