Please use this identifier to cite or link to this item:
http://dspace.cas.upm.edu.ph:8080/xmlui/handle/123456789/435
Title: | Tess2Speech: An Intelligent Character Recognition-To-Speech Application for Android Using Google's Tesseract Optical Character Recognition Engine |
Authors: | Baes, Gregorio B.; Custodio, Anter Aaron |
Keywords: | Android; training; handwritten texts; Tesseract; optical character recognition; computer printed texts; intelligent character recognition; free and non-proprietary; speech-impaired |
Issue Date: | Jun-2016 |
Abstract: | Tess2Speech is an Android mobile application that recognizes handwritten text and optionally converts it to speech. By training Google’s Tesseract, a free, open-source Optical Character Recognition (OCR) engine, to recognize handwritten text, I created an alternative to Intelligent Character Recognition (ICR) engines, which are proprietary and expensive. The main purpose of an OCR engine is to convert scanned documents containing computer-printed text into editable, machine-encoded text. Extending an OCR engine into an ICR engine that recognizes handwritten text, and integrating it into a mobile application, greatly improves its usability: it can help speech-impaired people communicate, help children learn the proper pronunciation of the words they write, and more. The result is a free and non-proprietary handwriting-to-speech application that is accessible to everyone and useful to anyone looking for a free, portable, and reliable alternative to ICR, OCR, and other expensive communication devices. Tess2Speech also includes a user-friendly Tesseract trainer desktop application that lets users personalize Tess2Speech for their own handwriting. |
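The abstract describes a recognize-then-speak pipeline: a Tesseract engine trained on handwriting extracts text from an image, and Android text-to-speech reads it aloud. The Java sketch below illustrates that flow, assuming the tess-two Android wrapper (com.googlecode.tesseract.android.TessBaseAPI) and a custom traineddata file named "handwriting" produced by a trainer tool; the class, file, and parameter names are illustrative assumptions, not taken from the thesis itself.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.speech.tts.TextToSpeech;
import com.googlecode.tesseract.android.TessBaseAPI;

import java.util.Locale;

public class HandwritingToSpeech {
    private final TessBaseAPI ocr = new TessBaseAPI();
    private final TextToSpeech tts;

    public HandwritingToSpeech(Context context, String tessDataParentDir) {
        // tessDataParentDir must contain a "tessdata" folder holding the
        // handwriting.traineddata file (hypothetical name) from the trainer.
        ocr.init(tessDataParentDir, "handwriting");
        tts = new TextToSpeech(context, status -> {
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US);
            }
        });
    }

    /** Recognize handwritten text in a bitmap and optionally speak it aloud. */
    public String recognizeAndSpeak(Bitmap page, boolean speak) {
        ocr.setImage(page);
        String text = ocr.getUTF8Text();
        if (speak && text != null && !text.isEmpty()) {
            tts.speak(text, TextToSpeech.QUEUE_FLUSH, null, "tess2speech-utterance");
        }
        return text;
    }

    public void release() {
        ocr.end();
        tts.shutdown();
    }
}
```

This is only a minimal sketch of the approach the abstract outlines; the actual application in Tess2Speech.pdf may structure the OCR and text-to-speech components differently.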
URI: | http://dspace.cas.upm.edu.ph:8080/jspui/handle/123456789/435 |
Appears in Collections: | Computer Science SP |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Tess2Speech.pdf | | 10.16 MB | Adobe PDF | View/Open |