Cash Recognition for the Visually Impaired

January 2, 2018

On January 2, 2018, Developer Sessions, an AI community of AID, launched the first public prototype of the Cash Recognition for the Visually Impaired (CRVI) project. CRVI uses a deep learning model on the backend and a smartphone camera to capture and recognize the value of Nepalese banknotes.

Nepalese banknotes carry no special features that let visually impaired people distinguish notes of different values, so they face difficulties in day-to-day monetary transactions that sighted people do not. As an initiative for Nepal's visually impaired community, this app helps them recognize notes without hassle: by hovering a smartphone over a note, the user has the app recognize it and play an audio clip announcing its value. The app will be bilingual, offering Nepali and English audio playback for a better user experience.
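The recognition-to-audio flow described above (classify a camera frame, then play the matching audio clip in the chosen language) can be sketched roughly as follows. Note that the class labels, confidence threshold, and audio file layout here are illustrative assumptions, not the project's actual code; the real model and training data are in the GitHub repository linked below.

```python
# Hypothetical label order for the classifier's output; the real model's
# classes may differ.
NOTE_CLASSES = ["5", "10", "20", "50", "100", "500", "1000"]

def pick_note(probabilities, threshold=0.6):
    """Return the predicted note value from class probabilities,
    or None if the model is not confident enough."""
    best = max(range(len(probabilities)), key=lambda i: probabilities[i])
    if probabilities[best] < threshold:
        return None  # low confidence: prompt the user to re-point the camera
    return NOTE_CLASSES[best]

def audio_clip(note, language="ne"):
    """Map a recognized note to a (hypothetical) audio file path,
    with 'ne' (Nepali) or 'en' (English) playback."""
    return f"audio/{language}/rs_{note}.mp3"
```

For example, `pick_note([0.0, 0.0, 0.95, 0.05, 0.0, 0.0, 0.0])` returns `"20"`, and `audio_clip("20", "en")` gives the path of the English clip to play.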

Intel Blog Article: https://software.intel.com/en-us/blogs/2017/11/21/cash-recognition-for-the-visually-impaired-using-deep-learning

Second blog update: https://software.intel.com/en-us/blogs/2018/12/04/cash-recognition-for-the-visually-impaired

Third and final blog update: https://software.intel.com/en-us/blogs/2019/01/22/cash-recognition-for-visually-impaired-part-3

Intel’s published success story on the project: https://software.intel.com/en-us/articles/success-story-using-ai-to-help-visually-impaired-people-identify-cash

Article on practical lessons learned while implementing it: https://medium.com/deep-learning-journals/practical-lessons-learned-while-implementing-image-classifier-6dc39c6efd7e

GitHub repository for the final prototype, with source code, training data, and the pre-trained model:

https://github.com/kshitizrimal/Cash-Recog-Project

We value your feedback

Please reach out to us and let us know what you think about our initiative.