Pen technology has matured over the years, with improved ink-delivery systems that make writing more comfortable. Yet no ordinary pen can capture handwritten characters on paper and convert them to a digital format. One commercial product, LiveScribe, can capture drawing or writing on 'special papers' and upload the sketch as-is to the cloud, but it does not convert the text to a digital format and does not support writing on ordinary paper.
'DigiPen' solves this problem efficiently and elegantly. It introduces a new concept of a smart input device without replacing traditional paper writing. This is made possible by recent rapid advances in the machine-learning technique known as 'Deep Learning'.
A 'DigiPen' is also a traditional pen that writes on any paper surface, with one significant improvement -- it is electronic and, more importantly, it is smart! There are two 'DigiPen' concepts:
1. 'DigiPen Basic,' which pairs with a companion app/software over Bluetooth and relies on it for the deep-learning recognition.
2. 'DigiPen Live,' a standalone Wi-Fi-enabled system that works without a companion app/software and records handwritten data to the cloud or onboard memory in digital text format.
At first glance, the 'DigiPen' looks like a regular pen that can be used to write on paper. On closer inspection, it reveals itself as an electronic gadget that does much more than write. Improved liquid ink is stored in a replaceable ink-tube cartridge with a removable ballpoint/pinpoint nib. Besides delivering ink, the nib also acts as a tactile switch that initiates writing capture.
The electronics unit sits in the tail section and contains an onboard microcontroller with a 192-node neural-network processing engine, a 10-DOF IMU sensor module, and a Bluetooth/Wi-Fi communication module. A tiny lithium-polymer battery powers the device. An optional one-line display can provide additional user-interaction information or character suggestions.
The onboard 10-DOF inertial measurement unit captures handwriting gestures as the user writes on any surface. The sensor data is transmitted wirelessly to a smartphone or PC in real time. The companion app/software then converts the gesture information into digital text that can serve as input to any word-processing application or text-input system. The real trick behind the technology is a state-of-the-art deep-learning algorithm that classifies handwritten characters from inertial measurement data with up to 99.2% accuracy. A deep convolutional neural network (CNN) makes it possible to recognize handwriting gestures onboard or in the companion app from minimal end-user training data.
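To illustrate how a CNN processes an IMU stream (a sketch, not the actual 'DigiPen' firmware), the core building block is a 1-D convolution that slides learned filters over windows of multi-axis sensor samples. The NumPy code below shows one such layer; all shapes, sizes, and names are illustrative assumptions.

```python
import numpy as np

def conv1d(signal, kernels, bias):
    """Valid 1-D convolution over a multi-channel IMU window.

    signal:  (timesteps, channels)      e.g. 128 samples x 6 IMU axes
    kernels: (filters, width, channels) learned filters (assumed shapes)
    bias:    (filters,)
    returns: (timesteps - width + 1, filters) ReLU feature map
    """
    steps = signal.shape[0] - kernels.shape[1] + 1
    out = np.empty((steps, kernels.shape[0]))
    for t in range(steps):
        window = signal[t:t + kernels.shape[1]]  # (width, channels)
        # Contract filter width and channel axes against the window.
        out[t] = np.tensordot(kernels, window, axes=([1, 2], [0, 1])) + bias
    return np.maximum(out, 0.0)  # ReLU activation

# Toy example: a 128-sample window of 6-axis IMU data, 8 filters of width 5.
rng = np.random.default_rng(0)
x = rng.standard_normal((128, 6))
w = rng.standard_normal((8, 5, 6)) * 0.1
b = np.zeros(8)
features = conv1d(x, w, b)
print(features.shape)  # (124, 8)
```

In a full recognizer, several such layers would be stacked and followed by a small classifier head that outputs one of the character classes.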
To be introduced as a commercial product, the software should be optimized to require minimal calibration effort from the user. A user might buy the 'DigiPen' and write the phrase 'A QUICK BROWN FOX JUMPS OVER THE LAZY DOG' in both capital and small letters to calibrate it to their own writing style. Simple yet powerful!
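One lightweight way such calibration could work (an assumption on our part, not a documented mechanism) is to estimate per-user, per-axis statistics from the pangram strokes and normalize all subsequent IMU windows before they reach the classifier:

```python
import numpy as np

def fit_calibration(pangram_windows):
    """Estimate per-axis mean/std from the user's calibration strokes.

    pangram_windows: list of (timesteps, 6) IMU arrays recorded while
    the user writes the calibration phrase (shapes are assumptions).
    """
    stacked = np.concatenate(pangram_windows, axis=0)
    mean = stacked.mean(axis=0)
    std = stacked.std(axis=0) + 1e-8  # avoid division by zero
    return mean, std

def normalize(window, mean, std):
    """Map a raw IMU window into the user's calibrated feature space."""
    return (window - mean) / std

# Simulated calibration: 26 strokes of 64 samples x 6 axes each.
rng = np.random.default_rng(1)
samples = [rng.standard_normal((64, 6)) * 3 + 5 for _ in range(26)]
mean, std = fit_calibration(samples)
calibrated = normalize(samples[0], mean, std)
```

After this one-time fit, `normalize` would be applied to every captured stroke, so the classifier always sees data in a consistent range regardless of how heavily or lightly a particular user writes.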
Aside from handwriting recognition, the pen can also accept gesture commands, either preset or user-calibrated. For example, a user might define an interaction in which drawing a box around a word phrase saves it as a title -- and much more!
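Conceptually, such gesture commands amount to a table mapping recognized gestures to document actions. The sketch below shows one possible shape for that table; the gesture names and actions are hypothetical, chosen only to mirror the box-around-a-phrase example above.

```python
def apply_gesture(gesture, document):
    """Apply a recognized gesture command to a document state (a dict).

    Gesture names and actions here are illustrative assumptions,
    not a documented 'DigiPen' command set.
    """
    actions = {
        "box_around_phrase": lambda d: {**d, "title": d["last_phrase"]},
        "double_underline":  lambda d: {**d, "emphasis": True},
        "strike_through":    lambda d: {**d, "deleted": True},
    }
    # Unknown gestures leave the document unchanged.
    return actions.get(gesture, lambda d: d)(document)

doc = {"last_phrase": "Meeting Notes"}
doc = apply_gesture("box_around_phrase", doc)
print(doc["title"])  # Meeting Notes
```

A user-calibrated gesture would simply add a new entry to this table after being learned from a few demonstration strokes.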
ABOUT THE ENTRANT
Name: Abu Shuvom
Type of entry: individual
Abu Anas is inspired by:
Recent rapid advancements in machine learning, especially deep-learning algorithms, have had a profound impact on solving a wide range of problems with new approaches. Many older recognition problems can now be solved with larger datasets and reduced data dimensionality. Emerging deep-learning algorithms can now outperform human experts in accuracy and reliability. With the power of GPU computing, neural networks can be trained faster than ever before. Many emerging companies are now working on dedicated neural-computing chips specially optimized for deep learning in terms of performance and power consumption. Moreover, with these advances it is now possible to predict accurate results from sensor-data combinations that previously seemed hard or impossible to use. Together, these recent hardware and software advancements enabled us to develop and rapidly prototype such a complex system with acceptable accuracy.
Software used for this entry:
Patent status: pending