Remotely Processing Deep Learning Data in a Microcontroller-Based Application
Authors: Sharik Ali Ansari; Koteswar Rao Jerripothula; Ankush Mittal.

When we create a product for the market, the cost of production plays a crucial role in its success. Many microcontrollers are available today with different amounts of RAM, processing ability, available ports, power consumption, durability, and so on, and we choose one based on these factors. Sometimes the microcontroller a project requires costs far more than the budget allows. For example, if we only want to run simple programs, an Arduino is the best choice and costs about INR 600 (Uno). If we want to run multiple programs that become too heavy for it, we use a Raspberry Pi with Linux as its operating system, which costs about INR 3,000. The Raspberry Pi works well in most cases, but it stumbles when deep learning is involved: deep learning needs GPU cores, which it does not have. Boards with GPU cores, such as the Nvidia Jetson Nano and Tegra (TX1, TX2), solve this, but their cost is significantly higher than a Raspberry Pi; the Nvidia TX1 costs about INR 40,000. So we used the Raspberry Pi to handle input/output and programmed it to send all the computational work to our computer, which has a GPU.
How Does It Work?
The Raspberry Pi is equipped with all the sensors required for the task. Everything except the deep learning work is done on the Raspberry Pi itself. Over a wireless connection, it sends the data that needs to be processed to the computer through a client-server socket mechanism. The computer runs the deep learning model on the data and sends the result back to the Raspberry Pi. On the computer, one thread receives the incoming data (say, an image) and is then immediately free to accept the next image while the current one is being processed; a separate thread sends the results back. A minimal sketch of this follows.
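The sketch below shows a possible computer-side server for this scheme. It assumes a plain TCP socket on port 5000, a simple length-prefixed framing for each image, and a placeholder run_model() function standing in for the actual deep learning model; the article does not specify the protocol, port, or framework, so these are illustrative assumptions rather than the authors' exact implementation.

```python
import socket
import struct
import threading
import queue

HOST, PORT = "0.0.0.0", 5000      # assumed listening address and port


def run_model(image_bytes):
    """Placeholder for the GPU inference step (hypothetical)."""
    return str(len(image_bytes)).encode()


def recv_exact(conn, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("client disconnected")
        buf += chunk
    return buf


def receiver(conn, jobs):
    """Thread 1: keeps accepting length-prefixed images; it is free to
    receive the next image while earlier ones are still being processed."""
    while True:
        (length,) = struct.unpack("!I", recv_exact(conn, 4))
        jobs.put(recv_exact(conn, length))


def worker(jobs, results):
    """Runs the model on queued images and queues up the results."""
    while True:
        results.put(run_model(jobs.get()))


def sender(conn, results):
    """Thread 2: sends results back to the Raspberry Pi as they are ready."""
    while True:
        payload = results.get()
        conn.sendall(struct.pack("!I", len(payload)) + payload)


def main():
    jobs, results = queue.Queue(), queue.Queue()
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            threads = [
                threading.Thread(target=receiver, args=(conn, jobs)),
                threading.Thread(target=worker, args=(jobs, results)),
                threading.Thread(target=sender, args=(conn, results)),
            ]
            for t in threads:
                t.daemon = True
                t.start()
            threads[0].join()   # run until the client disconnects


if __name__ == "__main__":
    main()
```

The Raspberry Pi side would mirror this: it length-prefixes each image before sendall() and runs its own receiving thread, so capturing and sending new frames never waits on results coming back.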
Why Is This Better?
Those of us who work in deep learning and do not have the budget for GPU-equipped microcontrollers can use the computer or laptop we already have for the heavy processing. Both training and subsequent prediction can be handled in this manner. Processing on a computer is much faster than on a microcontroller, so the work gets done sooner. In most applications the microcontroller runs on a battery, and more processing on the microcontroller means faster battery drain, so processing the data remotely also extends battery life.
Future Work
So far we have only tried processing data on a model running on a remote computer; training a model remotely still needs to be experimented with. Platforms such as Google Colab already offer remote processing, so a Raspberry Pi could be connected to Google Colab and use its GPU (for example, a Tesla P100) to process the data. The data could also be divided in parallel across multiple GPUs, as sketched below.
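One possible way to realize the multi-GPU idea is PyTorch's nn.DataParallel, which splits each batch across all visible GPUs. The stand-in model, input shape, and the use of PyTorch itself are illustrative assumptions here, since the article does not name a framework.

```python
import torch
import torch.nn as nn

# Stand-in model; the real deep learning model would go here.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 10))

if torch.cuda.is_available():
    # DataParallel replicates the model on every visible GPU and
    # splits each incoming batch across them automatically.
    model = nn.DataParallel(model).cuda()

# A batch of images received from the Raspberry Pi (random data here).
batch = torch.randn(64, 3, 224, 224)
if torch.cuda.is_available():
    batch = batch.cuda()

with torch.no_grad():
    outputs = model(batch)    # chunks of the batch run on different GPUs
print(outputs.shape)          # torch.Size([64, 10])
```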