I started this project in 2016, when I was in Year 6 and had been given my first Raspberry Pi.
Based on a project I saw on the Raspberry Pi website, with a few ideas of my own added in, I created a pair of gloves designed to help visually impaired people navigate their surroundings without having to touch anything.
I used fingerless cycle gloves for this project, and the electronics were stored in a Tupperware project-box in a backpack. One glove had a button accessible to the thumb and a sonar module sewn to it; the other housed a small vibration motor in one of the foam pads. At the time, I also designed a hat that would have had four more sonar modules and vibration motors around the outside, so that people could carry things whilst using the system, but I never made it.
All the circuitry was on a breadboard in the Tupperware box. I had only just learnt how to solder and wanted to be able to recycle most of the components in later projects. The Pi was running a Python script that fired the ultrasound emitter and recorded the time taken for the echo to be detected. It used a very simple formula, d = v·t/2 (where v is the speed of sound and t the round-trip time, halved because the pulse travels out and back), to calculate the distance of the reflecting object, and then drove the vibration motor at a different power and frequency depending on how close the object was.
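The core logic can be sketched in a few lines of Python. This is not the original script: the speed of sound, maximum range, and the linear distance-to-duty-cycle mapping are all illustrative assumptions, and the GPIO handling for the sonar module and motor is omitted so the maths stands alone.

```python
# Sketch of the echolocation maths and the distance-to-vibration mapping.
# Constants and the mapping are assumptions, not the original values.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_distance(round_trip_s: float) -> float:
    """Distance to the reflecting object: d = v * t / 2.

    The time is halved because the ultrasound pulse travels to the
    object and back again before it is detected.
    """
    return SPEED_OF_SOUND * round_trip_s / 2

def vibration_duty(distance_m: float, max_range_m: float = 2.0) -> float:
    """Map distance to a PWM duty cycle (0-100): closer means stronger.

    Beyond max_range_m the motor stays off; at zero distance it runs
    at full power. A hypothetical linear ramp is used in between.
    """
    if distance_m >= max_range_m:
        return 0.0  # nothing in range: motor off
    return 100.0 * (1.0 - distance_m / max_range_m)

# Example: a 10 ms round trip corresponds to about 1.7 m away.
print(echo_distance(0.01))        # ~1.715 m
print(vibration_duty(0.5))        # 75.0 (object close, strong vibration)
```

On the real hardware, the duty cycle would be fed to something like a GPIO PWM output driving the motor's transistor, but that part depends entirely on the wiring.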
My biggest challenge in this project was the programming. I was new to Python and had only created text-based applications before, none of which used any external components. Whilst I understood the theory behind ultrasonic echolocation quite easily, I struggled to find a way to turn the data it produced into a tangible signal that the vibration motor could convey.
I finished the project in Year 7, and submitted it for my school's scholarship project competition. I won, and received a desktop 3D scanner, which I knew I would like to use with a 3D printer once I had saved up enough money to buy one.