2.739 Product Design and Development is a graduate product design class in which teams of ten students from the Rhode Island School of Design (RISD), the MIT Sloan School of Management, and MIT's undergraduate and graduate programs build a product from idea to alpha prototype through the lenses of engineering, design, and business.
My team and I took on a design challenge sponsored by GM aimed at the safety risks of texting and driving. In a survey of over 180 participants, we found that 80% of respondents text and drive regularly. Existing products on the market try to prevent texting and driving by restricting data or even access to the driver's phone while the car is moving. The outright solution would be fully autonomous cars, an ambition outside the scope of the class.
Our assumption was that restrictive technologies suffer from low adoption, so our approach was to create a hardware product that could enable safe texting and driving. As project manager and technology lead, I designed and prototyped a novel input device for the steering wheel and a GUI for a heads-up display (HUD) projected onto the car's windshield.
Our first step was to fully understand the problem space by interviewing drivers and investigating their texting and driving habits.
Of the more than 180 responses, 89% came from people between 14 and 30 years old, 80% of whom reported regularly texting and driving.
We also conducted observational studies with several of the respondents who text and drive, to identify the most dangerous behaviors they engage in behind the wheel.
After studying several drivers' habits, we concluded that our product could enable safe texting and driving only if three needs were met: the driver's eyes stay on the road, the driver's hands stay on the wheel, and whatever the driver wants to do (e.g. send a text, change the music, take a call) can be completed as quickly as possible.
After ideation, we converged on a product concept we believed could address the user needs above: a HUD paired with capacitive touch pads on the steering wheel that afford typing and other touch gestures. Tethered to the driver's smartphone, the touch pads enable interaction while keeping hands on the wheel, and the HUD displays information at a focal distance of 40 ft, keeping the driver's eyes on the road.
The product would also incorporate safety features that already exist on the market, like lane-departure alerts and imminent-crash detection. Access to the touch pads and HUD would be allowed only when it is safe to interact: during turns or at excessive speeds, the HUD would turn off and the touch pads would stop registering input.
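To make that gating concrete, here is a minimal TypeScript sketch of the idea. The signal names, thresholds, and function names are illustrative assumptions, not values or interfaces from our prototype.

```typescript
// Hedged sketch of the safety gating described above. The thresholds and
// vehicle signals are assumptions for illustration only.
interface VehicleState {
  speedMph: number;
  steeringAngleDeg: number;
}

const MAX_SAFE_SPEED_MPH = 75;     // above this, hide the HUD and ignore input
const MAX_STEERING_ANGLE_DEG = 10; // treat anything larger as a turn

function interactionAllowed(v: VehicleState): boolean {
  const turning = Math.abs(v.steeringAngleDeg) > MAX_STEERING_ANGLE_DEG;
  const tooFast = v.speedMph > MAX_SAFE_SPEED_MPH;
  return !turning && !tooFast;
}

// Example: interactionAllowed({ speedMph: 80, steeringAngleDeg: 2 }) === false
```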
This is a hardware demo of the touch pads and UI in action, taken right after I got it working :) I bought two Adafruit trackpads and removed their enclosures, placing the click button underneath the pads to enable a push-down click like a modern MacBook trackpad.
The stock Adafruit Arduino libraries for the trackpads only allowed reading data from one trackpad at a time, so I heavily modified them to support simultaneous reads via asynchronous, non-blocking functions.
I then implemented the WebKit-based UI as a Node-WebKit app so it could read the trackpad data over a serial port (something a normal web app is not allowed to do).
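For context, here is a minimal TypeScript sketch of that serial bridge using the serialport package. The port path, baud rate, and the line format coming from the Arduino are assumptions for illustration; this is not the code from the class project, which used whatever serial bindings shipped with Node-WebKit at the time.

```typescript
// Sketch: reading trackpad packets over a serial port inside a Node-capable
// webview (e.g. NW.js / Node-WebKit). Port path, baud rate, and the
// "dx,dy,click|dx,dy,click" line format are assumed for illustration.
import { SerialPort, ReadlineParser } from 'serialport';

interface PadState { dx: number; dy: number; pressed: boolean; }

const port = new SerialPort({ path: '/dev/tty.usbmodem1411', baudRate: 115200 });
const lines = port.pipe(new ReadlineParser({ delimiter: '\n' }));

// Each line carries one sample per pad, e.g. "12,-3,0|0,5,1".
lines.on('data', (line: string) => {
  const [left, right] = line.trim().split('|').map(parsePad);
  updateCursor('left', left);
  updateCursor('right', right);
});

function parsePad(field: string): PadState {
  const [dx, dy, click] = field.split(',').map(Number);
  return { dx, dy, pressed: click === 1 };
}

// In the real app this fed the on-screen keyboard; here it is a stub.
function updateCursor(side: 'left' | 'right', pad: PadState): void {
  console.log(side, pad);
}
```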
In the video, I move my thumbs over the trackpads, which moves a white cursor over the keys on the screen. When the cursor is hovering over the key I want to type, I press down for a deep click and the key registers. Tap-to-type was built soon after this video was made, enabling fast typing from muscle memory, much like typing on a touchscreen without looking.
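Below is a rough TypeScript sketch of that cursor-and-deep-click logic. The key layout, cursor gain, and screen dimensions are illustrative assumptions rather than values from the prototype, and tap-to-type is only noted in a comment.

```typescript
// Sketch of the cursor-and-deep-click behavior described above. All constants
// and the key layout are placeholders, not the prototype's actual values.
interface Key { label: string; x: number; y: number; w: number; h: number; }

const KEYS: Key[] = [
  { label: 'a', x: 0,  y: 0, w: 60, h: 60 },
  { label: 's', x: 60, y: 0, w: 60, h: 60 },
  // ...rest of the keyboard layout
];

const GAIN = 2.5;                  // trackpad counts -> screen pixels (assumed)
const SCREEN = { w: 800, h: 300 }; // keyboard region of the HUD (assumed)
const cursor = { x: 0, y: 0 };
let typed = '';

// Called for each sample coming off the serial bridge.
function onPadSample(dx: number, dy: number, deepClick: boolean): void {
  cursor.x = clamp(cursor.x + dx * GAIN, 0, SCREEN.w);
  cursor.y = clamp(cursor.y + dy * GAIN, 0, SCREEN.h);
  if (deepClick) {
    const key = keyUnderCursor();
    if (key) typed += key.label;
  }
  // Tap-to-type (added later) would instead map a tap's absolute position on
  // the pad directly to a key, so no cursor travel is needed.
}

function keyUnderCursor(): Key | undefined {
  return KEYS.find(k =>
    cursor.x >= k.x && cursor.x <= k.x + k.w &&
    cursor.y >= k.y && cursor.y <= k.y + k.h);
}

function clamp(v: number, lo: number, hi: number): number {
  return Math.min(hi, Math.max(lo, v));
}
```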
This is a UI mockup displayed over stock driving footage. The idle UI shows driving information such as the driver's speed and the distance to the car ahead. The main UI is triggered by a two-thumb swipe up. In this demo, the driver sends a message to Cameron, receives a reply, and then closes the main UI to return to driving.