I need a very quick, simple prototype of an application that processes images. The high-level scenario is below.
Traffic Light Detection: a low-vision or blind person arrives at an intersection that has no audio beeps/sounds to signal when it is safe to cross. The user points the phone's camera at the intersection; using the video stream, the app finds the traffic light, tells the user to hold the phone still, waits for the NEXT green light, and then tells the user to walk. (Waiting for the next green light ensures we are not telling the user to walk at the tail end of a green phase.)
It should be a very rough prototype that demonstrates the concept, nothing visually fancy:
- Once the app is started, there should be a "Start" button to begin capturing video (assume the user generally points the phone in the right direction and the traffic light is in the frame). If it simplifies things, we can start with a recorded video, and you can assume the camera is static with the traffic light inside the image.
- Detect the traffic light and color.
- Detect the change in traffic light from red to green.
- Upon detecting the event, play a sound or WAV file saying "Go".
- A "Stop" button that returns the user to the screen with the Start button.
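To make the "wait for the NEXT green" requirement concrete, here is a minimal sketch of the announcement logic, kept separate from the actual color detection (which would run on the camera frames, e.g. with OpenCV, and is out of scope here). All names (`LightColor`, `GoAnnouncer`, `onFrame`) are hypothetical, not part of any existing API: the announcer only fires "Go" after it has first seen a confirmed red phase, so starting the app mid-green never triggers a walk signal.

```java
// Hypothetical per-frame state machine for the red-to-green event.
enum LightColor { RED, YELLOW, GREEN, UNKNOWN }

class GoAnnouncer {
    private boolean sawRed = false;

    /**
     * Feed one detected light color per video frame.
     * Returns true exactly once, when the app should play "Go".
     */
    boolean onFrame(LightColor c) {
        if (c == LightColor.RED) {
            sawRed = true;       // arm only after a confirmed red phase
            return false;
        }
        if (c == LightColor.GREEN && sawRed) {
            sawRed = false;      // fire once per red-to-green cycle
            return true;
        }
        return false;            // mid-green start, yellow, or unknown
    }
}
```

In a real prototype the per-frame color would come from a simple detector (e.g. HSV thresholding on the located light region), ideally debounced over several frames before being fed in, so a single misclassified frame cannot trigger a false "Go".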
It should initially just run in the Android emulator, potentially with recorded video. After that, we can try it on a Nexus S.
Looking for a very quick turnaround (a couple of days) on a rough prototype. I need the actual code plus a 1-2 hour code review to explain what was done.
Skills: video, android-development, prototyping