Cell phone users exhibit a higher percentage of unsafe behaviour than other pedestrians. In studies from Ohio State University, the authors examined the distraction of pedestrians associated with mobile phone use [1]. A pedestrian who is talking on the phone while crossing the street may not see cars approaching from the phone side, which can lead to severe traffic accidents or injuries. I therefore propose a mobile application that captures the street view with the phone's back camera and detects car front views in the images in real time. If a car is approaching, the application alerts the user immediately.
Because cars share distinctive visual features, such as headlights, tires, and a dark bottom edge, I propose to use the AdaBoost learning algorithm to perform car detection.
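To make the detection step concrete, below is a minimal sketch of AdaBoost with decision stumps over scalar feature responses (e.g., a Haar-like "dark bottom edge" filter score). All function names are illustrative, and a real detector would use many rectangle features and a cascade; this only shows the boosting loop itself.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Train AdaBoost with decision stumps.
    X: (n_samples, n_features) feature responses, y: labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                     # sample weights
    stumps = []
    for _ in range(n_rounds):
        best = None
        # exhaustively search feature/threshold/polarity for min weighted error
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        err = max(err, 1e-10)                   # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak learner
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified samples
        w /= w.sum()
        stumps.append((alpha, j, thr, pol))
    return stumps

def predict(stumps, X):
    """Sign of the weighted vote over all stumps."""
    score = sum(a * p * np.where(X[:, j] >= t, 1, -1)
                for a, j, t, p in stumps)
    return np.sign(score)
```

In the actual application the stump features would be computed from an integral image so that each window can be scored in constant time.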
The great challenge in this project is that images captured by mobile phones may not be well aligned: cars in an image can appear at any orientation. To obtain good accuracy, the images fed to the AdaBoost classifier need to be aligned with the gravity direction. Another challenge is that it is hard for me to collect a large variety of car images with a mobile phone on my own. I will therefore merge two car front-view databases and adjust the quality of the training images to match my experimental phone, a Nexus One.
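The gravity alignment can be done by reading the phone's accelerometer, estimating the roll angle of the camera, and rotating the captured frame by the opposite angle before classification. A rough sketch, assuming a standard portrait-mode axis convention for the accelerometer (the axis convention and the nearest-neighbour warp are my assumptions, not a fixed design choice):

```python
import math
import numpy as np

def gravity_roll_deg(ax, ay):
    """Roll angle (degrees) of the phone from the accelerometer's
    x/y components; 0 when the phone is held upright.
    Assumes y points 'up' along the screen in portrait mode."""
    return math.degrees(math.atan2(ax, ay))

def rotate_nn(img, angle_deg):
    """Rotate a 2-D grayscale image about its centre by angle_deg,
    using nearest-neighbour inverse mapping (no external libraries)."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    t = np.deg2rad(angle_deg)
    ys, xs = np.indices((h, w))
    # map each output pixel back into the source image
    xs_c, ys_c = xs - cx, ys - cy
    sx = np.cos(t) * xs_c + np.sin(t) * ys_c + cx
    sy = -np.sin(t) * xs_c + np.cos(t) * ys_c + cy
    sx = np.rint(sx).astype(int)
    sy = np.rint(sy).astype(int)
    ok = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(img)
    out[ys[ok], xs[ok]] = img[sy[ok], sx[ok]]
    return out
```

To align a frame, one would call `rotate_nn(frame, -gravity_roll_deg(ax, ay))` so that the road appears horizontal regardless of how the phone is tilted.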
Specifically, I will use the MIT CBCL car front-view database and the PASCAL object detection database as the training set, merging the two databases and adjusting the quality of their images to match the Nexus One's camera.
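Since both databases contain images of much higher quality than a phone camera stream, each training image can be degraded toward phone quality before training. A minimal sketch of one plausible degradation (block-average downsampling plus Gaussian sensor noise; the downsampling factor and noise level are assumptions that would have to be tuned against real Nexus One captures):

```python
import numpy as np

def simulate_phone_quality(img, factor=2, noise_std=5.0, seed=0):
    """Degrade a high-quality grayscale training image (values 0-255)
    toward phone-camera quality: block-average downsampling followed
    by additive Gaussian sensor noise. factor and noise_std are
    illustrative defaults, not measured Nexus One parameters."""
    h, w = img.shape
    h2, w2 = h - h % factor, w - w % factor     # crop to a multiple of factor
    small = img[:h2, :w2].reshape(h2 // factor, factor,
                                  w2 // factor, factor).mean(axis=(1, 3))
    rng = np.random.default_rng(seed)
    noisy = small + rng.normal(0.0, noise_std, small.shape)
    return np.clip(noisy, 0, 255)
```

The same function would be applied uniformly to both databases before they are merged, so that the classifier never sees training images sharper than what the phone will deliver at run time.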