To optimize traffic flow through city streets, municipalities often install car-counting cameras or other sensors at a few fixed locations. A new study, however, suggests that existing bus-mounted cameras may be a better way to go.
Most city buses are already equipped with forward-facing cameras for the same reason that many private vehicles are – to document any accidents in which they may be involved. Such cameras are in use on the buses that serve The Ohio State University campus, which is laid out a bit like a small city.
With that fact in mind, an Ohio State team led by Assoc. Prof. Keith Redmill developed an AI-based algorithm for analyzing footage shot by those cameras.
The software uses a deep learning model known as YOLOv4 to detect and track multiple objects in individual frames of video. This capability allows it to determine how many vehicles are present in each shot, how many are moving as opposed to sitting parked at the side of the road, and the speed/trajectory of the ones that are moving.
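The moving-versus-parked distinction can be illustrated with a minimal sketch: given bounding boxes from an object detector such as YOLOv4 (assumed as input here, not reimplemented), a simple nearest-centroid tracker follows each vehicle across frames and labels it as moving or parked from its centroid displacement. All names, thresholds, and the matching scheme below are illustrative assumptions, not the paper's actual method.

```python
# Sketch: track detector output across frames and flag moving vehicles.
# Input boxes are (x, y, w, h) in pixels, e.g. from a YOLOv4 detector.
from dataclasses import dataclass, field

@dataclass
class Track:
    centroid: tuple                      # last known (x, y) centre
    history: list = field(default_factory=list)  # earlier centres

def centroid(box):
    """Centre of an (x, y, w, h) bounding box."""
    x, y, w, h = box
    return (x + w / 2, y + h / 2)

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def update_tracks(tracks, boxes, max_dist=50.0):
    """Greedily match each track to its nearest new detection;
    leftover detections start new tracks."""
    unmatched = list(boxes)
    for tr in tracks:
        if not unmatched:
            break
        best = min(unmatched, key=lambda b: dist(centroid(b), tr.centroid))
        if dist(centroid(best), tr.centroid) <= max_dist:
            tr.history.append(tr.centroid)
            tr.centroid = centroid(best)
            unmatched.remove(best)
    tracks.extend(Track(centroid(b)) for b in unmatched)
    return tracks

def is_moving(track, min_disp=10.0):
    """Label a vehicle as moving if its centroid drifted more than
    min_disp pixels over the tracked window (threshold illustrative)."""
    if not track.history:
        return False
    return dist(track.history[0], track.centroid) > min_disp
```

Note that on a bus-mounted camera even parked cars drift in image space as the bus moves; this sketch ignores that ego-motion, which the actual system would have to compensate for, presumably using the GNSS and map data described below.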
Using this data – along with existing digital street maps and GNSS satellite data – the algorithm can produce overhead maps of the streets on which the buses are traveling, showing the recorded traffic flow at every point along them. By contrast, street-mounted cameras can only monitor traffic at a relatively small number of fixed locations.
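The geo-referencing step – placing a detected vehicle on an overhead map from the bus's own position – can be sketched as a simple projection. Everything here is an illustrative assumption: the function name, the flat-earth approximation, and especially the `forward_m`/`right_m` range estimate, which the published pipeline would derive from the imagery and street map rather than receive directly.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres

def project_detection(bus_lat, bus_lon, heading_deg, forward_m, right_m):
    """Map a camera-relative vehicle offset to latitude/longitude.

    bus_lat, bus_lon: the bus's GNSS fix, in degrees.
    heading_deg: bus heading, degrees clockwise from north.
    forward_m, right_m: the vehicle's offset from the camera, metres.
    Uses a flat-earth approximation, adequate at street scale.
    """
    h = math.radians(heading_deg)
    # Rotate the body-frame offset into north/east components.
    north = forward_m * math.cos(h) - right_m * math.sin(h)
    east = forward_m * math.sin(h) + right_m * math.cos(h)
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(bus_lat))))
    return bus_lat + dlat, bus_lon + dlon
```

Stamping many such projected points with timestamps, as the buses drive their routes, is what lets traffic flow be plotted continuously along the streets rather than only at fixed sensor sites.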
“If we collect and process more comprehensive high-resolution spatial information about what’s happening on the roads, then planners could better understand changes in demand, effectively improving efficiency in the broader transportation system,” said Redmill.
A paper on the research was recently published in the journal Sensors.
Source: The Ohio State University