A new angle on traffic congestion

360-degree cams could aid accident response

By Jeff Ristine
UNION-TRIBUNE STAFF WRITER

March 9, 2003

A tractor-trailer overturns on state Route 94 and the 911 calls start.

One witness describes it as a fender-bender, but others speak of a "flaming ball of who-knows-what (and) everything in between," Caltrans spokesman Tom Nipper said in recounting an actual accident.

"With this range of description, whom do you send to the scene?" Nipper said.

It's a question professor Mohan Trivedi of UC San Diego believes someday could be answered quickest by turning to the Internet.

Trivedi and his team at the university's Computer Vision and Robotics Research laboratory are trying to determine whether a network of highway cameras, sending digital images over a high-speed, wireless Internet connection, could work well enough to help coordinate response to traffic emergencies.

A pair of cameras trained on a section of Interstate 5 has been testing the technology for about a year.

Highway cameras are nothing new; streaming video has been available on the Internet for more than three years. But Trivedi's equipment is more sophisticated, generating a 360-degree image that is processed by a computer to show multiple angles.

The results have been promising enough that the lab was asked to deploy cameras during the Super Bowl in January around Qualcomm Stadium and in the Gaslamp Quarter and Seaport Village.

Law enforcement authorities were able to operate cameras from two security command centers, watching the stadium's riverbed perimeter for intruders.

Trivedi believes the cameras can be used to glean information within moments of an accident, enabling authorities to issue alerts that would help relieve congestion long before law-enforcement or emergency personnel arrive.

(Photo caption: UCSD professor Mohan Trivedi, seen through the 360-degree camera.)

Authorities could determine how badly a freeway is blocked, for instance, and might be able to tell whether anyone is likely to need medical help.

Trivedi even sees "mobile interactive avatars" (robot-like devices, although he shuns the term) being dispatched to establish two-way voice or video communications with a doctor or relative of someone who is injured.

Trivedi's research aims to trim some of an estimated $20 billion cost associated with wasted fuel, lost productivity and increased pollution brought about by traffic congestion in California.

"If I can reduce the congestion and even accident clearance times by 10 percent, that's not bad," said Trivedi, director of the computer vision lab. "If it's going to reduce pollution 10 percent, if it's going to make fuel efficiency go up 10 percent, I think those are practical numbers.

"If there's anything that we can do which would make our roads less congested, that means we don't have a need to build more roads."

Trivedi's lab on the ground floor of UCSD's Science and Engineering Research Facility is involved in a wide range of automobile and security-related work, including experiments with the use of tiny cameras to improve the way air bags are deployed and to monitor driver alertness through eye movements and facial expressions.

The key to Trivedi's traffic monitoring is an omnidirectional camera that offers a panoramic view of its surroundings.

Originally developed for use indoors, the camera caught the interest of Caltrans during a presentation in Santa Barbara about four years ago.

A research director "invited us to think about it and propose something . . . in an outdoor domain," Trivedi said.

Big backers

Since then, the lab has spent roughly $900,000 on its activities, with major contributions from Caltrans (about $200,000 in funds and equipment) and the University of California's Digital Media Innovation Program ($600,000), which matches UC researchers with private-industry partners.

The lab has continued to draw support for related research, including homeland security, from other agencies.

As its work got under way, the UCSD team began thinking about cameras as tools to gather information well before the Highway Patrol arrives at an accident scene.

Nipper said Caltrans invested in Trivedi's research as a possible advance on the agency's growing network of cameras, used online and in TV traffic reports mainly to show freeway speeds.

Nipper said as many as half of a typical day's freeway tie-ups have nothing to do with traffic volume or capacity shortfalls. "Nonrecurrent congestion," as it is called, instead stems mainly from accidents, disabled vehicles, road debris, construction, special events and bad weather.

Caltrans today monitors congestion primarily through loop detectors, electronic sensors embedded in the freeways.

But Ramez Gerges, a principal engineer with the Caltrans Division of Intelligent Transportation Systems who is familiar with Trivedi's work, said omnidirectional cameras could become an affordable alternative.

Countless views

The omnidirectional camera allows multiple users to draw different views at the same time. Teamed up with a traditional pan/tilt/zoom video camera, one was installed about a year ago along Interstate 5 between Genesee Avenue and Voight Drive.

What the cameras look for "really depends on what you identify as an event," Trivedi said. "Stalled vehicles is a good one. Somebody in the emergency lane is a good one."

Transmitted through a wireless Internet connection, an initially distorted and partly upside-down image is processed and "unwarped" by a computer to resemble a normal picture.

The software tweaks developed in Trivedi's lab and patented by UCSD allow "an infinite number of different perspectives and resolutions of the same site," he said.

A police officer and a rescue worker, for example, could view different camera angles at the same time.
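In rough terms, the unwarping step is a change of coordinates: each pixel of the flat panorama is traced back to a point on the camera's circular, mirror-reflected image. The sketch below is not the lab's patented software; it shows one common way to do this kind of remapping with OpenCV and NumPy, assuming the mirror's center and inner and outer radii are already known.

```python
# A minimal sketch (not the lab's patented method) of unwarping a donut-shaped
# omnidirectional frame into a flat panorama. Assumes the mirror's center and
# radii are known; uses OpenCV and NumPy.
import cv2
import numpy as np

def unwarp_panorama(frame, center, r_inner, r_outer, out_w=1440, out_h=360):
    """Map the ring between r_inner and r_outer around `center`
    onto a rectangular panorama of size out_w x out_h."""
    cx, cy = center
    # For every output pixel, compute which source pixel it should come from.
    theta = np.linspace(0, 2 * np.pi, out_w, endpoint=False)  # columns -> viewing angle
    radius = np.linspace(r_outer, r_inner, out_h)             # rows -> distance from center
    theta_grid, radius_grid = np.meshgrid(theta, radius)
    map_x = (cx + radius_grid * np.cos(theta_grid)).astype(np.float32)
    map_y = (cy + radius_grid * np.sin(theta_grid)).astype(np.float32)
    # Sample the source image; running rows from the outer edge of the mirror
    # toward its center also turns the partly upside-down ring right-side up.
    return cv2.remap(frame, map_x, map_y, interpolation=cv2.INTER_LINEAR)

# Example: any angle window of the panorama becomes a separate "virtual camera,"
# so two viewers can pull different perspectives from the same frame at once.
# panorama = unwarp_panorama(frame, center=(640, 480), r_inner=100, r_outer=450)
# view_for_officer = panorama[:, 0:360]      # one 90-degree slice
# view_for_medic   = panorama[:, 720:1080]   # a different slice, same instant
```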

And Trivedi's team imagines the monitoring used in ways that don't require humans. Graduate student Ofer Achler said a computer could count vehicles and then send messages via telephone or pocket PC to alert motorists to a tie-up.

"Nobody has to stay up and watch the thing," Achler said.

For the Super Bowl, planned amid an atmosphere of terrorism jitters, Trivedi's team deployed a working system of cameras with technical assistance from San Diego State and UCSD's Scripps Institution of Oceanography.

In addition to keeping an eye on parts of Qualcomm Stadium, the cameras enhanced surveillance of crowds around Sixth and G streets in the Gaslamp Quarter, a center of party activity, and at a temporary command center in Seaport Village.

Role of robots

Meanwhile, cameras on Friars Road allowed Achler to test visually based software that counts cars and measures vehicle speed. The equipment worked day and night.

Trivedi's research already had captured the attention of homeland security authorities, and the Super Bowl test worked well enough to draw interest from Jacksonville, Fla., host city for the 2005 Super Bowl.

The lab wasn't expected to perform any kind of systematic study demonstrating how traffic efficiency could be improved with cameras.

But Trivedi believes the team already has shown that a multicamera, interactive video network is "the way to go," and that the wireless communications explosion of the past few years has made the notion financially feasible.

The cameras are less prone to breakdown than loop detectors, he said.

But it's up to someone else to decide when and where to deploy Trivedi's cameras.

For now, Caltrans plans to continue investing in conventional cameras and loop detectors. Omnidirectional cameras aren't yet in the agency's long-term plans, although engineer Gerges and Trivedi said they would cost only one-fifth to one-fourth as much.

Trivedi places current costs at $300 to $2,000 per camera, but said prices will come down if they are ordered and produced in large numbers.

While covering every mile of freeway would surely prove financially prohibitive, Trivedi said the cameras might be tried where conditions tend to be worst, at the Interstate 5/805 merge, for instance.



Jeff Ristine: (619) 542-4580; jeff.ristine@uniontrib.com