Model based traffic congestion detection in optical remote sensing imagery

Abstract

Purpose

A new model based approach for traffic congestion detection in time series of airborne optical digital camera images is proposed.

Methods

It is based on the estimation of the average vehicle speed on road segments. The method combines several techniques: vehicle detection on road segments by change detection between two images with a short time lag, the use of a priori information such as a road data base, vehicle sizes and road parameters, and a simple linear traffic model based on the spacing between vehicles.

Results

The speed profiles estimated from experimental data acquired by an airborne optical sensor, the 3K camera system, coincide well with the reference measurements.

Conclusions

Experimental results show the great potential of the proposed method for the detection of traffic congestion on highways in along-track scenes.

1 Introduction

During the past years, increasing traffic has become one of the major problems in urban and suburban areas [1]. Traffic congestion and jams are among the main causes of sharply rising transportation costs due to wasted time and extra fuel consumption. Conventional stationary ground measurement systems such as inductive loops, radar sensors or terrestrial cameras can deliver precise traffic data with high temporal resolution, but only at discrete points, and their spatial distribution is still limited to selected motorways or main roads.

A new type of additional information is needed for a more efficient use of road networks. Sensors installed on aircraft or satellites enable data collection on a large scale and thus allow wide-area traffic monitoring [2]. Synthetic aperture radar (SAR) sensors, due to their all-weather capability, seem well suited for this type of application. Ground moving target indication approaches based on the Displaced Phase Center Array technique are currently under investigation for airborne SAR sensors [3] and spaceborne satellites, e.g. TerraSAR-X [4], but still suffer from low vehicle detection rates, often below 30%. Traffic monitoring from optical satellites is still limited by the insufficient spatial resolution, but the detection of vehicle queues seems promising [5]. As already shown in [6, 7], airborne optical remote sensing technology has great potential for traffic monitoring applications. Several airborne optical remote sensing systems are already in experimental use at the German Aerospace Center DLR, e.g. the airborne 3K camera system [8], consisting of three digital cameras capable of acquiring three images per second, and LUMOS [9]. Automatic detection of vehicles and estimation of their speed in sequences of optical images is still a challenge. Most known approaches are image based and still achieve too low completeness (e.g. less than 70% [10, 11]), making them not yet suitable, for example, for estimating the traffic density.

In this paper we propose a new model based approach and investigate its potential for congestion detection in airborne optical remote sensing data. Instead of detecting each individual vehicle and then estimating its speed (microscopic model), as e.g. in [10, 11], we exploit a linear vehicle density-speed relationship for a road segment (macroscopic model) to derive the vehicle speed from the vehicle densities estimated in an image.

The paper gives an extensive introduction to the traffic congestion detection approach in Section 2 (building on work published in [12, 13]), followed by experiments, discussions and conclusions.

2 Method

Our approach for traffic congestion detection in sequences of optical images is based on modeling the traffic flow on road segments and thus allows the required traffic parameters to be derived directly from the data. The proposed traffic congestion detection method combines several techniques: change detection, image processing and the incorporation of a priori information such as a traffic model and the road network. Change detection in two images with a short time lag is implemented using the multivariate alteration detection (MAD) method [14], resulting in a change image in which the moving vehicles on the roads are highlighted. Image processing techniques are then applied to derive the vehicle density from the binarized change image. This estimated vehicle density can be related to the vehicle density obtained by modeling the traffic flow on a road segment. The model is derived from a priori information about the vehicle sizes and road parameters [15], the road network, e.g. a road data base [16], and the spacing between the vehicles. The modeled vehicle density is directly related to the average vehicle speed on the road segment [17, 18]; thus information about the traffic situation, e.g. the existence of congestion, the beginning and end of congestion, the length of congestion, actual travel times and further parameters, can be derived. The flow diagram of the proposed method is shown in Fig. 1.

Fig. 1

Flow diagram of the proposed traffic congestion detection method

2.1 A priori information

The proposed method uses the following a priori information: a road data base (routes in both directions) [16], vehicle sizes for passenger cars and trucks, road parameters (number of lanes, lane width) and the solar azimuth and zenith angles. The use of this a priori information is described in more detail in the following sections.

2.2 Congested traffic model

The best known early description of the bivariate relationship between traffic density D (number of vehicles per length of lane) and space-mean speed v is the linear density-speed model introduced by Greenshields in [17]. A further enhancement of this model differentiates between vehicle groups, e.g. passenger cars and trucks [18]. Numerous investigations of real traffic data show that under congested conditions the following assumption holds: the spacing of a vehicle class is a linear function of the speed of all vehicles [18]

\( S_i = B_i\,g(v) + L_i \)    (1)

where the spacing S i is the front-to-front vehicle distance in meters, B i is a dimensionless model parameter, the function g(v) transforms speed (km/h) into meters, e.g. g(100 km/h) = 100 m, L i is the vehicle length in meters and i denotes the vehicle class, e.g. passenger car or truck. The parameter B can be interpreted in the following way: for B = 0.5 and L = 0, formula (1) expresses the well-known driver rule of thumb “safe distance = half the speedometer reading in metres”. As already mentioned, this model describes congested traffic well. The value of B normally ranges between 0.5 and 1.0 (for more details see Section 3.3).
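
As a quick illustration (using the passenger car length of 4.7 m and the left-lane value B = 0.76 estimated in Section 3.3), the spacing at 50 km/h would be

\( S = 0.76 \cdot 50\,\text{m} + 4.7\,\text{m} \approx 42.7\,\text{m}, \)

i.e. roughly 23 vehicles per kilometre and lane under congested conditions.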

Now the traffic density can be calculated as [18]

\( D = \frac{1\,\text{km}}{\bar{S}} \)    (2)

where \( \bar{S} = \sum\limits_i p_i S_i \) is the weighted mean vehicle spacing, p i is the proportion of vehicle class i and \( \sum\limits_i p_i = 1 \). Thus the density calculation in (2) uses the weighted mean value of the vehicle spacings. We propose an alternative and more accurate way of calculating the density by using the true vehicle spacings

\( D = \sum\limits_i \frac{p_i \cdot 1\,\text{km}}{S_i} \)    (3)

Of course one can calculate density for any lane length (e.g. 250 m) by entering this value in formulae (2) and (3) instead of 1 km.

As an alternative to the traffic density D, we define the vehicle density d as the ratio of the area occupied by vehicles to the lane segment area. For this purpose formula (3) is much more convenient, and the vehicle density d can be written as

\( d = \sum\limits_i \frac{p_i\,L_i\,W_i}{S_i\,LW} \)    (4)

where LW is the lane width and W i is the vehicle width of class i. Note that d is independent of the lane length. Thus the vehicle density is a function of speed, d = f(v), and knowing the density it is possible to invert the model and thus estimate the speed.
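
The following minimal sketch illustrates this forward model and its numerical inversion, assuming the reconstructed forms of Eqs. (1) and (4). Only the passenger car length of 4.7 m and the truck length of 20 m are taken from the paper; the vehicle widths, lane width, class shares and B values used here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class VehicleClass:
    length_m: float   # L_i
    width_m: float    # W_i (assumed value, not given in the paper)
    share: float      # p_i, shares must sum to 1 (assumed)
    b: float          # B_i, dimensionless model parameter

CLASSES = [
    VehicleClass(length_m=4.7,  width_m=1.8, share=0.95, b=1.0),  # passenger cars
    VehicleClass(length_m=20.0, width_m=2.5, share=0.05, b=1.0),  # trucks
]
LANE_WIDTH_M = 3.5  # assumed lane width

def spacing_m(c: VehicleClass, v_kmh: float) -> float:
    """Eq. (1): front-to-front spacing; g(v) maps km/h to the same number of metres."""
    return c.b * v_kmh + c.length_m

def vehicle_density(v_kmh: float) -> float:
    """Eq. (4): fraction of the lane segment area occupied by vehicles."""
    return sum(c.share * c.length_m * c.width_m / (spacing_m(c, v_kmh) * LANE_WIDTH_M)
               for c in CLASSES)

def speed_from_density(d: float, v_max_kmh: float = 250.0) -> float:
    """Invert d = f(v) by bisection; f(v) decreases monotonically with speed."""
    if d >= vehicle_density(0.0):
        return 0.0            # denser than standstill: jammed
    if d <= vehicle_density(v_max_kmh):
        return v_max_kmh      # sparser than the model minimum: free flow
    lo, hi = 0.0, v_max_kmh
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if vehicle_density(mid) > d:
            lo = mid          # density still too high, true speed is higher
        else:
            hi = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    for v in (5.0, 30.0, 60.0, 120.0):
        d = vehicle_density(v)
        print(f"v = {v:5.1f} km/h -> d = {d:.3f} -> v_est = {speed_from_density(d):5.1f} km/h")
```

The round trip v → d → v in the example loop should reproduce the input speeds up to the bisection tolerance, which is the property the congestion detection relies on.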

How the proposed model is used for the estimation of the vehicle density in a real situation is described in Section 2.4.

2.3 Vehicle detection

Detection of vehicles on road segments using image sequences is performed in the following way. First, two images with a short time lag, usually a few seconds (this value is derived from the constraints that a vehicle should not overlap with the preceding vehicle or with itself; for more information see Sections 2.4.3 and 2.4.4), are selected. Then the region of interest (tube) is defined based on the road centreline, and the change image is obtained with the MAD algorithm [14]. Finally, the obtained change image (the chi-squared image of the MAD components) is binarized and denoised, e.g. by a median filter. For examples of binarized images see the lower images in Fig. 5 (b–e). The vehicle density can then be estimated for each road segment from the binarized image as the ratio of the number of white pixels to the total number of pixels in the road segment.
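
A minimal sketch of this density estimation step is given below, assuming the MAD change image has already been thresholded into a boolean mask and the road segment is available as a pixel mask of the same shape (both array names are hypothetical); the double-appearance correction of Section 2.4.2 is applied at the end.

```python
import numpy as np
from scipy import ndimage

def segment_density(change_mask: np.ndarray,
                    segment_mask: np.ndarray,
                    median_size: int = 3,
                    cf: float = 2.0) -> float:
    """Vehicle density of one road segment from a binarized change image.

    change_mask  : bool array, True where the chi-squared MAD image exceeds the threshold
    segment_mask : bool array of the same shape, True inside the road segment (tube)
    cf           : double-appearance correction factor, 2.0 for non-overlapping blobs
    """
    # denoise the binary change image, e.g. with a median filter
    clean = ndimage.median_filter(change_mask.astype(np.uint8), size=median_size).astype(bool)
    # ratio of white (changed) pixels to all pixels of the segment
    d_raw = np.count_nonzero(clean & segment_mask) / np.count_nonzero(segment_mask)
    # double-appearance correction of Section 2.4.2
    return d_raw / cf
```

The returned density can then be converted into an average speed by inverting the model of Section 2.2.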

2.4 Vehicle density estimation

Several effects influence the accuracy of the vehicle density estimation in time series of images; they are explained in more detail in the following sections. For the estimation of the model parameter B see Section 3.3.

2.4.1 Vehicle classes

The proposed approach currently distinguishes two broad vehicle classes, passenger cars and trucks, because of their different sizes, cruising speeds and driver characteristics [18]. The proportions of vehicles in the two classes can be estimated empirically from reference data.

2.4.2 Double appearance of vehicles in a binary image

Due to the change detection algorithm used, vehicle blobs appear twice in the binarized image (once at their position in each of the two images), so the estimated vehicle density has to be reduced

\( d = \frac{d_{est}}{cf} \)    (5)

where d est is the density estimated from the binarized change image and the correction factor is cf = 2. Note that this correction is independent of the vehicle type.

2.4.3 Not overlapping with a previous vehicle

To fulfill the requirement that a particular vehicle does not overlap with the preceding vehicle in the binary image, the following condition should hold

\( \frac{v\,\Delta t}{3.6} \le B_i\,g(v) \)    (6)

where Δt is the time lag between the two acquisitions in seconds and the speed is given in km/h. This requirement thus transforms into a constraint on the time lag that depends only on the model parameter B

\( \Delta t \le 3.6\,B \)    (7)
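
Under this reconstructed form of (7), B values between 0.5 and 1.0 admit time lags of

\( \Delta t \le 3.6 \cdot 0.5 \ldots 3.6 \cdot 1.0 \approx 1.8 \ldots 3.6\ \text{s}, \)

so the 0.7 s interval within the image bursts used in the experiments (Section 3.2) satisfies the constraint comfortably.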

2.4.4 Not overlapping with itself

The second requirement, namely that a particular vehicle does not overlap with itself in the binary image, assumes

\( \frac{v\,\Delta t}{3.6} \ge L_i \)    (8)

which means that Δt should be as large as possible; this contradicts the previous requirement. Our approach is to fulfill the first requirement and to compensate for the consequences of violating the second one.

For a given time lag this requirement holds for speeds satisfying

\( v \ge \frac{3.6\,L_i}{\Delta t} \)    (9)

and it is vehicle type dependent.
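
As an illustration of the reconstructed form of (9): for a passenger car of 4.7 m length and a hypothetical time lag of Δt = 2 s, vehicles slower than about \( 3.6 \cdot 4.7 / 2 \approx 8.5 \) km/h begin to overlap with themselves in the binary image.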

For speeds not satisfying Eq. (9) the estimated density decreases due to the vehicle overlapping with itself, and thus the double-appearance correction in (5) should be adapted using the correction factors shown in Fig. 2.

Fig. 2

Vehicle density correction factors due to the overlapping with itself effect

2.4.5 Vehicle shadows

Vehicle shadows increase the estimated density, depending on the time of acquisition. A correction factor can be derived by simple geometric calculations using the solar azimuth and zenith angles at the time of image acquisition, the road direction and the vehicle heights. The correction can be written as

(10)

where i is the vehicle type. For off-nadir viewing the acquisition geometry has to be taken into account additionally.

2.4.6 Lane segment length

The lane segment length is an important parameter in the estimation of the vehicle density. From the model side this length should be as short as possible in order to fulfill the assumption of constant speed on the lane segment. For the estimation, however, it is better to have more samples and thus longer segments. We have found a good compromise to be

\( LL = 2\,\bigl(B\,g(v_{max}) + L\bigr) \)    (11)

where LL is the lane segment length, L is the length of the longest vehicle class and v max is the maximal speed. For example, a truck length of 20 m, B = 1 and a speed restriction of 120 km/h on a highway result in an LL of about 280 m.
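
With the reconstructed form of (11), the quoted example works out as

\( LL = 2\,(1 \cdot 120\,\text{m} + 20\,\text{m}) = 280\,\text{m}. \)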

2.4.7 Number of lanes

For roads with more than one lane, the road vehicle density is the sum of the separate lane densities for one road direction.

2.4.8 Halting vehicles

Halting vehicles (vehicles that have not moved between the two acquisitions) or very slow vehicles are not visible in the change detection image and thus significantly reduce the estimated vehicle density. A simple solution to the problem is to increase the time lag between the two acquisitions as much as possible while still satisfying condition (7), thus reducing the probability of halting vehicles. A more sophisticated solution could be a combination with other methods, e.g. the classification of a single image [19].

3 Experiments

To confirm our idea and to validate the method, several flight campaigns with the DLR airborne experimental wide-angle optical 3K digital camera system operated on a Do-228 aircraft were performed. In this paper one of these experiments is presented. Since the area covered is quite large, the evaluation is performed for many road segments and can therefore be regarded as representative.

3.1 DLR 3K camera system

The 3K camera system (“3 Kopf” = “3 head”) consists of three non-metric off-the-shelf cameras (Canon EOS 1Ds Mark II, 16 MPixel). The cameras are arranged in a mount with one camera looking in the nadir direction and two in oblique sideward directions, which leads to an increased field of view of up to 110° across track and 31° along track. The camera system is coupled to a GPS/IMU (Inertial Measurement Unit) navigation system, which enables direct geo-referencing of the 3K optical images. Figure 3 illustrates the image acquisition geometry of the DLR 3K camera system. Based on the use of 50 mm Canon lenses, the relation between airplane flight height, ground coverage and pixel size is shown; e.g. the pixel size at a flight height of 1,000 m above ground is 15 cm and the image array covers 2.8 km in width.
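
The quoted pixel size can be roughly cross-checked from the imaging geometry; the pixel pitch of about 7.2 µm assumed below is not stated in the text:

\( \text{GSD} \approx \frac{\text{pixel pitch} \cdot H}{f} = \frac{7.2\,\mu\text{m} \cdot 1{,}000\,\text{m}}{50\,\text{mm}} \approx 0.14\,\text{m}. \)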

Fig. 3

The image acquisition geometry for 3K camera system. The tilt angle of the sideward looking cameras is approximately 35°

3.2 Test site and data

The motorway A8 south of Munich is one of the busiest parts of the German motorway network, with an average traffic volume of around 100,000 vehicles per day. The test site was a 16 km motorway section between the motorway junctions “Hofolding” and “Weyarn”. On 2 September 2006, heavy traffic was expected on this section, caused by homebound travelers heading towards Munich. 3K data were acquired between 14:01 and 15:11 from 2,000 m above ground in three overflights. During each overflight, 22 image bursts were acquired, each containing four consecutive images. The time difference within these bursts was 0.7 s, so that each car was monitored for at least 2.1 s. To collect the reference data, each lane was processed manually, i.e. all vehicles were detected in the images and their speeds were measured.

3.3 Estimation of traffic model parameter B

The unknown model parameter B (under the assumption of a vehicle length of 4.7 m for passenger cars) was analyzed for each lane and direction separately, resulting in a total evaluation length of 153 km. Due to the time of image acquisition on a Saturday afternoon there were very few trucks on this part of the road, so their influence was neglected. The estimated values of the traffic model parameter B are plotted in Fig. 4 separately for the right and left lanes of the three-lane road. The analysis resulted in an average B = 0.76 (standard deviation (STD) 0.38) for the left lane and an average B = 1.18 (STD = 0.88) for the right lane; the overall B is approximately equal to 1 (STD = 0.63). From Fig. 4 it can be deduced that a constant value for B can be assumed except for very low speeds under 5 km/h (in this case outliers from our model are most probable, but they are insignificant because other approaches are used, as already mentioned in Section 2.4.8). Parameter values for low density traffic must be treated very carefully because our model is designed for congested traffic. For free flow traffic the values of B can be larger.
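
The paper does not spell out how B was computed from the reference data; the sketch below simply solves Eq. (1) for B for each manually measured vehicle and averages per lane, which is one plausible way to obtain the kind of statistics reported above (function and variable names are hypothetical).

```python
import numpy as np

CAR_LENGTH_M = 4.7  # passenger car length assumed in Section 3.3

def estimate_b(spacings_m: np.ndarray, speeds_kmh: np.ndarray,
               length_m: float = CAR_LENGTH_M,
               v_min_kmh: float = 5.0) -> tuple[float, float]:
    """Per-vehicle B = (S - L) / g(v) from Eq. (1), returned as (mean, std).

    Vehicles slower than v_min_kmh are excluded, since the linear model
    becomes unreliable at very low speeds (Section 3.3)."""
    keep = speeds_kmh > v_min_kmh
    b = (spacings_m[keep] - length_m) / speeds_kmh[keep]  # g(v) = v numerically (km/h -> m)
    return float(b.mean()), float(b.std())
```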

Fig. 4

Estimated values for traffic model parameter B for left and right lanes separately

3.4 Congestion detection

Results of the traffic congestion detection on the test site are shown in Fig. 5. In this figure, (a) is the original mosaic image with the reference speed measurements plotted. It is divided into four parts marked by blue coloured frames. These image frames are displayed from left to right in the subfigures (b–e): the original image (upper image) and the speed profiles for the separate road directions plotted on the change detection image (lower image).

Fig. 5

Example of the traffic congestion detection on the A8 highway between Munich and Salzburg for 3K sensor data acquired during the ADAC flight campaign on 2.9.2006: (a) the original mosaic image with reference speed measurements plotted and (b–e) the corresponding blue coloured frames of the original mosaic image (upper image) and speed profiles for separate routes plotted on the change detection image (lower image)

Traffic congestion is usually defined using the average speed or the traffic density. Unfortunately, there is no unique definition, and it is usually country dependent, see e.g. [20]. Having the average vehicle speed for each road segment, congestion detection becomes a trivial task and can be performed by simple thresholding. For example, if congestion is defined as speeds up to 50 km/h (for motorways), then the red colored areas in Fig. 5 can be interpreted as congested.
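
A minimal sketch of this thresholding step, assuming the estimated speed profile is available as one value per road segment together with its position along the road (the names are hypothetical, the 50 km/h default is taken from the example above):

```python
def congested_intervals(positions_m, speeds_kmh, threshold_kmh=50.0):
    """Group consecutive road segments whose estimated speed is at or below the
    threshold into congested intervals (begin, end in metres along the road)."""
    intervals, start = [], None
    for pos, v in zip(positions_m, speeds_kmh):
        if v <= threshold_kmh and start is None:
            start = pos                     # congestion begins
        elif v > threshold_kmh and start is not None:
            intervals.append((start, pos))  # congestion ends
            start = None
    if start is not None:                   # congestion reaches the end of the profile
        intervals.append((start, positions_m[-1]))
    return intervals
```

From these intervals the beginning, end and length of each congestion and the corresponding travel times follow directly.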

For a quantitative interpretation, the speed profiles estimated with our approach are compared graphically with the reference measurements in Fig. 6.

Fig. 6

Speed profiles estimated with our approach (blue color) vs the reference measurements (red—left lane, green—right lane)

4 Discussions

First we would like to note some interesting observations from analyzing the curves in Fig. 6. The maximal speed was limited to 120 km/h for free-flow traffic, but nevertheless our model, though designed for congested conditions, can capture the correct speed (see the parts of the blue curve from 2,200 m to 2,700 m), owing to the presence of some trucks in the images.

The overestimation of the speed in some stages of congestion (the parts between 4,000–4,300 m and 6,000–6,200 m) can be explained by the behavior of drivers who decelerate just before the congestion and thus increase their spacings. The same effect occurs after passing the congestion, when drivers begin to accelerate (the part between 5,400 m and 5,500 m). In the latter case, halting vehicles additionally distort the estimation.

By setting the speed threshold to e.g. 50 km/h we detect two congested areas (4,300–5,300 m and 5,800–6,400 m). Furthermore, we can easily extract the following parameters: the beginning and end of the congestion, its length and the travel times.

The performance of the proposed method depends strongly on the quality of the geo-referencing of the overlapping images and on the quality of the road data base.

A priori information concerning vehicle and road parameters should be adapted very carefully to the regional traffic conditions.

For accurate vehicle density estimation, the time lag between the two image acquisitions should be selected according to the constraints presented in this paper.

Image based methods (microscopic model) normally perform better at higher resolution (less than 30 cm pixel spacing [10]); thus the aircraft flight height should be low or, equivalently, the reduced image coverage has to be accepted. The proposed model based method should be less sensitive to the resolution because it works on the macroscopic model level.

To overcome the problem of halting vehicles, the investigation of more sophisticated solutions is planned, for example a combination with other methods such as the classification of a single image.

Further experiments are planned to test the approach for off-nadir scenes and in cities under different environmental conditions.

Another research direction aims at deriving other traffic parameters such as density and flow.

Information about the traffic flow derived from remote sensing sensors can be used for various monitoring applications, e.g. as complementary information in existing traffic monitoring systems, for extracting information in regions of special interest, in emergency situations and so on [2].

5 Conclusions

A new traffic congestion detection approach for image time series acquired by the airborne optical 3K camera system has been introduced. It allows us to derive one of the main traffic parameters, the average speed, together with the vehicle density as an intermediate product. Other parameters such as the beginning and end of congestion, the length of congestion and travel times can easily be derived on request. The method is based on the detection of vehicles on road segments by change detection between two images with a short time lag, the use of a priori information and a simple traffic model. Experimental results show the great potential of the proposed method for the detection of traffic congestion on highways in along-track scenes. The estimated speed profiles coincide qualitatively and quantitatively quite well with the reference measurements.

References

  1. Traffic Congestion and Reliability: Trends and Advanced Strategies for Congestion Mitigation, United States Department of Transportation—Federal Highway Administration, September 1, 2005. http://ops.fhwa.dot.gov/congestion_report/ Accessed 8 March 2010

  2. Kurz F, Rosenbaum D, Thomas U, Leitloff J, Palubinskas G, Zeller K, Reinartz P (2009) Near real time airborne monitoring system for disaster and traffic applications. In: Proc. ISPRS Hannover Workshop 2009—High Resolution Earth Imaging for Geospatial Information, Hannover, Germany, June 2–5, 2009

  3. Suchandt S, Palubinskas G, Scheiber R, Meyer F, Runge H, Reinartz P, Horn R (2005) Results from an airborne SAR GMTI experiment supporting TSX traffic processor development. In: Proc. IGARSS, Seoul, Korea, 2005

  4. Meyer F, Hinz S, Müller R, Palubinskas G, Laux C, Runge H (2007) Towards traffic monitoring with TerraSAR-X. Can J Remote Sens 33(1):39–51


  5. Leitloff J, Hinz S, Stilla U (2006) Detection of vehicle queues in Quickbird imagery of city areas. Photogramm Fernerkund Geoinf 4:315–325


  6. Reinartz P, Lachaise M, Schmeer E, Krauß T, Runge H (2006) Traffic monitoring with serial images from airborne cameras. ISPRS J Photogramm Remote Sens 61:149–158


  7. Hinz S, Lenhart D, Leitloff J (2008) Traffic extraction and characterisation from optical remote sensing data. Photogramm Rec 23(124):424–440


  8. Kurz F, Müller R, Stephani M, Reinartz P, Schroeder M (2007) Calibration of a wide-angle digital camera system for near real time scenarios. In: Proc. ISPRS Hannover Workshop 2007—High Resolution Earth Imaging for Geospatial Information, Hannover, Germany, May 29–June 1, 2007

  9. Ernst I, Sujew S, Thiessenhusen KU, Hetscher M, Raßmann S, Ruhé M (2003) LUMOS—airborne traffic monitoring system. In: Proceedings of 6th IEEE International Conference on Intelligent Transportation Systems, Shanghai, China, 2003

  10. Rosenbaum D, Kurz F, Thomas U, Suri S, Reinartz P (2009) Towards automatic near real-time traffic monitoring with an airborne wide angle camera system. Eur Transp Res Rev 1(1):11–21


  11. Kurz F, Charmette B, Suri S, Rosenbaum D, Spangler M, Leonhardt A, Bachleitner M, Stätter R, Reinartz P (2007) Automatic traffic monitoring with an airborne wide-angle digital camera system for estimation of travel times. In: Stilla U, Mayer H, Rottensteiner F, Heipke C, Hinz S (eds) PIA07—Photogrammetric Image Analysis, International Archives of the Photogrammetry, Remote Sensing and Spatial Information Service, PIA07, Munich, Germany, 2007-09-19–2007-09-21

  12. Palubinskas G, Runge H (2008) Detection of traffic congestion in SAR imagery. In: Proc. of European Conference on Synthetic Aperture Radar (EUSAR’2008), June 2–5 2008, Friedrichshafen, Germany, VDE Verlag, Berlin, 4:139–142

  13. Palubinskas G, Kurz F, Reinartz P (2008) Detection of traffic congestion in optical remote sensing imagery. In: Proc. of IEEE International Geoscience and Remote Sensing Symposium (IGARSS’08), July 6–11 2008, Boston, USA, IEEE, 4 pages

  14. Nielsen AA (2007) The regularized iteratively reweighted MAD method for change detection in multi- and hyperspectral data. IEEE Trans Image Process 16(2):463–478


  15. Fahrsteifenbreiten von Kraftfahrzeugen, Forschungsgesellschaft für Strassen- und Verkehrswesen, Köln. http://www.sicherestrassen.de/VKO/QuerschnitteFahrbahn.htm Accessed 8 March 2010

  16. NAVTEQ data. http://www.navteq.com/about/data.html Accessed 8 March 2010

  17. Greenshields BD (1935) A study of traffic capacity. In: Proc. of Highway Research Board 14:448–477


  18. Kockelman KM (1998) Changes in the flow-density relation due to environmental, vehicle, and driver characteristics. Transp Res Rec 1644:47–56


  19. Palubinskas G, Kurz F, Reinartz P (2009) Traffic congestion parameter estimation in time series of airborne optical remote sensing images. In: Proc. ISPRS Hannover Workshop 2009—High Resolution Earth Imaging for Geospatial Information, Hannover, Germany, 2–5 June, 2009

  20. Staudefinition vom schweizerischen Bundesamt für Strassen. http://www.astra.admin.ch/themen/nationalstrassen/00619/00621/index.html?lang=de Accessed 8 March 2010


Open Access

This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

Author information

Corresponding author

Correspondence to Gintautas Palubinskas.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



Cite this article

Palubinskas, G., Kurz, F. & Reinartz, P. Model based traffic congestion detection in optical remote sensing imagery. Eur. Transp. Res. Rev. 2, 85–92 (2010). https://doi.org/10.1007/s12544-010-0028-z
