Vehicle Counting and Vehicle Speed Measurement Based On Video Processing
ABSTRACT
Vehicle counting and vehicle speed measurement based on video processing are among the systems that use digital image processing to detect a moving object, such as a vehicle, in order to count it and measure its speed. Such a system is an early stage in the development of an Intelligent Transportation System. The methods used in this system are background subtraction with the Gaussian Mixture Model (GMM) algorithm and blob detection. The background subtraction method is used because it separates foreground and background smoothly and adapts to the conditions of the frame. The blob detection method provides coordinates in the form of a centroid, so it shows the movement of the vehicle. System trials were conducted under three conditions, namely in the morning, afternoon, and evening. The speed calibration parameters were obtained with a speed gun. The accuracy of vehicle counting was obtained by evaluating the system results against the real situation. The uncertainty of the vehicle speed measurement was obtained using standard deviation and combined uncertainty calculations. The vehicle counting accuracy obtained in the morning is 75.69%, in the afternoon 90.50%, and in the evening 85.31%. The value of the measurement uncertainty in those conditions is approximately ± 3 km/h.
Keywords: Intelligent Transportation System, Video Processing, Blob Detection, Background Subtraction,
Speed Gun
The 60º angle determination is based on research that has been conducted by Li, et al. [7]. According to Li, at a 60º angle the measurement result is the most accurate. The calibration used in this experiment to detect and measure the vehicle speed takes the form of a speed gun provided by PT. Jasa Marga Persero (Tbk) Branch Semarang, Indonesia. The speed gun is held by a surveyor beside the highway, so that when a vehicle passes, the surveyor can "shoot" the passing vehicle with the speed gun to obtain its speed.
3.3 Logic Design Algorithms
The software designed for this system is part of the image processing system. The input of this system is a video taken with a camera placed above the highway. In this research, the software design is divided into four parts, namely distance calibration, image preprocessing, vehicle detection, and vehicle counting together with vehicle speed measurement.
3.3.1 Calibration distance
Distance calibration is an important part of the measurement process to obtain accurate results. The distance calibration process is carried out in accordance with the real conditions on the highway when the video data is shot. The tracked distance is bounded by the ROI (Region of Interest), which serves as the regional area for vehicle counting as well as vehicle speed measurement. The path length captured by the camera is then measured manually in meters using a measuring wheel. After that, the next step is the determination of the corresponding pixel length in the system based on the real conditions.
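The paper does not give code for this step; the sketch below (Python, with an assumed helper name) only illustrates how a metres-per-pixel scale factor could be derived from a manually measured path length and its pixel length, using the 20 m / 140 pixel figures quoted later in the discussion of Figure 3 as example values.

    # Hypothetical illustration of the distance-calibration step: relate a
    # manually measured road length (metres) to its length in the frame (pixels).
    def metres_per_pixel(real_length_m: float, pixel_length: float) -> float:
        """Scale factor converting a pixel distance into metres."""
        return real_length_m / pixel_length

    # Example values taken from the Figure 3 discussion: about 20 m of road
    # spanning 140 pixels along the y-axis of the 320 x 240 frame.
    scale = metres_per_pixel(20.0, 140.0)
    print(f"{scale:.4f} m per pixel")   # ~0.1429 m per pixel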
3.3.2 Image preprocessing
The first stage is capturing images with the camera. In this process the vehicle, as the main object, is captured by the camera in the form of a frame. The capture procedure is then repeated frame by frame to produce a video. The video resolution in the system is set to 320 x 240 pixels to optimize the image processing computation. The 320 x 240 pixel resolution is also used because of computer limitations.
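The paper does not name its implementation library; the following is a minimal sketch of the preprocessing stage described here and in the next subsection, assuming an OpenCV-based pipeline. The video path is an assumed placeholder.

    import cv2

    # Hypothetical sketch of the preprocessing stage: capture, resize,
    # grayscale conversion, and blurring to suppress noise.
    cap = cv2.VideoCapture("highway.mp4")   # assumed input video path

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (320, 240))            # fixed resolution used by the system
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # grayscale input for the detector
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)      # blur to eliminate pixel noise
        # ... the blurred frame is then passed to background subtraction and blob detection

    cap.release()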
3.3.3 Vehicle detection
The system uses the blob detection method and background subtraction. The use of these methods is based on the results of previous studies that reported good final accuracy. In addition, the blob detection method is used because it makes it easier to detect vehicle objects carefully. The background subtraction method is used because it can separate moving objects (foreground) from the background smoothly and adapt to the frame conditions. In general, the steps taken are as follows:
1. Lane masking
   Splitting off the part of the road where vehicles are moving in one route.
2. Background elimination
   Separating the image of the vehicle from the road to eliminate image noise that may appear, such as shadows, rain, etc.
3. Blob and noise filtration
   Drawing the bounding box (blob) of a moving vehicle around the object and removing small noise around the image of the vehicle.
4. Contour labelling
   Marking a vehicle that has been detected.
5. Vehicle tracking

First of all, the image obtained is converted into grayscale format. This is done because the monitoring system is built using algorithms that require grayscale input. After that, to eliminate noise, the image is blurred.

The road being observed is referred to as the region of interest (ROI). Within the ROI, unwanted parts that might otherwise be detected, such as sidewalks and trees, are removed. In order to make the data processing more efficient, as well as to avoid other unwanted objects, the background elimination process is used.

Blob detection is used to detect the shape of the vehicle. All objects that exist in the region of interest are searched for using the edge contours of the vehicle. Blob detection will detect and count the number of vehicles that pass the counting line. Figure 2 shows the flowchart of the vehicle counting system and vehicle speed measurement; a sketch of this detection stage is given below.

Figure 2: Flowchart of the Vehicle Counting System and Vehicle Speed Measurement
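The paper does not provide an implementation of this pipeline. The sketch below is only a plausible reading of it, assuming OpenCV (version 4.x return signatures), with the MOG2 Gaussian-mixture subtractor standing in for the GMM step and contour extraction standing in for the blob/centroid step. The counting-line position, area threshold, and function names are illustrative assumptions.

    import cv2

    # Hypothetical sketch: GMM background subtraction + blob detection on a
    # preprocessed (grayscale, blurred) frame.
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25,
                                                    detectShadows=True)

    MIN_BLOB_AREA = 300     # assumed minimum blob area (pixels) to count as a vehicle

    def detect_vehicles(blurred_gray):
        """Return the centroids of vehicle-sized blobs in one frame."""
        fg = subtractor.apply(blurred_gray)                       # GMM foreground mask
        _, fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)    # drop shadow pixels (value 127)
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
        fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)         # noise filtration
        contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        centroids = []
        for c in contours:
            if cv2.contourArea(c) < MIN_BLOB_AREA:
                continue                                          # ignore small noise blobs
            m = cv2.moments(c)
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            centroids.append((cx, cy))                            # blob centroid used for tracking
        return centroids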
The distance of the vehicle trajectory within the ROI is computed from the following quantities. For:
x_2 = x coordinate of the end point in the region of interest (pixels)
x_1 = x coordinate of the starting point in the region of interest (pixels)
y_2 = y coordinate of the end point in the region of interest (pixels)
y_1 = y coordinate of the starting point in the region of interest (pixels)
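The equation these quantities belong to is not legible in this copy, and the subscripts 1 (starting point) and 2 (end point) used above are assumed labels. A plausible form, assuming the trajectory length is the Euclidean pixel distance between the two ROI points, is

    d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}

with d the trajectory distance in pixels.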
Once the distance trajectory of the vehicle is known, the next process is to determine the travel time of the detected vehicle. In this system, the timing starts when a vehicle is first detected as a blob and ends at the counting line that marks the indicator of the region of interest.

3.3.4 Calculation to measure the speed of vehicle
In this system, the unit used to measure travel time is the microsecond, because the displacement time between frames is very short. After obtaining the displacement distance of the vehicle object and its time difference, the image speed value can be found with Equation (2), which divides the displacement distance by the travel time.
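Equation (2) itself is not legible in this copy; assuming it simply divides the pixel displacement by the elapsed time, a plausible form is

    v_{image} = \frac{d}{t}

where d is the displacement distance in pixels, t is the travel time, and v_{image} is the image speed in pixels per unit time; the distance calibration then converts this image speed into a real speed in km/h.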
Figure 3: Region of Interest

Figure 3 shows that the length of the y-coordinates is 140 pixels, with an actual distance of ± 20 meters, and that the length of the x-coordinates in the system is 190 pixels, with an actual distance of ± 3.5 meters. Once the ROI area is defined, vehicle counting and vehicle speed measurement proceed with the detection of vehicles.

Figure 3 also shows that above the counting line there is condition B, the Region of Interest (ROI) before the counting line, while below the counting line there is condition A, the Region of Interest (ROI) after
passing the counting line. The logic used in this experiment is that when a vehicle is in state B and then passes the counting line, the system calculates and measures the speed of the vehicle. After the object has passed the counting line and is in condition A, the system stops the vehicle detection as well as the calculation and measurement of the vehicle's speed.
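To make the condition B / counting line / condition A logic concrete, here is a minimal sketch under the same assumptions as the earlier snippets; the tracking structure, timing calls, and counting-line position are illustrative, not the paper's implementation.

    import time

    COUNT_LINE_Y = 120            # assumed counting-line y-position (pixels)
    vehicle_count = 0
    active_tracks = {}            # track_id -> time at which the blob was first seen (condition B)

    def update_track(track_id, centroid_y):
        """Count a vehicle and return its travel time when it crosses the counting line."""
        global vehicle_count
        if centroid_y < COUNT_LINE_Y:                  # condition B: above the counting line
            active_tracks.setdefault(track_id, time.perf_counter())
        elif track_id in active_tracks:                # condition A: below the line, just crossed
            travel_time = time.perf_counter() - active_tracks.pop(track_id)
            vehicle_count += 1
            return travel_time                         # used for the speed calculation
        return None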
3.5 Calibration Process Using the Speed Gun
In this study, we use a speed gun as the calibrator for the speeds obtained by our system. The k factor is determined by the results of the speed gun.

The speed measurement calibration used a calibration tool in the form of a speed gun owned by PT. Jasa Marga Persero (Tbk) Semarang Branch. The speed gun comes from the speed calibration company Laser Technology of the United States. A view of the speed gun is shown in Figure 4. The speed gun was newly acquired by PT. Jasa Marga Persero (Tbk) Semarang Branch in 2014, so the written calibration date is March 23, 2014. The speed gun has a speed accuracy of ± 2 km/h.

Table 1: Evaluation method based on test data and predictions (Powers, 2007)

                               Real Conditions
                               True      False
    System Results   True      TP        FP
                     False     FN        TN

The comparison gives four possibilities, namely:
• True Positive (TP) = a vehicle is present in the real conditions and is declared a vehicle by the system.
• False Positive (FP) = no vehicle is present in the real conditions, but the system declares a vehicle.
• True Negative (TN) = no vehicle is present in the real conditions and the system does not declare a vehicle.
• False Negative (FN) = a vehicle is present in the real conditions, but the system does not declare a vehicle.
Accuracy is the level of prediction that is closest to the actual label; higher accuracy values indicate better performance. This parameter is calculated from the True Positive (TP) and True Negative (TN) results divided by the combined results of the evaluation, consisting of True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN). Evaluation for accuracy is represented by Formula (3):

    \text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}    (3)
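As a quick sanity check of Formula (3), the snippet below computes accuracy from hypothetical confusion-matrix counts; the counts are made up for illustration and are not results from the paper.

    def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
        """Formula (3): proportion of correct decisions among all decisions."""
        return (tp + tn) / (tp + tn + fp + fn)

    # Hypothetical counts: 181 correct decisions out of 200 in total.
    print(accuracy(tp=170, tn=11, fp=9, fn=10))   # 0.905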
    u = \frac{s}{\sqrt{n}}    (6)

For:
u = uncertainty
s = standard deviation
n = amount of data
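A minimal sketch of this uncertainty estimate, assuming Equation (6) is the standard uncertainty of the mean (the sample standard deviation divided by the square root of the number of measurements); the speed values are made-up placeholders.

    import statistics

    speeds_kmh = [62.0, 58.5, 60.2, 61.3, 59.8]          # hypothetical repeated speed measurements
    s = statistics.stdev(speeds_kmh)                      # sample standard deviation
    u = s / (len(speeds_kmh) ** 0.5)                      # standard uncertainty of the mean
    print(f"speed = {statistics.mean(speeds_kmh):.1f} ± {u:.1f} km/h")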