EDCircles: A Real-Time Circle Detector with a False Detection Control

C. Akinlar, C. Topal
Pattern Recognition 46 (2013) 725-740

Article history: Received 9 April 2012; received in revised form 21 September 2012; accepted 26 September 2012; available online 3 October 2012.

Keywords: Circle detection; Ellipse detection; Real-time image processing; Helmholtz principle; NFA

Abstract: We propose a real-time, parameter-free circle detection algorithm that has high detection rates, produces accurate results and controls the number of false circle detections. The algorithm makes use of the contiguous (connected) set of edge segments produced by our parameter-free edge segment detector, the Edge Drawing Parameter Free (EDPF) algorithm; hence the name EDCircles. The proposed algorithm first computes the edge segments in a given image using EDPF, which are then converted into line segments. The detected line segments are converted into circular arcs, which are joined together using two heuristic algorithms to detect candidate circles and near-circular ellipses. The candidates are finally validated by an a contrario validation step due to the Helmholtz principle, which eliminates false detections, leaving only valid circles and near-circular ellipses. We show through experimentation that EDCircles works in real-time (10-20 ms for 640 x 480 images), has high detection rates, produces accurate results, and is very suitable for the next generation of real-time vision applications, including automatic inspection of manufactured products, eye pupil detection, circular traffic sign detection, etc.

(c) 2012 Elsevier Ltd. All rights reserved.
circles. Dasgupta et al. [31-33] developed a swarm intelligence technique named adaptive bacterial foraging optimization (ABFO) for circle detection. Their algorithm produces good results but is sensitive to noise. Cuevas et al. use discrete differential evolution (DDE) optimization [34], harmony search optimization (HSA) [35] and an artificial immune system optimization technique named the Clonal Selection Algorithm (CSA) [36] for circle detection. Although these evolutionary computation techniques have good detection rates and accurate results, they usually require multiple runs to detect multiple circles, and are too slow to be suitable for real-time applications. Just like RCD, these algorithms work on an edge map pre-computed by a traditional edge detection algorithm with many parameters.

Frosio et al. [37] propose a real-time circle detection algorithm based on maximum likelihood. Their method is fast and can detect partially occluded circular objects, but it requires that the radius of the circles to be detected be predefined, which greatly limits its applications. Wu et al. [41] present a circle detection algorithm that runs at 7 frames/s on 640 x 480 images. The authors claim to achieve a high success rate, but there is not much experimental validation to back their claims. Zhang et al. [38] propose an ellipse detection algorithm that can be used for real-time face detection. Liu et al. [39] present an ellipse detector for noisy images, and Prasad et al. [40] present an ellipse detector using edge curvature and convexity information. While both algorithms produce good results, they are slow and not suitable for real-time applications.

Vizireanu et al. [42-44] make use of mathematical morphology for shape decomposition of an image, and use the morphological shape decomposition representation of the image for recognition of different shapes and patterns in the image. While their algorithms are good for the detection of general shapes in an image, they are not suitable for real-time applications.

Desolneux et al. [60] were the first to discuss a contrario circular arc detection. Recently, Patraucean et al. [45,46] proposed a parameter-free ellipse detection algorithm based on the a contrario framework of Desolneux et al. [58]. The authors extend the line segment detector (LSD) by Grompone von Gioi et al. [63] to detect circular and elliptic arcs in a given image without requiring any parameters, while controlling the number of false detections by the Helmholtz principle [58]. They then use the proposed algorithm (named ELSD [46]) for the detection and identification of Bubble Tags [47].
In this paper, we present a real-time (10-20 ms on 640 x 480 images), parameter-free circle detection algorithm that has high detection rates, produces accurate results, and has an a contrario validation step due to the Helmholtz principle that lets it control the number of false detections. The proposed algorithm makes use of the contiguous (connected) set of edge segments produced by our parameter-free edge segment detector, edge drawing parameter free (EDPF) [48-53]; hence the name EDCircles [54,55]. Given an input image, EDCircles first computes the edge segments of the image using EDPF. Next, the resulting edge segments are turned into line segments using our line segment detector, EDLines [56,57]. Computed lines are then converted into arcs, which are combined together using two heuristic algorithms to generate many candidate circles and near-circular ellipses. Finally, the candidates are validated by the Helmholtz principle [58-63], which eliminates false detections, leaving only valid circles and near-circular ellipses.

2. The proposed algorithm: EDCircles

EDCircles follows several steps to compute the circles in a given image. The general idea is to extract line segments in an image, convert them into circular arcs and then combine these arcs to detect circles and near-circular ellipses. A general outline of the EDCircles algorithm is presented in Algorithm 1, and we describe each step of EDCircles in detail in the following sections.

Algorithm 1. Steps of the EDCircles algorithm.
1. Detect edge segments by EDPF and extract complete circles and ellipses.
2. Convert the remaining edge segments into line segments.
3. Detect arcs by combining line segments.
4. Join arcs to detect circle candidates.
5. Join the remaining arcs to detect near-circular ellipse candidates.
6. Validate the candidate circles/ellipses using the Helmholtz principle.
7. Output the remaining valid circles/ellipses.
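As a reading aid only, the skeleton below mirrors the seven steps of Algorithm 1 in code form. Every stage callable named here is a hypothetical placeholder for the corresponding subsection (2.1-2.6); none of this is the authors' implementation.

```python
def ed_circles(image, stages):
    """Skeleton of Algorithm 1. `stages` bundles one callable per step;
    all names are placeholders for the stages described in Sections 2.1-2.6."""
    # 1. Edge segments by EDPF; closed segments may directly yield candidates.
    segments = stages["detect_edge_segments"](image)            # Section 2.1
    candidates, open_segments = [], []
    for seg in segments:
        cand = stages["fit_closed_segment"](seg)                 # circle/ellipse fit
        if cand is not None:
            candidates.append(cand)
        else:
            open_segments.append(seg)
    # 2-3. Lines, then circular arcs, from the remaining segments.
    arcs = []
    for seg in open_segments:
        arcs += stages["lines_to_arcs"](stages["segment_to_lines"](seg))
    # 4-5. Greedy arc joins: circle candidates first, then near-circular ellipses.
    circles, leftover = stages["join_into_circles"](arcs)        # Section 2.4
    ellipses, _ = stages["join_into_ellipses"](leftover)         # Section 2.5
    # 6-7. Keep only candidates that pass the a contrario (NFA) validation.
    return [c for c in candidates + circles + ellipses
            if stages["validate_nfa"](image, c)]                 # Section 2.6
```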
2.1. Edge segment detection by edge drawing parameter free (EDPF)

Given an image, the first step of EDCircles is the detection of the edge segments in the image. To achieve this, we employ our recently proposed, real-time edge/edge segment detector, edge drawing (ED) [48-51]. Unlike traditional edge detectors, e.g., Canny [17], which work by identifying a set of potential edge pixels in an image and eliminating non-edge pixels through operations such as non-maximal suppression, hysteresis thresholding, erosion, etc., ED follows a proactive approach and works by first identifying a set of points in the image, called the anchors, and then joining these anchors using a smart routing procedure; that is, ED literally draws edges in an image. ED outputs not only a binary edge map similar to those output by traditional edge detectors, but also the result as a set of edge segments, each of which is a contiguous (connected) pixel chain [49].

ED has many parameters that must be set by the user, which requires the tuning of ED's parameters for different types of images. Ideally, one would want a real-time edge/edge segment detector which runs with a fixed set of internal parameters for all types of images and requires no parameter tuning. To achieve this goal, we have recently combined ED with the a contrario edge validation mechanism due to the Helmholtz principle [58-60], and obtained a real-time parameter-free edge segment detector, which we name edge drawing parameter free (EDPF) [52,53]. EDPF works by running ED with all of ED's parameters at their extremes, which detects all possible edge segments in a given image with many false positives. We then validate the extracted edge segments by the Helmholtz principle, which eliminates false detections, leaving only perceptually meaningful edge segments with respect to the a contrario approach.

Fig. 1(a) shows a 424 x 436 grayscale synthetic image containing a big circle obstructed by four rectangular blocks, a small ellipse obstructed by three rectangular blocks, a small circle, an ellipse and an arbitrary polygon-like object. When this image is fed into EDPF, the edge segments shown in Fig. 1(b) are produced. Each color in the edge map represents a different edge segment, each of which is a contiguous chain of pixels. For this image, EDPF outputs 15 edge segments in just 3.7 ms on a PC with a 2.2 GHz Intel 2670QM CPU. Notice the high quality of the edge map, with all details clearly visible.

Each edge segment traces the boundary of one or more objects in the figure. While the boundary of an object may be traced by a single edge segment, as the small circle, the ellipse and the polygonal object are in Fig. 1(b), it is also possible that an object's boundary is traced by many different edge segments. This is the case for the big circle, whose boundary is traced by four different edge segments, and the small obstructed ellipse, which is traced by three different edge segments. The result totally depends on the structure of the objects and the amount of obstruction and noise in the image. That is, there is no way to tell beforehand how the edge segments will trace the boundaries of the objects in a given image.
Fig. 1. (a) A sample image (424 x 436). (b) Edge segments (each a contiguous chain of pixels) extracted by EDPF. Each color represents a different edge segment. EDPF outputs 15 edge segments in 3.7 milliseconds (ms). (c) Lines approximating the edge segments. A total of 98 lines are extracted. (For interpretation of the references to color in this figure caption, the reader is referred to the web version of this article.)
Notice from Fig. 1 that in some cases, e.g., the small circle, the ellipse and the polygon, the entire boundary of an object in the image is returned as a closed curve; that is, the edge segment starts at a pixel on the boundary of an object, traces its entire boundary and ends where it started. In other words, the first and last pixels of the edge segment are neighbors of each other. It is highly likely that such a closed edge segment traces the boundary of a circle, an ellipse or a polygonal shape, as is the case in Fig. 1. So, as the first step after the detection of the edge segments, we go over all edge segments, take the closed ones and check whether each closed edge segment traces the entire boundary of a circle or an ellipse.

Processing of a closed edge segment follows a very simple idea: we first fit a circle to the entire list of pixels in the edge segment using the least squares circle fit algorithm [64] and compute the root mean square error. If the circle fit error, i.e., the root mean square error, is smaller than some threshold (fixed at 1.5 pixels for the proposed algorithm), then we add the circle to the list of circle candidates. Just because the circle fit error is small does not mean that the edge segment is an actual circle; it is only a candidate and still needs to go through circle validation by the Helmholtz principle before being returned as a real circle. Section 2.6 describes the details of circle validation.

If the circle fit fails, then we try fitting an ellipse to the pixels of the edge segment. We use the ellipse fit algorithm described in [65], which returns an ellipse equation of the form Ax^2 + Bxy + Cy^2 + Dx + Ey + F = 0. If the ellipse fit error, i.e., the root mean square error, is smaller than a certain threshold (fixed at 1.5 pixels for the proposed algorithm), then we add the ellipse to the list of ellipse candidates, which also needs to go through validation by the Helmholtz principle before being returned as a real ellipse.

If the edge segment is accepted either as a circle or an ellipse candidate, it is removed from the list of edge segments and is not processed any further. Otherwise, the edge segment is used in further processing along with the other non-closed edge segments.
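The least squares circle fit of [64] and the 1.5-pixel RMS error test can be illustrated with a simple algebraic (Kasa-style) fit; the sketch below is for exposition and is not necessarily the exact method of [64].

```python
import numpy as np

def fit_circle(points):
    """Algebraic least squares circle fit (Kasa-style) to an (N, 2) pixel array.

    Returns (cx, cy, r, rmse); illustrative only, not necessarily the fit of [64].
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    # Solve x^2 + y^2 + a*x + b*y + c = 0 for (a, b, c) in the least squares sense.
    design = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x ** 2 + y ** 2)
    (a, b, c), *_ = np.linalg.lstsq(design, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - c)
    # Root mean square of the radial residuals, compared against 1.5 pixels.
    rmse = np.sqrt(np.mean((np.hypot(x - cx, y - cy) - r) ** 2))
    return cx, cy, r, rmse
```

A closed edge segment would be kept as a circle candidate when the returned rmse does not exceed 1.5 pixels.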
2.2. Conversion of edge segments into line segments

After the removal of the closed edge segments, which are taken as circle or ellipse candidates, the remaining edge segments are converted into line segments (lines for short in the rest of the paper). The motivation for this step comes from the observation that any circular shape is approximated by a consecutive set of lines (as seen in Fig. 1(c)), and these lines can easily be turned into circular arcs by a simple post-processing step, as described in the next section.

Conversion of an edge segment into a set of lines follows the algorithm given in our line detector, EDLines [56,57]. The idea is to start with a short line that satisfies a certain straightness criterion, and extend the line for as long as the root mean square error is smaller than a certain threshold, i.e., 1 pixel error. Refer to EDLines [56,57] for the details of line segment extraction, where we validate the lines after detection using the Helmholtz principle to eliminate invalid detections. In EDCircles, though, we do not validate the lines after detection. The reason for this decision comes from our observation that the line segment validation algorithm due to the Helmholtz principle usually eliminates many short lines, which may be valuable for the detection of small circles in an image. So, unlike EDLines, we do not eliminate any detected lines and use all detected lines for further processing and the detection of arcs.

Fig. 1(c) shows the lines extracted from the image shown in Fig. 1(a). Clearly, circular objects are approximated by a set of consecutive lines. In the next section, we describe how these lines can be converted into circular arcs by processing consecutive lines.

2.3. Circular arc detection

We define a circular arc to be a set of at least three consecutive lines that turn in the same direction. Using this definition, we detect a circular arc as follows: given the list of lines making up an edge segment, simply walk over the lines and compute the angle between consecutive lines and the direction of turn from one line to the next. If at least three lines turn in the same direction and the angle between the lines is in-between certain thresholds, then these lines may form a circular arc.

Fig. 2 illustrates a hypothetical edge segment being approximated by 18 consecutive line segments, labeled l1 through l18. To compute the angle between two consecutive lines, we simply treat each line as a vector and compute the vector dot product. Similarly, to compute the turn of direction from one line to the next, we simply compute the vector cross product and use the sign of the result as the turn direction.

Fig. 3(a) illustrates the approximation of the blue right vertical edge segment in Fig. 1(b) by 11 consecutive line segments, labeled v1 through v11. Fig. 3(b) shows the details of the 11 lines: their lengths, the angle between consecutive lines and the direction of the turn going from one line to the next, where a '+' denotes a left turn and a '-' denotes a right turn.

Our arc detection algorithm is based on the following idea: for a set of lines to be a potential arc candidate, they all must have the same turn direction (to the left or to the right) and the angle between consecutive lines must be in-between certain thresholds. If the angle is too small, we assume that the lines are collinear, so they cannot be part of an arc; if the angle is too big, we assume that the lines are part of a strictly turning object such as a square, a rectangle, etc. For the purposes of our current implementation, we fix the low angle threshold to 6 degrees and the high angle threshold to 60 degrees. These values have been obtained by experimentation on a variety of images containing various circular objects.
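The angle and turn-direction computations described in this section reduce to a dot product and the sign of the 2-D cross product. A minimal sketch follows, where each line is taken as the vector from its start point to its end point (an illustration, not the authors' code):

```python
import numpy as np

def angle_and_turn(line_a, line_b):
    """Angle (degrees) between two consecutive line segments and the turn sign.

    Each line is ((x1, y1), (x2, y2)); the turn sign is +1 for one direction and
    -1 for the other (the left/right convention depends on the image coordinate
    system).
    """
    va = np.subtract(line_a[1], line_a[0])
    vb = np.subtract(line_b[1], line_b[0])
    cos_theta = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
    angle = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
    cross_z = va[0] * vb[1] - va[1] * vb[0]   # z component of the cross product
    return angle, (1 if cross_z > 0 else -1)

# Two consecutive lines may belong to the same arc only if 6 <= angle <= 60 and
# the turn sign stays constant over at least three lines in a row.
```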
Fig. 2. (a) A hypothetical edge segment being approximated by 18 consecutive line segments labeled l1 through l18. (b) The angles theta_i between vi and vi+1 are illustrated and colored red, green or blue. If the angle is bigger than a high threshold, e.g., theta_1, theta_2 and theta_9 (colored red), or if the angle is smaller than a low threshold, e.g., theta_16 (also colored red), then these lines cannot be part of an arc. Otherwise, if three or more consecutive lines turn to the left, e.g., lines v3 through v7 (angles colored blue), then these lines may form an arc. Similarly, if three or more consecutive lines turn to the right, e.g., lines v10 through v16 (angles colored green), then these lines may form an arc. (For interpretation of the references to color in this figure caption, the reader is referred to the web version of this article.)
Fig. 3. (a) An illustration of the blue right vertical segment in Fig. 1(b) being approximated by 11 consecutive line segments labeled v1 through v11. The angles between line segments v1 and v2 (theta_1), v3 and v4 (theta_3), v7 and v8 (theta_7), and v10 and v11 (theta_10) are also illustrated. (b) Lines making up the blue right vertical segment in Fig. 1(b).

The bottom part of Fig. 2 depicts the angles between consecutive lines of the edge segment shown at the top of Fig. 2, and the turn of direction from one line to the next. The angles smaller than the low angle threshold or bigger than the high angle threshold, e.g., theta_1, theta_2, theta_9 and theta_16, have been colored red; all other angles have been colored either blue or green depending on the turn of direction. Specifically, if the next line turns to the left, the angle has been colored blue, and if the next line turns to the right, then the angle has been colored green.

Having computed the angles and the turn of direction information, we simply walk over the lines of an edge segment looking for a set of at least three consecutive lines which all turn in the same direction and whose turn angle from one line to the next is in-between the low and high angle thresholds. In Fig. 2, lines v3 through v7 satisfy our criterion and form a potential arc candidate. Similarly, lines v10 through v16 make up another arc candidate.

Given a set of at least three lines that satisfy our arc candidate constraints, we first try fitting a circle to all pixels making up the lines using the circle fit algorithm in [64]. If the circle fit succeeds, i.e., if the root mean square error is less than 1.5 pixels, then the extracted arc is simply added to the list of arcs, and we are done. Otherwise, we start with a short arc consisting of only three lines and extend it line-by-line by fitting a new circle [64] until the root mean square error exceeds 1.5 pixels. At that point, the detected arc is added to the list of arcs, and we continue processing the rest of the lines to detect more circular arcs. Using this algorithm, we detect two arcs in Fig. 2: lines v3 through v7 form arc A1 with center (xcA1, ycA1) and radius rA1. Similarly, lines v10 through v16 form arc A2 with center (xcA2, ycA2) and radius rA2. In a complex image consisting of many edge segments, we will have hundreds of arcs.

Fig. 4 shows the arcs computed from the lines of Fig. 1(c), and Table 1 gives the details of these arcs. An arc spans a part between (StartAngle, EndAngle) of the great circle specified by (Center X, Center Y, Radius). The arc is assumed to move counter-clockwise from StartAngle to EndAngle over the great circle. As an example, A2 covers a total of 61 degrees, from 91 degrees to 152 degrees, of the great circle with center coordinates (210.6, 211.3) and radius 182.6.
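Putting the steps of this section together, the following sketch shows one possible shape of the line-walking and arc-growing procedure. It assumes the angle_and_turn and fit_circle helpers sketched earlier (passed in as arguments) and a pixels_of(i, j) callback returning the pixels of lines i..j-1; it is an illustration of the described procedure, not the authors' implementation.

```python
def extract_arcs(lines, pixels_of, angle_and_turn, fit_circle,
                 low=6.0, high=60.0, max_rmse=1.5):
    """Group runs of >= 3 consecutive same-turn lines into circular arcs.

    lines      : list of ((x1, y1), (x2, y2)) tuples in segment order
    pixels_of  : callback returning the pixels of lines i..j-1 as an (N, 2) array
    Returns a list of (first_line, end_line_exclusive, (cx, cy, r)) tuples.
    """
    arcs, i = [], 0
    while i + 2 < len(lines):
        # 1. Longest run starting at i whose turns share one sign and whose
        #    inter-line angles stay within [low, high] degrees.
        j, run_sign = i, None
        while j + 1 < len(lines):
            angle, sign = angle_and_turn(lines[j], lines[j + 1])
            if not (low <= angle <= high) or run_sign not in (None, sign):
                break
            run_sign, j = sign, j + 1
        if j - i + 1 < 3:                  # fewer than three lines: no arc here
            i += 1
            continue
        # 2. Grow the arc line by line while the circle fit error stays small.
        accepted, end = None, i + 3
        while end <= j + 1:
            cx, cy, r, rmse = fit_circle(pixels_of(i, end))
            if rmse > max_rmse:
                break
            accepted, end = (cx, cy, r), end + 1
        if accepted is None:               # even three lines do not fit a circle
            i += 1
            continue
        arcs.append((i, end - 1, accepted))   # lines i .. end-2 plus fitted circle
        i = end - 1
    return arcs
```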
2.4. Candidate circle detection by arc join

After the computation of the arcs, the next step is to join the arcs into circle candidates. To do this, we first sort all arcs with respect to their length in descending order, and start extending the longest arc first. The motivation for this decision comes from the observation that the longest arc is the closest to a full circle, so it must be extended and completed into a full circle before the other arcs.
Fig. 4. (a) Arcs computed from the lines of Fig. 1(c). (b) Candidate circles and ellipses before validation (overlayed on top of the image with red color). (For interpretation of
the references to color in this figure caption, the reader is referred to the web version of this article.)
Table 1. Details of the arcs shown in Fig. 4(a). An arc spans a part between (StartAngle, EndAngle) of a great circle specified by (Center X, Center Y, Radius). The arc moves counter-clockwise from StartAngle to EndAngle over the circle.

Arc   Center X   Center Y   Radius   Start angle (deg.)   End angle (deg.)
A1    210.3      211.8      182.2    325                  86
A2    210.6      211.3      182.6    91                   152
A3    212.2      215.9      178.5    275                  312
A4    210.7      211.6      183.0    173                  264
A5    111.1      267.5      52.3     275                  312
A6    120.1      291.4      34.9     141                  219
A7    139.4      288.6      49.2     94                   143

During the extension of an arc, the idea is to look for arcs having similar radii and close centers, and to collect a list of candidate arcs that may be combined with the current arc.

Given an arc A1 to extend into a full circle, we go over all detected arcs and generate a set of candidate arcs that may be joined with A1. We have two criteria for the arc join: (1) Radius difference constraint: the radius difference between A1 and the candidate arc A2 must be within some threshold. Specifically, if A2's radius is within 25% of A1's radius, then A2 is taken as a candidate for the join; otherwise A2 cannot be joined with A1. As an example, if A1's radius is 100, then all arcs whose radii are between 75 and 125 would be taken as candidates for the arc join. (2) Center distance constraint: the distance between the center of A1 and the center of the candidate arc A2 must be within some threshold. Specifically, we require that the distance between the centers of A1 and A2 not exceed 25% of A1's radius. As an example, if A1's radius is 100, then all arcs whose centers are within 25 pixels of A1's center would be taken as candidates for the arc join, assuming they also satisfy the radius difference constraint.
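Under these thresholds, candidate selection reduces to two simple checks per arc pair. A sketch follows; arc objects with cx, cy and r attributes are assumed here for illustration, not the authors' data structures.

```python
def is_circle_join_candidate(a1, a2, radius_tol=0.25, center_tol=0.25):
    """Radius difference and center distance constraints for the circle join.

    a1 is the arc being extended, a2 a potential candidate; both are assumed to
    expose .cx, .cy and .r. The 25% tolerances follow the text above (they are
    relaxed to 50% for the ellipse join of Section 2.5).
    """
    if abs(a2.r - a1.r) > radius_tol * a1.r:          # radius difference constraint
        return False
    center_dist = ((a2.cx - a1.cx) ** 2 + (a2.cy - a1.cy) ** 2) ** 0.5
    return center_dist <= center_tol * a1.r           # center distance constraint
```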
Fig. 5 illustrates possible scenarios during the arc join for circle detection. In Fig. 5(a), we illustrate a case where all potential arc candidates satisfy the center distance constraint, but one fails the radius difference constraint. Here, A1 is the arc to be extended, with A2, A3 and A4 as potential candidates for the arc join. As illustrated, the centers of all arcs are very close to each other; that is, the distances of the centers of A2, A3 and A4 from the center of A1 are all within the center distance threshold rT. As for the radius difference constraint, only A3 and A4 satisfy it, while A2's radius falls out of the radius difference range. So in Fig. 5(a), only arcs A3 and A4 would be selected as candidates for joining with A1.

In Fig. 5(b), we illustrate a case where all potential arc candidates satisfy the radius difference constraint, but one fails the center distance constraint. Here, A1 is the arc to be extended, with A2, A3 and A4 as potential candidates for the arc join. As illustrated, the radii of all arcs are very close to each other, so they all satisfy the radius difference constraint. As for the center distance constraint, only A2 and A4 satisfy it, while A3's center falls out of the center distance threshold rT. So in Fig. 5(b), only arcs A2 and A4 would be selected as candidates for joining with A1.

After the computation of the candidate arcs, the next step is to combine them one-by-one with the extended arc A1 by fitting a new circle to the pixels making up both of the arcs. Instead of trying the joins in random order, we start with the arc whose either end-point is the closest to either end-point of A1. The motivation for this decision comes from the observation that if there is more than one arc that is part of the same great circle, it is better to start the join with the arc closest to the extended arc A1. In Fig. 5(a) for example, we would first join A1 with A4 and then with A3. Similarly, in Fig. 5(b) we would first join A1 with A2 and then A4.

After an arc A1 is extended with other arcs on the same great circle, we decide at the last step whether to make the extended arc a circle candidate. Here, we take the view that if an arc spans at least 50% of the circumference of its great circle, then we make the arc a circle candidate. Otherwise, the arc is left for near-circular ellipse detection. In Fig. 5(a) for example, when A1 is joined with A4 and A3, the extended arc would span more than 50% of the circumference of its great circle, so the extended arc would be made a circle candidate. In Fig. 5(c) however, when A1, A2 and A3 are joined together, we observe that the extended arc does not span at least 50% of the circumference of its great circle, i.e., theta_1 + theta_2 + theta_3 < pi; so the extended arc is not taken as a circle candidate. Computation of the total arc span is performed by simply looking at the ratio of the total number of pixels making up the joined arcs to the circumference of the newly fitted circle. If this ratio is greater than 50%, then the extended arc is taken as a circle candidate.

To exemplify the ideas presented above, here is how the seven arcs depicted in Fig. 4(a) and detailed in Table 1 would be processed: we first take A1, the longest arc, as the arc to be extended, with A2, A3, A4, A5, A6 and A7 as the remaining arcs. Since the radii of A2, A3 and A4 are within 25% of A1's radius and their center distances are within the center distance threshold, only these three arcs would be taken as candidates for the join. We next join A1 and A2, since A2's end-point is closest to A1 (refer to Fig. 4(a)). After A1 and A2 are joined, the extended arc would next be joined with A3, since A3's end-point would now be closest to the extended arc. Finally, A4 would be joined. Since the final extended arc covers more than 50% of its great circle, it is taken as a circle candidate. Continuing similarly, the next longest remaining arc is A5, so we try extending A5, with A6 and A7 being the only remaining arcs in our list of arcs. The only candidate arc that can be joined with A5 is A7, since A6's radius is not within 25% of A5's radius. When we try to join A5 and A7 by fitting a circle, the circle fit error turns out to be bigger than the error threshold of 1.5 pixels, so A5 and A7 are not joined together. Notice that A5, A6 and A7 are part of a great ellipse rather than a great circle, so attempts to join them into a great circle fail. Clearly, these arcs need to be joined into an ellipse rather than a circle.
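The 50% span test used above is just the ratio of the pixel count of the joined arcs to the circumference of the newly fitted circle; as a small illustration:

```python
import math

def spans_enough(num_arc_pixels, radius, min_coverage=0.5):
    """True if the joined arcs cover at least half of the fitted great circle.

    num_arc_pixels approximates the total arc length in pixels; the circumference
    of the newly fitted circle is 2 * pi * radius.
    """
    return num_arc_pixels / (2.0 * math.pi * radius) >= min_coverage
```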
Fig. 5. Illustrations of possible scenarios during the arc join for circle detection. (a) A1 is the arc to be extended with A2, A3 and A4 as potential candidates for the arc join. All arcs satisfy the center distance constraint, but A2 fails the radius difference constraint. So, only A3 and A4 are selected as candidates for joining with A1. (b) A1 is the arc to be extended with A2, A3 and A4 as potential candidates for the arc join. All arcs satisfy the radius difference constraint, but A3 fails the center distance constraint. So, only A2 and A4 are selected as candidates for joining with A1. (c) A1, A2 and A3 all satisfy the radius difference and center distance constraints and are joined into a bigger arc. But, since the total length of the extended arc does not span at least 50% of the circumference of its great circle, i.e., theta_1 + theta_2 + theta_3 < pi, it is not taken as a circle candidate.
Fig. 6. Illustrations of possible scenarios during the arc join for ellipse detection. (a) A1 is the arc to be extended with A2, A3, A4 and A5 as potential candidates for the arc join. All arcs satisfy the center distance constraint, but A2 and A5 fail the radius difference constraint. So, only A3 and A4 are selected as candidates for joining with A1. (b) A1 is the arc to be extended with A2, A3, A4 and A5 as potential candidates for the arc join. All arcs satisfy the radius difference constraint, but A3 and A5 fail the center distance constraint. So, only A2 and A4 are selected as candidates for joining with A1. (c) A1, A2 and A3 all satisfy the radius difference and center distance constraints and are joined into a bigger elliptic arc. But, since the total length of the extended arc does not span at least 50% of the circumference of its great ellipse, i.e., theta_1 + theta_2 + theta_3 < pi, it is not taken as an ellipse candidate.
2.5. Candidate near-circular ellipse detection by arc join

The algorithm presented in the previous section deals with the detection of perfect circle candidates. But this algorithm would not detect near-circular ellipses. It is known that a circle appears slightly elliptical depending on the viewpoint of the camera. So it is important that the circle detection algorithm detects these imperfect circular objects in an image.

The arc join for ellipse detection is very similar to the arc join for perfect circle detection presented in Section 2.4. Given an arc A1 to extend into an ellipse, we go over the remaining arcs and generate a set of candidate arcs that may be joined with A1 into an ellipse. We employ the same two criteria for the arc join as in Section 2.4, but with relaxed constraints: (1) Radius difference constraint: if A2's radius is within 50% of A1's radius, then A2 is taken as a candidate for the join; otherwise A2 cannot be joined with A1. As an example, if A1's radius is 100, then all arcs whose radii are between 50 and 150 would be taken as candidates for the arc join. (2) Center distance constraint: we require that the distance between the centers of A1 and A2 not exceed 50% of A1's radius. As an example, if A1's radius is 100, then all arcs whose centers are within 50 pixels of A1's center would be taken as candidates for the arc join, assuming they also satisfy the radius difference constraint.

Fig. 6 illustrates possible scenarios during the arc join for ellipse detection. In Fig. 6(a), we illustrate a case where all potential arc candidates satisfy the center distance constraint, but two fail the radius difference constraint. Here, A1 is the arc to be extended, with A2, A3, A4 and A5 as potential candidates for the arc join. As illustrated, the centers of all arcs are very close to each other; that is, the distances of the centers of A2, A3, A4 and A5 from the center of A1 are all within the center distance threshold rT. As for the radius difference constraint, only A3 and A4 satisfy it, while A2's and A5's radii fall out of the radius difference range. So in Fig. 6(a), only arcs A3 and A4 would be selected as candidates for joining with A1.

In Fig. 6(b), we illustrate a case where all potential arc candidates satisfy the radius difference constraint, but two fail the center distance constraint. Here, A1 is the arc to be extended, with A2, A3, A4 and A5 as potential candidates for the arc join. As illustrated, the radii of all arcs are very close to each other, so they all satisfy the radius difference constraint. As for the center distance constraint, only A2 and A4 satisfy it, while A3's center and A5's center fall out of the center distance threshold rT. So in Fig. 6(b), only arcs A2 and A4 would be selected as candidates for joining with A1.

After the computation of the candidate arcs, the next step is to combine them one-by-one with the extended arc A1 by fitting a new ellipse to the pixels making up both of the arcs using the ellipse fitting algorithm in [65].
If the root mean square error is smaller than 1.5 pixels, the join succeeds; otherwise, it fails. Instead of trying the joins in random order, we start with the arc whose either end-point is the closest to either end-point of A1. The motivation for this decision comes from the observation that if there is more than one arc that is part of the same great ellipse, it is better to start the join with the arc closest to the extended arc A1. In Fig. 6(a) for example, we would first join A1 with A3 and then with A4. Similarly, in Fig. 6(b) we would first join A1 with A2 and then A4.

After an arc A1 is extended with other arcs on the same great ellipse, we decide at the last step whether to make the extended arc an ellipse candidate. Here, we take the view that if the extended arc spans at least 50% of the circumference of its great ellipse, then we make the arc an ellipse candidate. Otherwise, the arc is eliminated from further processing. In Fig. 6(a) for example, when A1 is joined with A3 and A4, the extended arc spans more than 50% of the circumference of its great ellipse, so the extended arc is made an ellipse candidate. In Fig. 6(c) however, when A1, A2 and A3 are joined together, we observe that the extended arc does not span at least 50% of the circumference of its great ellipse, i.e., theta_1 + theta_2 + theta_3 < pi. So the extended arc is not taken as an ellipse candidate. Computation of the total arc span is performed by simply looking at the ratio of the total number of pixels making up the joined arcs to the circumference of the newly fitted ellipse. If this ratio is greater than 50%, then the extended arc is taken as an ellipse candidate.

To exemplify the ideas presented above, here is how the remaining three arcs, A5, A6 and A7, in Fig. 4(a) and detailed in Table 1 would be processed: since A5 is the longest arc, we try extending A5 with A6 and A7 as the candidates. Since the radii of both A6 and A7 are within 50% of A5's radius and their center distances are within the center distance threshold, both arcs would be taken as candidates. We then try joining A5 and A6, since A6's end-point is closest to A5 (refer to Fig. 4(a)). Finally, A7 would be joined. Since the final extended arc covers more than 50% of its great ellipse, it is taken as an ellipse candidate.

Fig. 4(b) shows all circle and ellipse candidates for the image in Fig. 1(a), overlaid in red on top of the image. We have three circle and two ellipse candidates, of which the small circle, the small ellipse and the small polygon-like object became candidates at the first step of the algorithm due to their edge segments tracing their entire boundary as a closed curve; the big circle became a candidate at the fourth step by joining the arcs A1, A2, A3 and A4 into a perfect circle; and the occluded ellipse became a candidate at the fifth step by joining the arcs A5, A6 and A7 into an ellipse. All five candidates would now have to go through validation by the Helmholtz principle (described in the next section) before being returned as true positive detections.

2.6. Circle and ellipse validation by the Helmholtz principle

In the last two sections we described how EDCircles joins arcs into circle or ellipse candidates. Just because several arcs can be joined into circles and ellipses does not mean that all candidates are valid detections. So, as a final step, EDCircles employs a circle and ellipse validation algorithm to eliminate invalid detections and return only the valid circles and ellipses.

Before moving on to the circle and ellipse validation algorithm employed by EDCircles, we first need to define the Helmholtz principle, lay out the foundation for its a contrario validation framework, and describe how it is used to detect meaningful alignments, i.e., line segments, in a given image. We then adapt the line validation technique to circle and ellipse validation.

The Helmholtz principle simply states that for a geometric structure to be perceptually meaningful, the expectation of this structure (grouping or Gestalt) by chance must be very low in a random situation [59,60]. This is an a contrario approach, where the objects are detected as outliers of the background model. As shown by Desolneux et al. [60], a suitable background model is one in which all pixels are independent. They show that the simplest such model is Gaussian white noise. Stated in other words, no meaningful structure is perceptible in a Gaussian white noise image [60].

Desolneux et al. use the Helmholtz principle to find meaningful alignments, i.e., line segments, in a given image without requiring any parameters [61]. Their idea is to compute the level line orientation field (which is orthogonal to the gradient orientation field) of a given image, and look for a contiguous set of pixels having similar level line orientation. Fig. 7(a) shows the level line orientation field for an image, where the aligned pixels that make up a line segment are marked inside a rectangle. The authors define what aligned means as follows: two points (or line segments) P and Q have the same direction, i.e., are aligned, with precision p = 1/n if angle(P) and angle(Q) are within pi*p = pi/n of each other. Desolneux et al. state that "in agreement with psychophysics [68] and numerical experimentation, the realistic values of n range from 32 to 4 and it is, in general, useless to consider larger values of n" [60]. In EDCircles, we fix the value of n to 8; thus p, the precision or the accuracy of direction between two pixels, is equal to p = 1/8 = 0.125, and two points are aligned (or p-aligned) if their angles are within pi*p = pi/8 = 22.5 degrees of each other.

Fig. 7. (a) Level line orientation field (orthogonal to the gradient orientation) of an image. Aligned pixels (inside the rectangle) clearly make up a line segment. (b) Another level line orientation field. The circular structure (inside the two circles) is clearly visible. (c) Illustration of several p-aligned and not p-aligned gradients. Observe that aligned gradients are perpendicular to the level-line angle and are inside the tolerance cone of 2*pi*p.

The gradient magnitude and the level line angle at a pixel (x, y) are computed using a 2 x 2 mask as follows [61]:
g_x(x,y) = \frac{I(x+1,y) - I(x,y) + I(x+1,y+1) - I(x,y+1)}{2}    (1)

g_y(x,y) = \frac{I(x,y+1) - I(x,y) + I(x+1,y+1) - I(x+1,y)}{2}    (2)

g(x,y) = \sqrt{g_x(x,y)^2 + g_y(x,y)^2}    (3)

angle(x,y) = \arctan\left(\frac{g_x(x,y)}{g_y(x,y)}\right)    (4)

where I(x, y) is the intensity of the input image at pixel (x, y), g(x, y) is the gradient magnitude, and angle(x, y) is the level line angle. The authors in [61] state that the reason for using such a simple gradient operator is to reduce the dependency of the computed gradients and, thus, preserve pixel independence as much as possible.
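Equations (1)-(4) translate directly into a couple of vectorized array operations. The sketch below is an illustration only, with the image indexed as image[y, x] and no special border handling; it computes the fields over the valid (H-1) x (W-1) grid.

```python
import numpy as np

def level_line_field(image):
    """Gradient magnitude and level line angle from the 2x2 mask of Eqs. (1)-(4).

    Returns (g, angle) over the top-left (H-1) x (W-1) part of the image; the
    level line angle is orthogonal to the gradient direction.
    """
    img = np.asarray(image, dtype=float)
    gx = (img[:-1, 1:] - img[:-1, :-1] + img[1:, 1:] - img[1:, :-1]) / 2.0
    gy = (img[1:, :-1] - img[:-1, :-1] + img[1:, 1:] - img[:-1, 1:]) / 2.0
    g = np.hypot(gx, gy)
    angle = np.arctan2(gx, gy)      # arctan(gx / gy), as in Eq. (4)
    return g, angle
```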
To make validation by the Helmholtz principle concrete, Desolneux et al. define what is called the number of false alarms (NFA) of a line segment as follows [61]: let L be a line segment of length n with at least k points having their directions aligned with the direction of L in an image of size N x N pixels. Define the NFA of L as [62,63]

NFA(n,k) = N^4 \sum_{i=k}^{n} \binom{n}{i} p^i (1-p)^{n-i}    (5)

where N^4 represents the number of potential line segments in an N x N image. This is due to the fact that a line segment has two end-points, each of which can be located in any of the N^2 pixels of the image; thus, there are a total of N^2 x N^2 = N^4 line segments. The probability p used in the computation of the binomial tail is the accuracy of the alignment between the pixel's level line angle and the line segment.

An event (a line segment in this case) is called epsilon-meaningful if its NFA(n,k) <= epsilon. Desolneux et al. [59-61] advise setting epsilon to 1, which corresponds to one false detection per image. Given these definitions, a line segment is validated as follows: for a line segment of length n, compute the level line angle of each pixel along the line and count the number of aligned pixels k. Then, compute NFA(n,k) and accept the line segment as valid if NFA(n,k) <= 1. Otherwise, reject the line segment.

Desolneux et al.'s line segment validation framework outlined above has successfully been used by two recent line segment detectors, namely LSD by Grompone von Gioi et al. [62,63] and EDLines by Akinlar et al. [56,57], to validate the detected line segments and eliminate false detections.

In this paper, we adapt Desolneux et al.'s line segment validation framework to circle and ellipse detection. The idea is very simple: just like a contiguous set of pixels having level line angles aligned with a line segment makes up a line segment, a contiguous set of pixels having level line angles aligned with a circle makes up a circle. Fig. 7(b) shows the level line orientation field of an image, where the aligned pixels that make up a circle are marked inside two circles. To define alignment between a pixel and a circle, we simply adapt the alignment definition between a pixel and a line segment: a pixel P on the boundary of a circle is aligned with the circle if P is aligned with the tangent to the circle at P. Recall from above that a point P and a line segment L are aligned with precision p if angle(P) and angle(L) are within pi*p of each other. Assuming that the tangent line of a circle at a given point P is T, we can simply use this definition for alignment between a point and a circle; that is, angle(P) and angle(T) have to be within pi*p of each other if P and the circle are aligned.

Fig. 7(c) shows the gradient directions (perpendicular to the level line angles) of several points on the boundary of a circle, some of which are p-aligned with the circle and some of which are not. The gray triangle illustrates the tolerance cone between the ideal gradient direction and the observed gradient direction. If the observed gradient direction is inside the cone, then the point is assumed to be aligned with the circle; otherwise the point is assumed to be non-aligned with the circle.

Instead of treating circles and ellipses as two different gestalts to be validated, we formulate our problem simply as an ellipse validation problem, since a circle is essentially an ellipse whose major and minor axis lengths are equal. With this in mind, we adapt the definition of the number of false alarms (NFA) of a line segment to an ellipse as follows: let E be an ellipse having a circumference of n points with at least k points having their directions aligned with E in an image of size N x N pixels. Define the NFA of E as

NFA(n,k) = N^5 \sum_{i=k}^{n} \binom{n}{i} p^i (1-p)^{n-i}    (6)

where N^5 represents the total number of potential ellipses in an N x N image. This is due to the fact that an ellipse has five degrees of freedom, i.e., its center coordinates, the major axis, the minor axis and the orientation; hence, a total of N^5 ellipses.

Given this NFA definition, we validate a circle/ellipse (circle for short) as follows: for a circle having length n, compute the level line angle of each pixel along the circle and count the number of aligned pixels k. Compute NFA(n,k) and accept the circle as valid if NFA(n,k) <= epsilon; otherwise the circle is rejected. It is important to point out that we perform the gradient computation in the validation step over the original non-filtered image, as required by the a contrario framework.

The epsilon in the above comparison denotes the expected number of detections under the background model. In other words, if a Gaussian white noise image is fed into the algorithm, we should get at most epsilon detections. We set epsilon to 1, as advised by Desolneux et al. [59,60], which corresponds to one false detection per image.

While the observed level line angle is computed by the 2 x 2 mask given above, computing the ideal pixel angle at a point on the circumference of a circle/ellipse requires the computation of the tangent line at that point. If the level line angle and the tangent line angle are within p * 180 degrees = 0.125 * 180 degrees = 22.5 degrees of each other, then they are aligned; otherwise they are not.
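For concreteness, the binomial tail of Eqs. (5) and (6) and the epsilon = 1 decision can be evaluated in log space. The following sketch is an illustration, not the authors' implementation; the image is assumed square with side image_size, and dof selects between the N^4 (line segment) and N^5 (ellipse) number of tests.

```python
import math

def log10_nfa(n, k, p=0.125, dof=5, image_size=512):
    """log10 of NFA(n, k) = N^dof * sum_{i=k}^{n} C(n, i) p^i (1 - p)^(n - i).

    dof = 4 gives the line segment NFA of Eq. (5), dof = 5 the ellipse NFA of
    Eq. (6); p = 1/8 is the alignment precision used in EDCircles. Requires k <= n.
    """
    log_p, log_q = math.log(p), math.log(1.0 - p)

    def log_term(i):
        # log of C(n, i) p^i (1 - p)^(n - i), via log-gamma for stability
        return (math.lgamma(n + 1) - math.lgamma(i + 1) - math.lgamma(n - i + 1)
                + i * log_p + (n - i) * log_q)

    logs = [log_term(i) for i in range(k, n + 1)]
    m = max(logs)
    log_tail = m + math.log(sum(math.exp(v - m) for v in logs))   # log-sum-exp
    return dof * math.log10(image_size) + log_tail / math.log(10.0)

def is_valid_candidate(n, k, eps=1.0, **kw):
    """Accept the circle/ellipse when NFA(n, k) <= eps (eps = 1 in the paper)."""
    return log10_nfa(n, k, **kw) <= math.log10(eps)
```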
2.7. Complexity of EDCircles

Given an N x N image, the first step of EDCircles is the edge segment detection by EDPF, which is an O(N^2) operation. The detected edge segments are then converted to line segments, which is a linear time operation on the number of edge pixels. Assuming we have L line segments, conversion of the line segments into arcs is again a linear time operation on the number of line segments; that is, O(L). If we now have A arcs, the rest of the algorithm first involves sorting the arcs with respect to their length, which can be done in O(A log A), and then taking the longest remaining arc from the sorted list, finding the candidate arcs in O(A), and joining the candidate arcs with the target arc using the greedy heuristic. This is a quadratic operation on the number of arcs; that is, O(A^2).

3. Experiments

To measure the performance of EDCircles [54,55], we take both synthetic and natural images containing circular objects and feed them into our algorithm. We then show the detected circles and the running time of EDCircles. We stress that EDCircles is parameter-free in the sense that it has one set of internal parameters used for all images presented in this paper and on the EDCircles demo Web site at http://ceng.anadolu.edu.tr/CV/EDCircles/Demo.aspx.

Fig. 8 shows the performance of EDCircles and cvHoughCircles on five synthetic and natural images containing circular objects.
Fig. 8. Circle detection results by EDCircles (4th column), and OpenCV cvHoughCircles (last column). (For interpretation of the references to color in this figure caption,
the reader is referred to the web version of this article.)
The running times were measured on a PC with a 2.2 GHz Intel 2670QM CPU. The first column shows the original image, and the second column shows the edge segments detected by EDPF. In the third column, the candidate circles and ellipses (circles for short in the rest of this paper) are overlaid on top of the image in red. Notice that there are many false candidate circles, especially visible in the eye and traffic sign images. The fourth column shows the final result output by EDCircles. Notice that the circle validation algorithm due to the Helmholtz principle eliminates all invalid circle candidates, leaving only valid circle detections. For example, there are 29 candidate circles in Image4, 17 of which are false detections; therefore, EDCircles outputs only 12 valid circles. Notice that the detected circles are of various sizes, from very small to large.

The fifth column in Fig. 8 shows the circles detected by OpenCV's Circle Hough Transform (CHT)-based circle detection algorithm, cvHoughCircles. The reason for choosing OpenCV [66] for comparison is that OpenCV's implementations of generic image processing and computer vision algorithms are known to be the fastest in the literature and are widely used, and that OpenCV is open source, so anyone can repeat the same results. We note that cvHoughCircles has many parameters (similar to many other circle detection algorithms in the literature) that must be pre-supplied by the user for each image. To obtain the results by cvHoughCircles, we first smoothed the images with a 5 x 5 Gaussian kernel with sigma = 1.0. We then tried many different parameters and present the best results in Fig. 8. Table 3 lists the parameters used to obtain cvHoughCircles' results shown in Fig. 8. Notice from Table 3 that cvHoughCircles requires a different set of parameters to obtain the best results for each image. With a single set of default parameters, cvHoughCircles either fails to detect many valid circles or produces a lot more false detections, which are not shown in this paper.
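For reference, the cvHoughCircles runs summarized in Table 3 correspond, in the current OpenCV Python API, to calls of the following form. The file name is a placeholder, the parameter values shown are the Table 3 row for Image1, and the paper used the older C interface, so this is only an approximate modern equivalent.

```python
import cv2

img = cv2.imread("image1.png", cv2.IMREAD_GRAYSCALE)   # placeholder file name
img = cv2.GaussianBlur(img, (5, 5), 1.0)                # 5x5 Gaussian, sigma = 1.0

# dp, minDist, param1 (Canny high threshold) and param2 (accumulator threshold)
# map onto the columns of Table 3.
circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=16,
                           param1=120, param2=140)
```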
Table 2. Dissection of EDCircles's running time and its comparison to cvHoughCircles.

Image (width x height)   Edge segment detection by EDPF (ms)   Circle detection and validation (ms)   Total (ms)   cvHoughCircles (ms)   Speedup

Table 3. cvHoughCircles's parameters used for each image in Fig. 8.

Image    dp   Minimum distance between circles   Canny high threshold   Accumulator threshold
Image1   1    16                                 120                    140
Image2   1    8                                  80                     100
Image3   1    2                                  100                    120
Image4   1    2                                  120                    80
Image5   1    2                                  140                    90

It is clear from Fig. 8 that EDCircles detects most valid circles in real-time, whereas cvHoughCircles not only fails in detecting many valid circles but also produces many false detections. This is more evident in Image1, where cvHoughCircles takes close to 1 s to execute and outputs 71 circles, many of which are false detections. We stress once again that EDCircles is parameter-free, i.e., it uses the same internal parameters for all images; whereas, to get the best results with cvHoughCircles, the user has to supply a different set of parameters for each image by trial-and-error.

The last row, i.e., Image5, in Fig. 8 requires special attention, as it shows a circular traffic sign from the Swedish Traffic Sign database [67]. Due to the viewpoint of the camera, this circular sign appears as an ellipse rather than a circle. Notice from Fig. 8 that EDCircles detects this elliptic circle correctly; however, a traditional circle detector such as cvHoughCircles detects three different circles, none of which correctly represents the traffic sign.

Table 2 dissects the running time of EDCircles for each image in Fig. 8 and compares it with the running time of cvHoughCircles. Clearly, EDCircles runs in real-time for typical camera input sizes of 640 x 480, with much of the time spent on edge segment detection by EDPF rather than circle detection and validation. Also note that EDCircles is up to 14 times faster than cvHoughCircles, which is probably the fastest implementation of the classical CHT algorithm.

We next compare EDCircles to a randomized circle detection (RCD) algorithm first proposed in [26] and recently improved in [27,28], where the authors discuss different RCD variants and present the results for the variant called GRCD-R. Unfortunately, there are no publicly available implementations of this algorithm. So we take the test images from the authors' original paper along with their results [28], feed the test images into EDCircles and compare the results.

Figs. 9 and 10 show the 10 test images, the results produced by GRCD-R taken from [28], and the results produced by EDCircles. Chung et al. state that the running times for GRCD-R's results were obtained on a 3 GHz Intel E8400 CPU [28]. To make a fair running time comparison between GRCD-R and EDCircles, we run EDCircles on a 2.2 GHz Intel E4500 CPU, which is also a Core 2 Duo CPU similar to the E8400, but has a slower clock speed and lower CPU rating according to CPU benchmarks.

It is clear from the results in Figs. 9 and 10 that EDCircles detects most valid circles except for some light reflections in the 'gobang' and 'stability-ball' images and the central ball in the 'gobang' image, and runs 5 times faster than GRCD-R on average. Refer to Table 4 for a side-by-side comparison of the execution times of GRCD-R and EDCircles for the 10 images in Figs. 9 and 10. It is important to note that the execution times for GRCD-R include just the circle detection after edge detection, whereas the execution times for EDCircles include both the edge segment detection by EDPF and the following circle detection and validation. Considering that at least half of the running time of EDCircles is spent on edge segment detection and that EDCircles was run on a slower CPU, the actual performance of EDCircles compared to GRCD-R is better than reflected in Table 4. Also notice that EDCircles detects circles of various sizes in an image, while GRCD-R fails in detecting many valid circles. This is more evident in the 'logo' and 'speaker' images of Fig. 10, where GRCD-R detects only the big circles, while EDCircles detects many circles with both small and large radii.

One disadvantage of EDCircles is multiple circle detections around circles having fuzzy boundaries. This is more evident in the 'coin' and 'cake' images of Fig. 9, where EDCircles detects two circles around two different coins, and two circles around three different circular objects. Notice that those objects have shadows around their boundaries, which are detected by EDPF and turned into circles by EDCircles. The reason GRCD-R does not detect multiple circles around the same objects is probably that the edge maps used by GRCD-R were cleaner and did not include these shadows as edges. Since we do not know about the edge maps used in GRCD-R's circle computations, we are not in a position to make a fair comparison between GRCD-R and EDCircles in this sense.

To evaluate the accuracy of EDCircles, we run the traditional circular Hough transform (CHT) [12] on the 10 test images shown in Figs. 9 and 10 and obtain the ground truth results. To make the CHT results as accurate as possible, we set the granularity of the x-coordinate, y-coordinate and radius to one pixel precision. Table 5 shows the average differences between the parameters of each circle detected by CHT and those detected by GRCD-R and EDCircles for each of the test images. As seen from the results, EDCircles has sub-pixel accuracy, although it does not produce as accurate results as GRCD-R.

In the next experiment, we measure the performance of EDCircles in detecting near-circular elliptic objects. As we pointed out before, depending on the viewpoint of the camera, circular objects may appear slightly elliptical in the image. A good circle detector should be able to take care of such cases.

Fig. 11 shows the performance of EDCircles in detecting near-circular elliptic objects in four images. Two of the images are from a pupil detection application; the other two are from the Swedish Traffic Sign Database [67] and contain several circular traffic signs. Notice that all circular objects, which appear slightly elliptical in the images due to the viewpoint of the camera, have been detected with no false detections. We would like to stress, though, that EDCircles is not a general ellipse detection algorithm and is not intended to be one. Rather, EDCircles tries to detect circular objects which look slightly elliptical due to the viewpoint of the camera.
Fig. 9. (1st row) Five test images, (2nd row) circle detection results by GRCD-R, and (3rd row) circle detection results by EDCircles. The test images, and GRCD-R’s circle
detection results and running times were taken from the authors’ original paper [28]. The running times for GRCD-R were obtained in a 3 GHz Intel E8400 CPU and include
just the circle detection after edge detection, whereas the running times for EDCircles were obtained in a 2.2 GHz Intel E4500 CPU and include both the edge segment
detection and the circle detection. Since E4500 is a slower CPU than E8400, EDCircles’s performance compared to GRCD-R is better than reflected here.
Informally, we can say that EDCircles tries to detect ellipses where an arc from the inner circle. The images in the first and second
the major axis is at most twice as big as the minor axis. This is due column of Fig. 12 illustrate another weak point of EDCircles; the
to the arc join heuristic algorithm outlined in Section 2.5, which only failure in detection of small divided circles. Notice that in both
joins two arcs for an ellipse if the arcs’ radii differ by at most 50% images EDCircles fails in detecting the small circles at the center of
from each other. If a circular object appears way too elliptic due to the images. This has to do with EDCircle’s arc generation strategy.
the viewpoint of the camera, EDCircles would not be able to detect Recall from Section 2.3 that an arc is generated with at least three
it. Arbitrary ellipse detections shown on EDCircles Web site [55] are lines that turn in the same direction and satisfy the angle con-
mostly due to the entire boundary of the ellipse being detected as a straints, which is generally not satisfied in small circles and the
single closed edge segment. detection fails. The third column in Fig. 12 shows three blood cells
Fig. 12 examines several failure cases for EDCircles. The first column shows a set of tightly concentric circles divided into eight quarters, which challenges EDCircles's arc join heuristics. As seen from the results, in several cases EDCircles incorrectly joins an arc from an outer circle with an arc from an inner circle, resulting in circles that span half of an outer circle and half of an inner circle. The reason for this has to do with the greedy join heuristic employed by EDCircles. Recall from Section 2.4 that while EDCircles tries to extend an arc, it generates a set of candidate arcs and joins the currently extended arc with the candidate arc whose endpoint is the closest. If the join succeeds, the arcs are immediately joined, and the joined arcs are never separated afterwards. This is a greedy heuristic and is chosen for performance reasons.
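A minimal sketch of this greedy selection is given below (Python). The arc representation and the try_join check are placeholders for illustration only, and the behaviour when the closest candidate is rejected is our assumption rather than a statement about the actual implementation.

import math

def endpoint_distance(a, b):
    """Smallest distance between an endpoint of arc a and an endpoint of arc b.
    Arcs are assumed to expose .endpoints, a list of (x, y) tuples."""
    return min(math.dist(p, q) for p in a.endpoints for q in b.endpoints)

def greedy_extend(current, candidates, try_join):
    """Always join with the candidate whose endpoint is closest; accepted
    joins are permanent and never undone (greedy, chosen for speed)."""
    while candidates:
        nearest = min(candidates, key=lambda c: endpoint_distance(current, c))
        joined = try_join(current, nearest)     # placeholder fit/validation check
        if joined is None:
            break                               # assumption: stop when the join fails
        current = joined
        candidates.remove(nearest)
    return current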
Although this heuristic results in good circle detections in many cases, it fails especially if there are tightly coupled concentric circles divided into two or more regions, as in Fig. 12. A similar problem can also be observed in the natural image of 'Drain Cover', where EDCircles again joins an arc from an outer circle with an arc from the inner circle. The images in the first and second columns of Fig. 12 illustrate another weak point of EDCircles: the failure to detect small divided circles. Notice that in both images EDCircles fails to detect the small circles at the center of the images. This has to do with EDCircles's arc generation strategy. Recall from Section 2.3 that an arc is generated with at least three lines that turn in the same direction and satisfy the angle constraints, which is generally not satisfied in small circles, and the detection fails. The third column in Fig. 12 shows three blood cells with fuzzy boundaries, where detection either fails or produces imperfect results. With objects having fuzzy boundaries, EDPF, the edge segment detector employed by EDCircles, produces ragged edge segments, which cannot be turned into arcs, and the detection fails. The last column in Fig. 12 shows a spiral and the circles detected by EDCircles. As the spiral turns, EDCircles generates many arcs, which are combined together into the circles shown in the figure. We again see arcs from an inner circle being joined with arcs from an outer circle. The validation due to the Helmholtz principle is not able to eliminate these false detections either.

Our last experiment is to measure the performance of EDCircles in noisy images. To this end, we take an image containing several small and big circles, add varying levels of Gaussian white noise to the image, and feed the images to EDCircles.

Fig. 13 demonstrates the performance of EDCircles as the amount of Gaussian white noise is increased over a sample image containing a big circle and several small circles. Notice that with increasing noise, the detection of the small circles starts to fail; at a Gaussian noise level of σ = 60, only the big circle is detected.
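This experiment is easy to reproduce along the following lines. The sketch below uses Python with NumPy/OpenCV; the input file name is hypothetical, and the circular Hough transform is used only as a stand-in detector, since EDCircles itself is not an OpenCV function.

import cv2
import numpy as np

def add_gaussian_noise(gray, sigma):
    """Add zero-mean Gaussian white noise with standard deviation sigma."""
    noisy = gray.astype(np.float64) + np.random.normal(0.0, sigma, gray.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

gray = cv2.imread("circles.png", cv2.IMREAD_GRAYSCALE)   # hypothetical test image
for sigma in (0, 20, 40, 60):
    noisy = add_gaussian_noise(gray, sigma)
    # Circular Hough transform as a stand-in circle detector.
    found = cv2.HoughCircles(noisy, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                             param1=100, param2=50, minRadius=5, maxRadius=0)
    count = 0 if found is None else found.shape[1]
    print(f"sigma = {sigma}: {count} circles detected")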
Fig. 10. (1st row) Five test images, (2nd row) circle detection results by GRCD-R, and (3rd row) circle detection results by EDCircles. The test images and GRCD-R's circle detection results and running times were taken from the authors' original paper [28]. The running times for GRCD-R were obtained on a 3 GHz Intel E8400 CPU and include just the circle detection after edge detection, whereas the running times for EDCircles were obtained on a 2.2 GHz Intel E4500 CPU and include both the edge segment detection and the circle detection. Since the E4500 is a slower CPU than the E8400, EDCircles's performance relative to GRCD-R is better than reflected here.
Table 4
Execution-time performance comparison between GRCD-R and EDCircles in terms of milliseconds for the 10 images in Figs. 9 and 10. The execution times for GRCD-R include just the circle detection after edge detection, whereas the execution times for EDCircles include both the edge segment detection by EDPF and the following circle detection and validation.

Table 5
Average differences between the parameters of each circle detected by circular Hough transform (CHT) and those detected by GRCD-R and EDCircles for the 10 test images in Figs. 9 and 10.
Increasing the noise further causes complete detection failure, which is not shown in the figure. The reason for the detection failure comes from the fact that, as the noise is increased, the boundaries of the circles are approximated by many short edge segments instead of a few long edge segments, as would be the case in less noisy images. This means that the arcs approximating the boundary of a circle are short and cannot be joined together to make up the circle. Notice, though, that there are no false detections in any of the images, which is also very important. We also observe that the running time of EDCircles increases in noisy images. The reason for this is the increased edge segment detection time by EDPF.
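The fragmentation effect described above can be observed directly with any edge detector. In the Python/OpenCV sketch below, Canny merely stands in for EDPF, and the file names are hypothetical.

import cv2

def count_edge_fragments(gray):
    """Count connected edge fragments in a binary edge map (Canny as a stand-in)."""
    edges = cv2.Canny(gray, 50, 150)
    n_labels, _ = cv2.connectedComponents(edges)
    return n_labels - 1                      # subtract the background label

clean = cv2.imread("circles.png", cv2.IMREAD_GRAYSCALE)          # hypothetical
noisy = cv2.imread("circles_sigma60.png", cv2.IMREAD_GRAYSCALE)  # hypothetical
print(count_edge_fragments(clean), count_edge_fragments(noisy))
# The noisy image typically yields many more, much shorter fragments.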
Fig. 11. The performance of EDCircles in detecting near-circular elliptic objects in four images. Notice that all objects have been detected successfully in real-time without
any false detections.
As the amount of noise increases in an image, EDPF takes more time to compute the edge segments because many pixels start becoming potential edge elements. Circle detection after edge segment detection remains fairly constant across all images.
Fig. 14 shows the results of analyzing a very noisy image (containing Gaussian noise with σ = 100) at different scales. At full scale, no structures are detected, although they are visible to a human observer. This is because the human visual system performs a multiscale analysis and is able to detect the structures inside strong noise. Running EDCircles at half and quarter resolutions, we see that several circles are now detected. We conclude that when strong noise is present, a denoising or zoom-in step would help in the detection of many valid circles by EDCircles.
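The multiscale workaround mentioned above amounts to downscaling the image before detection and mapping the results back to full resolution, roughly as in the Python/OpenCV sketch below. The detector argument is any callable returning (cx, cy, r) triples; it is a placeholder, since EDCircles itself is not part of OpenCV.

import cv2

def detect_at_scales(gray, detector, scales=(1.0, 0.5, 0.25)):
    """Run a circle detector at several resolutions and map detections
    back to full-resolution coordinates. INTER_AREA downsampling also
    averages out part of the noise."""
    results = []
    for s in scales:
        img = gray if s == 1.0 else cv2.resize(gray, None, fx=s, fy=s,
                                               interpolation=cv2.INTER_AREA)
        for (cx, cy, r) in detector(img):
            results.append((cx / s, cy / s, r / s))   # rescale to original image
    return results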
Fig. 13. The performance of EDCircles as the amount of Gaussian white noise is increased up to σ = 60. Notice that with increasing noise, the detection of the small circles starts to fail; at a noise level of σ = 60, only the big circle is detected. Increasing the noise further causes complete detection failure, which is not shown in the figure.
Fig. 14. The results of analyzing a very noisy image (containing Gaussian noise with σ = 100) at different scales. At full scale, no structure is detected, but no false detections are made either. At 1/2 resolution, the big circle is detected. At 1/4 resolution, three circles are detected.
4. Conclusions

EDCircles is a real-time, parameter-free circle detection algorithm that works by post-processing the edge segments detected by our parameter-free edge segment detector, edge drawing parameter free (EDPF). EDCircles first converts each edge segment, which is a contiguous chain of pixels, into a set of line segments. The line segments are then converted into circular arcs, which are joined together using two heuristic algorithms into candidate circles and near-circular ellipses. At the final step, all candidate circles are validated using the Helmholtz principle, which eliminates false detections, leaving only valid circle detections. Experiments shown in this paper (and others that the reader may wish to perform online at http://ceng.anadolu.edu.tr/CV/EDCircles/Demo.aspx) show that EDCircles has high detection rates, good accuracy, runs at very high speed, and produces only a few or no false detections. We expect EDCircles to be widely used in such real-time automation tasks as the automatic inspection of manufactured products, pupil detection, circular traffic sign detection, and similar applications.

Acknowledgments

We are deeply indebted to the anonymous reviewers for their insightful comments, which greatly helped shape this paper for the better. We also thank the Scientific and Technological Research Council of Turkey (TUBITAK) for supporting this work with Grant no. 111E053.
Cuneyt Akinlar received his B.Sc. degree in Computer Engineering from Bilkent University in 1994, and his M.Sc. and Ph.D. degrees in Computer Science from the University of Maryland, College Park in 1997 and 2001, respectively. He is currently an Assistant Professor at Anadolu University, Computer Engineering Department (AU-CENG). Before joining AU-CENG, he worked at Panasonic Technologies, Princeton, NJ, and Siemens Corporate Research, Princeton, NJ, as a student intern and research scientist between 1999 and 2003. His current research interests include real-time image processing and computer vision algorithms, high-performance computing, storage systems and computer networks.
Cihan Topal received the B.Sc. degree in Electrical Engineering and the M.S. degree in Computer Engineering, both from Anadolu University, in 2005 and 2008, respectively. He worked as a student intern at Siemens Corporate Research, Princeton, New Jersey. He is currently working towards his Ph.D. degree at the Computer Engineering Department of Anadolu University, Turkey. His research interests are in the areas of image processing, computer vision applications, and pattern recognition.