Robust Skeletonization of Hand Written Craft Motives of “Zellij”
Using Racah Moments
Khalid Fardousse, Annass El Affar, Hassan Qjidaa and Abdeljabar Cherkaoui
Université Sidi Mohamed Ben Abdellah, Faculté des Sciences Dhar El Mehraz, Fès
LESSI, BP 1796, Fez, MOROCCO
Abstract
In this paper, we propose a novel approach to robust skeletonization. The proposed statistical method is based on the estimation of the probability density function (p.d.f.) using Racah moment theory controlled by the Maximum Entropy Principle (MEP). The proposed Racah Moment Skeletonization Method (RMSM) is valid for gray-level craft motives. A detailed analysis of the skeletonization process is presented to show its superior performance in terms of noise immunity.
Keywords
Skeletonization, Racah moment, p.d.f., MEP, Gray level images,
Noise immunity, handwritten Craft Motives.
1. Introduction
Skeletonization has been a part of image processing for a wide variety of applications [1]. The usefulness of reducing patterns to a thin-line representation can be attributed to the need to process a reduced amount of data, as well as to the fact that shape analysis can be made more easily on thin-line patterns. The thin-line representation of certain elongated patterns, like craft motives, is closer to the human perception of these patterns; it therefore permits a simpler structural analysis and a more intuitive design of recognition algorithms. Since the first study by Blum [2], the skeletonization of shapes has attracted attention from many researchers in various fields.
Commonly used computational methods for skeleton
extraction include topological thinning [1], [3-4],
approaches based on distance transform [5], [6], hierarchical
methods based on Voronoi diagrams [7], voxel coding
based methods [8], some approaches based on physical
simulations [9] and principal curves based methods [10].
However, most of these skeletonization techniques present two major drawbacks. The first is their high sensitivity to noise: these skeletonization algorithms are inefficient in the presence of noise in the processed images. The second is that most skeletonization approaches proceed by binarization of the input images, which discards the gray-level information. In fact, binarization of gray-level images may remove important topological information from the craft motive, which, as a result, leads to inaccurate skeletonization of the original images. The few algorithms using gray-level information
in skeletonization are summarized in Verwer's survey [11].
Some previous efforts on this topic were made by Levi and Montanari [12]. Recently, an important contribution considering the skeletonization of gray-level images was reported in [13] and [14], where the authors proposed a Skeleton Growing (SG) approach that addresses the "flooding water" problem. However, this method uses information from two sources, the original gray-level image and the binary images resulting from the binarization operation, which involves several intermediate steps due to the use of a sequence of binarization thresholds.
In this paper, the skeletonization approach is developed using a statistical method based on the estimation of a probability density function (p.d.f.), where the skeleton is defined as the set of local maxima of this p.d.f. The main goal of our work is to compute the skeleton directly from an image, without any a priori information about the image and without intermediate steps (binarization, filtering).
Our proposed approach is based on the expansion of the multivariate p.d.f. in terms of Racah polynomials by means of Racah moments [15], [16]. For this purpose, the p.d.f. is approximated by a truncated series of polynomials. As the determination of the expansion order is a difficult problem [17], [18], we propose to estimate the p.d.f. for different orders and to select as optimal the one for which the entropy reaches a maximum, according to the Maximum Entropy Principle (MEP) [17-20]. Having the optimal p.d.f., the true points of the skeleton are its local maxima.
As a summary, our proposed RMSM skeletonization method, based on the combination of moment theory and the MEP as a selection criterion, is composed of the three following steps:
1. Computation of the p.d.f. using Racah moments.
2. Estimation of the optimal p.d.f. using the MEP method.
3. Extraction of the local maxima of the optimal p.d.f., taken as the skeleton points.
The most important advantages of our method are the following:
• No a priori information about the original image and no intermediate steps are required.
• High robustness against noisy images.
• Applicability to binary as well as gray-level images.
• Elimination of the "flooding water" problem [13], [14].
The paper is organized as follows: the next section describes the basis of our statistical model using Racah moments. The maximum entropy principle is presented in Section 3. The details of our skeletonization algorithm are given in Section 4. Section 5 presents the main results and performance of our skeletonization method. Finally, Section 6 summarizes the important results and conclusions of this work.
2. Statistical Modeling using Racah Moments
Moment functions have been used as shape descriptors
in a variety of applications in image analysis, like visual
pattern recognition [21], object classification [22], template
matching [23], edge detection [24], robot vision [25], and
data compression [26]. In all these applications, geometric
moments and their extensions in the form of radial and
complex moments have played important roles in
characterizing the image shape, and in extracting features
that are invariant with respect to image plane
transformations. Teague [27] introduced moments with
orthogonal basis functions, with the additional property of
minimal information redundancy in a moment set. More
recently, an important and significant work considering
moments for pattern reconstruction was performed. In this
study, the error analysis and characterization of Legendre
moments descriptors have been investigated, where several
new techniques to increase the accuracy and the efficiency
of the moments are proposed. Based on these improvements,
In this section, Racah moments are defined and their
properties.
A. Racah Moment
The (n+m)th-order Racah moment of an image f(s,t) of size N×N is defined as [16]

U_{nm} = \sum_{s=a}^{b-1} \sum_{t=a}^{b-1} \hat{u}_n^{(\alpha,\beta)}(s,a,b)\, \hat{u}_m^{(\alpha,\beta)}(t,a,b)\, f(s,t),   n, m = 0, 1, ..., L-1        (1)

the set of weighted Racah polynomials being defined as

\hat{u}_n^{(\alpha,\beta)}(s,a,b) = u_n^{(\alpha,\beta)}(s,a,b)\, \sqrt{\frac{\rho(s)}{d_n^2}\, \Delta x\!\left(s - \tfrac{1}{2}\right)},   n = 0, 1, ..., L-1        (2)

where the parameters a, b, α and β are restricted to -1/2 < a < b, α > -1, -1 < β < 2a + 1, b = a + N. The weight function ρ(s) and the squared norm d_n^2 are given by

\rho(s) = \frac{\Gamma(a+s+1)\,\Gamma(s-a+\beta+1)\,\Gamma(b+\alpha-s)\,\Gamma(b+\alpha+s+1)}{\Gamma(a-\beta+s+1)\,\Gamma(s-a+1)\,\Gamma(b-s)\,\Gamma(b+s+1)}        (3)

d_n^2 = \frac{\Gamma(\alpha+n+1)\,\Gamma(\beta+n+1)\,\Gamma(b-a+\alpha+\beta+n+1)\,\Gamma(a+b+\alpha+n+1)}{(\alpha+\beta+2n+1)\, n!\, (b-a-n-1)!\, \Gamma(\alpha+\beta+n+1)\,\Gamma(a+b-\beta-n)},   n = 0, 1, ..., L-1        (4)

and the Racah polynomials themselves are

u_n^{(\alpha,\beta)}(s,a,b) = \frac{1}{n!}\,(a-b+1)_n\,(\beta+1)_n\,(a+b+\alpha+1)_n \times {}_4F_3\!\left(\begin{matrix} -n,\ \alpha+\beta+n+1,\ a-s,\ a+s+1 \\ \beta+1,\ a+1-b,\ a+b+\alpha+1 \end{matrix};\, 1\right),
n = 0, 1, ..., L-1;   s = a, a+1, ..., b-1        (5)

The generalized hypergeometric function {}_4F_3(\cdot) is given by

{}_4F_3(a_1, a_2, a_3, a_4;\, b_1, b_2, b_3;\, z) = \sum_{k=0}^{\infty} \frac{(a_1)_k (a_2)_k (a_3)_k (a_4)_k}{(b_1)_k (b_2)_k (b_3)_k}\, \frac{z^k}{k!}        (6)
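For readers who wish to experiment with these definitions, the following Python sketch (not part of the original paper) evaluates the weight ρ(s), the squared norm d_n^2 and the Racah polynomial u_n^(α,β)(s,a,b) of Eqs. (3)-(6) directly from their Gamma-function and Pochhammer forms. Log-Gamma is used to avoid overflow, and the 4F3 series is summed only up to k = n, since (-n)_k vanishes beyond that point.

import math

def poch(x, k):
    """Pochhammer symbol (x)_k = x (x + 1) ... (x + k - 1)."""
    out = 1.0
    for i in range(k):
        out *= x + i
    return out

def rho(s, a, b, alpha, beta):
    """Weight function rho(s) of Eq. (3), computed via log-Gamma."""
    num = (math.lgamma(a + s + 1) + math.lgamma(s - a + beta + 1)
           + math.lgamma(b + alpha - s) + math.lgamma(b + alpha + s + 1))
    den = (math.lgamma(a - beta + s + 1) + math.lgamma(s - a + 1)
           + math.lgamma(b - s) + math.lgamma(b + s + 1))
    return math.exp(num - den)

def d2(n, a, b, alpha, beta):
    """Squared norm d_n^2 of Eq. (4); note (b - a - n - 1)! = Gamma(b - a - n)."""
    num = (math.lgamma(alpha + n + 1) + math.lgamma(beta + n + 1)
           + math.lgamma(b - a + alpha + beta + n + 1)
           + math.lgamma(a + b + alpha + n + 1))
    den = (math.log(alpha + beta + 2 * n + 1) + math.lgamma(n + 1)
           + math.lgamma(b - a - n) + math.lgamma(alpha + beta + n + 1)
           + math.lgamma(a + b - beta - n))
    return math.exp(num - den)

def racah_u(n, s, a, b, alpha, beta):
    """Racah polynomial u_n^(alpha,beta)(s, a, b) of Eqs. (5)-(6)."""
    pref = (poch(a - b + 1, n) * poch(beta + 1, n)
            * poch(a + b + alpha + 1, n) / math.factorial(n))
    f43 = 0.0
    for k in range(n + 1):  # the 4F3 series terminates because (-n)_k = 0 for k > n
        f43 += (poch(-n, k) * poch(alpha + beta + n + 1, k)
                * poch(a - s, k) * poch(a + s + 1, k)
                / (poch(beta + 1, k) * poch(a + 1 - b, k)
                   * poch(a + b + alpha + 1, k) * math.factorial(k)))
    return pref * f43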
B. Estimation of the Probability density function
The orthogonality property of Racah polynomials helps
in expressing the image intensity function f ( s, t ) in terms
of its Racah moments. The image reconstruction can be
obtained by using the following inverse Racah moment
transform
f(s,t) = \sum_{n=0}^{L-1} \sum_{m=0}^{L-1} U_{nm}\, \hat{u}_n^{(\alpha,\beta)}(s,a,b)\, \hat{u}_m^{(\alpha,\beta)}(t,a,b),   s, t = a, a+1, ..., b-1        (7)

where (s, t) represents the uniform pixel grid of the image. When only moments of order up to M are used, the image intensity function f(s,t) is approximated by
\tilde{f}(s,t) = \sum_{n=0}^{M} \sum_{m=0}^{M} U_{nm}\, \hat{u}_n^{(\alpha,\beta)}(s,a,b)\, \hat{u}_m^{(\alpha,\beta)}(t,a,b),   s, t = a, a+1, ..., b-1        (8)

If only Racah moments of order ≤ θ are given, the image function reconstructed from U_{nm} can be approximated by the truncated series [16]:

\tilde{f}_\theta(s,t) = \sum_{n=0}^{\theta} \sum_{m=0}^{n} \hat{U}_{(n-m)m}\, \hat{u}_{n-m}^{(\alpha,\beta)}(s,a,b)\, \hat{u}_m^{(\alpha,\beta)}(t,a,b)        (9)

The estimated probability density function (p.d.f.) for a given order θ, denoted \hat{u}_\theta^{(\alpha,\beta)}(s,t), is obtained by normalizing \tilde{f}_\theta(s,t) [17]:

\hat{u}_\theta^{(\alpha,\beta)}(s,t) = \frac{\tilde{f}_\theta(s,t)}{\sum_{s,t\in\Omega} \tilde{f}_\theta(s,t)}        (10)
where 0 ≤ \hat{u}_\theta^{(\alpha,\beta)}(s,t) ≤ 1, \sum_{s,t\in\Omega} \hat{u}_\theta^{(\alpha,\beta)}(s,t) = 1, and Ω is the image plane.
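As an illustration only (not the authors' code), a possible NumPy realization of Eqs. (1), (2), (9) and (10) is sketched below. It reuses rho, d2 and racah_u from the previous sketch, assumes the quadratic lattice x(s) = s(s+1) commonly used with Racah polynomials, so that Δx(s - 1/2) = 2s + 1 in Eq. (2), and clips small negative reconstruction ripples before normalization, a practical detail not discussed in the paper.

import math
import numpy as np

def weighted_racah_matrix(order, a, b, alpha, beta):
    """Matrix of weighted polynomials: row n, column s - a holds u^_n(s) of Eq. (2)."""
    s_vals = np.arange(a, b)
    W = np.empty((order + 1, len(s_vals)))
    for n in range(order + 1):
        dn2 = d2(n, a, b, alpha, beta)
        for j, s in enumerate(s_vals):
            dx = 2 * s + 1  # Delta x(s - 1/2) on the assumed lattice x(s) = s (s + 1)
            W[n, j] = racah_u(n, s, a, b, alpha, beta) * math.sqrt(
                rho(s, a, b, alpha, beta) / dn2 * dx)
    return W

def racah_moments(image, order, a=0, alpha=0.0, beta=0.0):
    """Moments U_nm of Eq. (1) up to the given order, for an N x N gray-level image."""
    N = image.shape[0]
    W = weighted_racah_matrix(order, a, a + N, alpha, beta)
    return W @ image @ W.T  # U_nm = sum_s sum_t u^_n(s) u^_m(t) f(s, t)

def estimated_pdf(image, theta, a=0, alpha=0.0, beta=0.0):
    """Truncated reconstruction of Eq. (9), normalized as in Eq. (10)."""
    N = image.shape[0]
    W = weighted_racah_matrix(theta, a, a + N, alpha, beta)
    U = racah_moments(image, theta, a, alpha, beta)
    n_idx, m_idx = np.indices(U.shape)
    U = np.where(n_idx + m_idx <= theta, U, 0.0)  # keep total order n + m <= theta
    recon = W.T @ U @ W                            # f~_theta(s, t)
    recon = np.clip(recon, 0.0, None)              # guard against negative ripples
    return recon / recon.sum()

For example, estimated_pdf(img, 40) would give an order-40 estimate of the kind used in the experiments of Section 5, under the default parameter values assumed here.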
The estimated p.d.f. depends only on the expansion order; a criterion for choosing this order according to the maximum entropy principle (MEP) is explained in the next section.
3. Optimal Order Moments Selection using MEP
As addressed in [17], [18], the determination of the expansion order is a difficult problem and computationally expensive, because the order of the truncated expansion of f(s,t) that gives a good-quality estimate of the input image function is not known in advance.
For this purpose, we introduce the maximum entropy principle (MEP) for the search of this optimal order. This automatic technique estimates the optimal number of moments directly from the available data and does not require any a priori image information, especially for noisy images.
Let G_\omega be the set of estimated underlying probability density functions \hat{p}_\theta (with \hat{p}_\theta \equiv \hat{u}_\theta^{(\alpha,\beta)} of Eq. (10)) for various Racah moment orders:

G_\omega = \{\, \hat{p}_\theta \mid \theta = 1, \dots, \omega \,\}        (11)

By applying the maximum entropy principle for noisy images, we deduce that among these estimates of the probability density function there is one and only one, denoted \hat{p}_\theta^{*}(s,t), whose entropy is maximum [17], [28]; it represents the optimal probability density function and thus gives the optimal order of moments.
The Shannon entropy of \hat{p}_\theta(s,t) is defined as

S(\hat{p}_\theta) = -\sum_{s,t\in\Omega} \hat{p}_\theta(s,t)\, \log\big(\hat{p}_\theta(s,t)\big)        (12)

and the optimal \hat{p}_\theta^{*} is such that

S(\hat{p}_\theta^{*}) = \max\{\, S(\hat{p}_\theta) \mid \hat{p}_\theta \in G_\omega \,\}        (13)

The process of determining the optimal order θ consists in estimating the p.d.f. for different orders and selecting the one for which the entropy reaches its maximum. The following basic algorithm performs an exhaustive search for the optimal order that maximizes S(\hat{p}_\theta):
1. Initialize θ.
2. Compute the p.d.f. \hat{p}_\theta and its corresponding Shannon entropy S(\hat{p}_\theta).
3. If S(\hat{p}_\theta) is maximum, then θ is optimal and \hat{p}_\theta^{*} = \hat{p}_\theta; else set θ = θ + 1 and go to 2.
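A minimal Python sketch of this exhaustive search is given below (illustrative only, not the authors' implementation). It assumes the estimated_pdf helper sketched in Section 2 and a user-chosen maximum order, and adds a small epsilon inside the logarithm to guard against zero probabilities.

import numpy as np

def shannon_entropy(p, eps=1e-12):
    """S(p) = - sum p log p, as in Eq. (12); eps avoids log(0)."""
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log(p + eps)))

def select_optimal_pdf(image, max_order):
    """Exhaustive MEP search: return (theta*, p.d.f.) with maximal entropy."""
    best_theta, best_pdf, best_S = None, None, -np.inf
    for theta in range(1, max_order + 1):
        pdf = estimated_pdf(image, theta)   # step 2: p.d.f. at order theta
        S = shannon_entropy(pdf)            # and its Shannon entropy
        if S > best_S:                      # step 3: keep the maximum-entropy estimate
            best_theta, best_pdf, best_S = theta, pdf, S
    return best_theta, best_pdf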
Then, having \hat{p}_\theta^{*}, we assign to each point of the data space the optimal p.d.f. \hat{p}_\theta^{*}(s,t) defined by (10). In this case, the "good data" are the set of points belonging to the mode of \hat{p}_\theta^{*}. By extracting the local maxima of \hat{p}_\theta^{*}, we can determine the exact points of the skeleton. In the next section, the details of our skeleton extraction algorithm are presented.
4. Skeleton Extraction
We define the skeleton as the set of local maxima of the estimated probability density function selected in the previous section. The extraction of these local maxima allows us to determine the skeleton associated with the shape. The general idea of the algorithm is the successive extraction of the points that are local maxima of the selected optimal p.d.f.
The procedure consists in sweeping a mask of size 3×3 over the image. Comparing the estimated p.d.f. at the central pixel of the mask with its eight closest neighbours along the eight directions (Fig. 1) makes it possible to decide whether this central pixel is a skeleton point or not.
Fig. 1. The pixel (i, j) and its eight closest neighbours
Indeed, two types of comparison are undertaken: a comparison along the lines and columns, and a comparison along the diagonals. A pixel is a candidate skeleton point if it is a local maximum compared to its four neighbours in the line and column directions, or if it is a local maximum compared to its four neighbours in the diagonal directions.
The following algorithm shows the skeletonization steps of the RMSM method.
Algorithm:
Begin
  For i = 1 to N
    For j = 1 to M
      If ( \hat{p}_\theta^{*}(i, j-1) < \hat{p}_\theta^{*}(i, j) and \hat{p}_\theta^{*}(i, j+1) < \hat{p}_\theta^{*}(i, j)
           and \hat{p}_\theta^{*}(i-1, j) < \hat{p}_\theta^{*}(i, j) and \hat{p}_\theta^{*}(i+1, j) < \hat{p}_\theta^{*}(i, j) )
        or ( \hat{p}_\theta^{*}(i-1, j-1) < \hat{p}_\theta^{*}(i, j) and \hat{p}_\theta^{*}(i+1, j+1) < \hat{p}_\theta^{*}(i, j)
           and \hat{p}_\theta^{*}(i+1, j-1) < \hat{p}_\theta^{*}(i, j) and \hat{p}_\theta^{*}(i-1, j+1) < \hat{p}_\theta^{*}(i, j) )
      then mark (i, j) as a skeleton point
      End if
    End for
  End for
End
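The sweep above can be written compactly with NumPy array slicing. The sketch below (illustrative only) returns a boolean mask of the skeleton points and skips the one-pixel image border, which the 3×3 mask cannot cover.

import numpy as np

def extract_skeleton(pdf):
    """Boolean mask of the local maxima of the optimal p.d.f."""
    p = np.asarray(pdf, dtype=float)
    skel = np.zeros_like(p, dtype=bool)
    c = p[1:-1, 1:-1]
    axis_max = ((p[1:-1, :-2] < c) & (p[1:-1, 2:] < c) &   # left / right neighbours
                (p[:-2, 1:-1] < c) & (p[2:, 1:-1] < c))     # up / down neighbours
    diag_max = ((p[:-2, :-2] < c) & (p[2:, 2:] < c) &       # main-diagonal neighbours
                (p[2:, :-2] < c) & (p[:-2, 2:] < c))        # anti-diagonal neighbours
    skel[1:-1, 1:-1] = axis_max | diag_max
    return skel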
5. Experimental Results
In this section, simulation results are presented for test image sets. The RMSM algorithm is tested on gray-level images; a subset of a well-known craft database is used to demonstrate the efficiency of our algorithm on craft motives. To assess the behaviour of the proposed skeletonization approach when applied to noisy gray-level craft motives, Figure 2 presents a scanned noisy craft motive image.
Fig. 2. Scanned craft motive (noisy image)
Fig. 3. Reconstructed image by Racah moments at order 40
The p.d.f. obtained by MEP, corresponding to the optimal order 40, is presented in Figure 4.
Fig. 4. Estimated probability density function at order 40
Figure 5 shows the skeletons obtained using the RMSM method, where five samples were used for each craft motive.
Fig. 5. Extracted skeleton of the craft motive image
In the skeletonization of gray-level images, an effect called "flooding water" can appear [13], [14]. This phenomenon can greatly affect the produced skeletons, which can lead to erroneous decisions when these skeletons are used as input features in a recognition system. This effect appears when separate lines touch or are very close to each other. In order to compare the RMSM method with some results in the literature, we consider some well-known algorithms selected for their applicability to OCR: Hilditch's algorithm [3], Zhang and Suen's algorithm [29], Huang et al.'s algorithm [4], and Kegl and Krzyzak's algorithm [10]. The binarization used as a preprocessing phase in the studied algorithms tends to consider closed regions as a unique shape. This produces a skeleton that does not represent the real input shape, which might cause problems during the recognition process.
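As an indication of how such a baseline can be reproduced, the short sketch below applies Otsu binarization followed by the thinning routine of scikit-image, whose 2-D default follows Zhang and Suen's method [29]. It is given only to illustrate the binarization step criticized above, and assumes scikit-image is installed and that the motive appears brighter than the background.

import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def baseline_skeleton(gray_image):
    """Binarize with Otsu's threshold, then thin the resulting binary shape."""
    binary = np.asarray(gray_image) > threshold_otsu(gray_image)  # bright foreground assumed
    return skeletonize(binary)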
6. Conclusion
In this paper, we have proposed a novel approach to robust skeletonization, based on a statistical method using Racah moment theory controlled by the Maximum Entropy Principle (MEP). This new skeletonization concept is articulated in three steps. In the first, the underlying probability density function (p.d.f.) is estimated using Racah moments. In the second, the optimal p.d.f. is selected using the MEP criterion. Finally, the subset of local-maxima pixels of the optimal p.d.f. is selected as belonging to the skeleton. The advantage of our algorithm is that no a priori information about the original image and no intermediate steps are needed; hence, our algorithm is suited to gray-level as well as binary images. Through a comparative study with other well-established algorithms, it performed well in experimental tests and demonstrated great robustness against high noise levels and the "flooding water" effect.
Acknowledgement
This research was supported by the CNRST (Centre National de la Recherche Scientifique et Technique) of Morocco under the PROTARS III program, number D41/10. The authors also thank the participating agencies for their cooperation in the survey.
References
[1] L. Lam, S.W. Lee, and C.Y. Suen, "Thinning Methodologies - A Comprehensive Survey," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 14, no. 9, pp. 869-885, 1992.
[2] H. Blum, "A Transformation for Extracting New Descriptors of Shape," in Models for the Perception of Speech and Visual Form (W. Wathen-Dunn, Ed.), MIT Press, pp. 363-380, 1967.
[3] C.J. Hilditch, "Comparison of thinning algorithms on a parallel processor," Image and Vision Computing, vol. 1, pp. 115-132, 1983.
[4] L. Huang, G. Wan, and C. Liu, "An Improved Parallel Thinning Algorithm," ICDAR, pp. 780-783, 2003.
[5] C. Arcelli and G. Sanniti di Baja, "Finding local maxima in a pseudo-Euclidean distance transform," Computer Graphics, Vision and Image Processing, vol. 43, pp. 361-367, 1988.
[6] S. Svensson, I. Nyström, and G. Borgefors, "On reversible skeletonization using anchor points from distance transforms," Int. Journal of Visual Communication and Image Representation, vol. 10, pp. 379-397, 1999.
[7] R.L. Ogniewicz and O. Kubler, "Hierarchic Voronoi Skeletons," Pattern Recognition, vol. 28, no. 3, pp. 343-359, 1995.
[8] Y. Zhou and A. Toga, "Efficient Skeletonization of Volumetric Objects," IEEE Transactions on Visualization and Computer Graphics, vol. 5, pp. 196-209, 1999.
[9] T. Grogorishin, G. Abdel-Hamid, and Y.H. Yang, "Skeletonization: An Electrostatic Field-Based Approach," Pattern Analysis and Applications, vol. 1, no. 3, pp. 163-177, 1996.
[10] B. Kegl and A. Krzyzak, "Piecewise linear skeletonization using principal curves," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, pp. 59-74, 2002.
[11] B. Verwer, L. Van Vliet, and P. Verbeek, "Binary and Grey-Value Skeletons," International Journal of Pattern Recognition and Artificial Intelligence, vol. 7, pp. 1287-1308, 1993.
[12] G. Levi and U. Montanari, "A gray-weighted skeleton," Information and Control, vol. 17, pp. 62-91, 1970.
[13] A. Dawoud and M. Kamel, "New Approach for the Skeletonization of Handwritten Characters in Gray-Level Images," ICDAR, vol. 2, p. 1233, 2003.
[14] A. Dawoud and M. Kamel, "Natural Skeletonization: New Approach for the Skeletonization of Handwritten Characters," Int. J. Image and Graphics, vol. 5, no. 2, pp. 267-280, 2005.
[15] C.H. Teh and R.T. Chin, "On image analysis by the methods of moments," IEEE Trans. Pattern Anal. Machine Intell., vol. 10, pp. 496-512, 1988.
[16] H.Q. Zhu, H.Z. Shu, J. Liang, L.M. Luo, and J.L. Coatrieux, "Image analysis by discrete orthogonal Racah moments," Signal Processing, vol. 87, pp. 687-708, 2007.
[17] H. Qjidaa and L. Radouane, "Robust line fitting in a noisy image by the method of moments," IEEE Trans. Pattern Anal. Machine Intell., vol. 21, pp. 1216-1223, 1999.
[18] H. El Fadili, K. Zenkouar, and H. Qjidaa, "Lapped Block Image Analysis via the Method of Legendre Moments," EURASIP Journal on Applied Signal Processing, vol. 2003, no. 9, pp. 902-913, 2003.
[19] E.T. Jaynes, "On the rationale of maximum entropy methods," Proceedings of the IEEE, vol. 70, no. 9, Sept. 1982.
[20] J.M. Van Campenhout and T. Cover, "Maximum entropy and conditional probability," IEEE Trans. on Information Theory, vol. 27, no. 4, Jul. 1988.
[21] C. Chong, P. Raveendran, and R. Mukundan, "A comparative analysis of algorithms for fast computation of Zernike moments," Pattern Recognition, vol. 36, pp. 731-742, 2003.
[22] V.S. Bharathi, V.S. Raghavan, and L. Ganesan, "Texture classification using Zernike moments," in Proc. 2nd FAE Internat. Symposium, European University of Lefke, Turkey, pp. 292-294, 2002.
[23] A. Goshtasby, "Template matching in rotated images," IEEE Trans. Pattern Anal. Machine Intell., vol. PAMI-7, pp. 338-344, May 1985.
[24] F. Jurie and C. Schmid, "Scale-invariant shape features for recognition of object categories," in Computer Vision and Pattern Recognition, Washington, DC, June-July 2004.
[25] Markandey and R.J.P. Figueiredo, "Robot sensing techniques based on high dimensional moment invariants and tensors," IEEE Trans. Robot. Automat., vol. 8, pp. 186-195, Feb. 1992.
[26] H.S. Hsu, "Moment preserving edge detection and its application to image data compression," Opt. Eng., vol. 32, no. 7, pp. 1596-1608, 1993.
[27] M.R. Teague, "Image analysis via the general theory of moments," J. Opt. Soc. Amer., vol. 70, no. 8, pp. 920-930, 1980.
[28] R. Mukundan and K.R. Ramakrishnan, "Fast computation of Legendre and Zernike moments," Pattern Recognition, vol. 28, no. 9, pp. 1433-1442, Sept. 1995.
[29] T.Y. Zhang and C.Y. Suen, "A fast parallel algorithm for thinning patterns," Comm. ACM, vol. 27, pp. 236-239, 1984.