Neighborhood Random Classification

License

Creative Commons None (All rights reserved)

DOI: 10.23723/2552/9604
Publisher: SEE
Publication year: 2014
Format: application/pdf

Neighborhood Random Classification
Diala Ezzeddine
Université de Lyon (Lumière Lyon 2) – Laboratoire ERIC – diala.ezzeddine@univ-lyon2.fr
29 August 2013

Introduction

We propose to use neighborhood graphs in ensemble methods (EM). Our main claim is that neighborhood graphs are a strong alternative to the usual base classifiers. In this work, we:
- built an EM classifier based on neighborhoods, the Random Neighborhood Classifier (RNC);
- compared RNC to kNN, then to other methods.

Outline
1 Introduction
2 Basic Concepts: Classifier; Neighborhood Classifiers; Partition by Neighborhood Graphs
3 Ensemble Methods
4 Neighborhood Random Classifier
5 Results
6 Conclusion

Basic Concepts

Classifier

The goal of any machine learning algorithm is to build a classifier φ capable of predicting the membership class Y of any individual ω. Since φ is meant to be a surrogate for Y,

φ(X(ω)) ≈ Y(ω)

must be satisfied for as many ω ∈ Ω as possible.

Neighborhood Classifiers

A neighborhood classifier labels an individual from its neighbors, starting by finding the neighborhood of this individual in the learning set E_l. It depends on three components:
- the neighborhood set P: the set of all possible neighborhoods;
- the neighborhood function V: links any point to its neighbors;
- the decision rule C: yields a probability distribution over the classes.

We define a neighborhood classifier as the combination of the triplet (P, V, C):

φ(X) = C_{V(X)}(X)
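To make the triplet concrete, here is a minimal Python sketch (illustrative, not from the talk; the names `knn_neighborhood`, `majority_rule` and `neighborhood_classifier` are ours): a classifier parameterized by a neighborhood function V and a decision rule C, instantiated with kNN neighborhoods and majority voting.

```python
import numpy as np

def knn_neighborhood(x, X_train, k=3):
    """Neighborhood function V: indices of the k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)
    return np.argsort(dists)[:k]

def majority_rule(neighbor_idx, y_train):
    """Decision rule C: most frequent class among the neighbors."""
    classes, counts = np.unique(y_train[neighbor_idx], return_counts=True)
    return classes[np.argmax(counts)]

def neighborhood_classifier(x, X_train, y_train,
                            V=knn_neighborhood, C=majority_rule):
    """phi(x) = C_{V(x)}(x): find the neighbors of x, then apply the rule."""
    return C(V(x, X_train), y_train)
```

Swapping V for a graph-based neighborhood (Gabriel Graph, MST, ...) or C for another decision rule changes the classifier without touching the rest of the pipeline, which is the point of the (P, V, C) decomposition.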
Examples of Neighborhood Structures

Many types of neighborhoods can be used to build a classifier:
- k-nearest neighbors (kNN);
- ε-neighbors;
- Parzen window neighbors;
- neighborhood regions, as in decision trees;
- neighbors defined by a specific geometric property, such as the Gabriel Graph (GG), Relative Neighbors (RN) and the Minimum Spanning Tree (MST).

Partition by Neighborhood Graphs

A toy example: two-class data with 17 individuals.

El    X1    X2    Y      El    X1    X2    Y
X1    2.13  2.33  2      X10   0     2.33  2
X2    2.13  4.11  2      X11   5.64  5.17  2
X3    2.22  1.76  2      X12   7.87  2.33  1
X4    3.37  6.88  1      X13   5.64  7.5   1
X5    6.77  0.67  1      X14   4.53  8.1   1
X6    4.53  1.16  1      X15   3.37  4.31  1
X7    3.37  0     1      X16   5.64  4.11  2
X8    1.8   6.47  2      X17   7.87  4.11  2
X9    0     5.77  2

[Figure: scatter plot of the 17 points X1–X17 in the (X1, X2) plane]

Neighborhood graph

Examine the figure and apply, for example, the Gabriel Graph:

y ∈ V_GG(x) ⟺ ∀z ∈ E_l, d²(x, y) ≤ d²(x, z) + d²(z, y)

Then:
- build the Gabriel Graph on E_l;
- erase the interclass edges.

[Figure: Gabriel graph on the toy data, interclass edges removed]

It now remains to find the class of a new point.
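The Gabriel Graph condition above translates almost directly into code. A minimal sketch (ours, not the talk's implementation), computing the adjacency matrix by testing every candidate pair against all other points:

```python
import numpy as np

def gabriel_adjacency(X):
    """Gabriel Graph: y is a neighbor of x iff, for every other point z,
    d(x, y)^2 <= d(x, z)^2 + d(z, y)^2."""
    n = len(X)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # squared distances
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            is_edge = all(d2[i, j] <= d2[i, z] + d2[z, j]
                          for z in range(n) if z != i and z != j)
            adj[i, j] = adj[j, i] = is_edge
    return adj
```

Erasing the interclass edges then amounts to `adj & (y[:, None] == y[None, :])` for a label vector `y`, which leaves the intraclass zones used for classification.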
Assignment

When a new point must be classified:
- find the neighbors of this new point in P;
- with, for example, the decision rule C being the nearest node in the Gabriel Graph, the point is assigned to the zone containing its nearest neighbor.

[Figure: the new point Xnew and the Gabriel-graph zone it is assigned to]

Graph Examples

[Figures on the toy data: graph of relative neighbours (with its "lunula" exclusion region), Gabriel graph, Minimum Spanning Tree, and k-nearest neighbours (k = 3)]
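For contrast with the Gabriel Graph, here is the same construction for the relative neighbours graph shown above, using its standard definition (not spelled out on the slide, so treat it as our assumption): x and y are relative neighbors iff no third point z lies in the lunula, i.e. iff max(d(x, z), d(y, z)) ≥ d(x, y) for every z. A sketch, together with the nearest-node decision rule used in the assignment example:

```python
import numpy as np

def rng_adjacency(X):
    """Relative neighborhood graph: x and y are neighbors iff no other
    point z satisfies max(d(x, z), d(y, z)) < d(x, y)."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            is_edge = all(max(d[i, z], d[j, z]) >= d[i, j]
                          for z in range(n) if z != i and z != j)
            adj[i, j] = adj[j, i] = is_edge
    return adj

def assign_nearest(x_new, X, y):
    """Decision rule from the toy example: the new point takes the class
    of its nearest node in the (intraclass) graph."""
    return y[np.argmin(np.linalg.norm(X - x_new, axis=1))]
```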
Ensemble Methods

Ensemble Methods (EMs)

Definition. These methods generate a set of classifiers using one or several machine learning algorithms, then aggregate them into a single classifier.

Strength. By aggregating different and independent classifiers, EMs reduce the bias and the variance of the meta-classifier. Many works [BWHY05, Bre01, Sch03, ZWT02] showed that a set of classifiers gives a better prediction than the best classifier among them.

Popularity. The most popular methods, such as decision trees, SVM and k-nearest neighbors (kNN), are used in EMs, with applications in many fields (physics, face recognition, ecology, ...).

EM Procedure

Based on one learning method:
- using a set E_l, build m sets E_l^1, ..., E_l^m;
- build m classifiers φ_1, ..., φ_m;
- apply these classifiers to a new point to obtain predictions c_1, c_2, ..., c_m;
- aggregate the results into a single prediction c.

Neighborhood Random Classifier

Neighborhood Random Classifier (RNC)

RNC:
1 generates new learning sets E_l^1, ..., E_l^m by random sampling;
2 generates new classifiers φ_1, ..., φ_m based on neighborhood graphs, one per learning set;
3 uses these classifiers to obtain estimates c_1, ..., c_m.

The EM then aggregates the estimates c_1, ..., c_m into one classification.

RNC depends on two choices:
1 the sampling procedure used to generate the m classifiers;
2 the aggregation procedure used to combine the m predictions.

We tested several kinds of sampling (rows with or without replacement, columns, ...) and several aggregation functions (majority vote, naive Bayes, decision template, ...); a sketch of one configuration follows.
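A minimal sketch of RNC under one of the tested configurations (row sampling with replacement plus majority-vote aggregation; our illustrative choice, reusing `neighborhood_classifier` from the sketch above):

```python
import numpy as np

def rnc_predict(x, X, y, base_classifier, m=50, seed=0):
    """Random Neighborhood Classifier, sketched with bootstrap row
    sampling and majority-vote aggregation."""
    rng = np.random.default_rng(seed)
    n = len(X)
    votes = []
    for _ in range(m):
        idx = rng.integers(0, n, size=n)                  # learning set E_l^i
        votes.append(base_classifier(x, X[idx], y[idx]))  # classifier phi_i -> c_i
    classes, counts = np.unique(votes, return_counts=True)
    return classes[np.argmax(counts)]                     # aggregate c_1..c_m into c
```

For example, `rnc_predict(x_new, X_train, y_train, neighborhood_classifier)` votes over m kNN-based classifiers; plugging in a Gabriel-graph-based classifier instead gives one of the RNC variants evaluated below.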
Results

Verification

We applied RNC with RNG, GG and MST graphs, and compared these methods to kNN with k = 1, 3, 10, to KSVM, and to Random Forest (RF). We computed the error rate of all methods on 16 quantitative data sets from UCI:
- we used the mean and the decision template (DT) as aggregation methods for RNC;
- we applied 10-fold cross-validation to estimate the error rate;
- we ranked all the methods on each data set.

Error rate (Err) and rank of each method, first eight data sets (Glass, Sonar, Arcene, Ecoli, Ionosphere, Iris, Letter (R vs B), Magic Gamma Telescope):

            Glass        Sonar        Arcene       Ecoli        Ionosphere   Iris         Letter(RvsB) MagicGamma
            Err   Rank   Err   Rank   Err   Rank   Err   Rank   Err   Rank   Err   Rank   Err   Rank   Err   Rank
RNG.DT      0.100 6      0.185 3      0.139 7      0.243 6      0.057 5.5    0.047 5      0.009 3.5    0.150 3
RNG.mean    0.105 7      0.200 5      0.155 9.5    0.243 6      0.049 1      0.060 13     0.009 3.5    0.167 9
GG.DT       0.205 13     0.225 10     0.133 6.5    0.243 6      0.074 12     0.053 9      0.039 13     0.157 5.5
GG.mean     0.215 14     0.260 12     0.176 14     0.290 11     0.069 9.5    0.060 13     0.040 14     0.181 11
MST.DT      0.085 2      0.195 4      0.127 2.5    0.243 6      0.051 2.5    0.053 9      0.009 3.5    0.153 4
MST.mean    0.095 5      0.210 7      0.155 9.5    0.238 3      0.057 5.5    0.060 13     0.009 3.5    0.175 10
KNN1.DT     0.080 1      0.180 2      0.152 8      0.243 6      0.054 4      0.053 9      0.009 3.5    0.162 7
KNN1.mean   0.090 3.5    0.215 8      0.167 12     0.233 2      0.071 11     0.053 9      0.009 3.5    0.184 12
KNN3.DT     0.090 3.5    0.205 6      0.127 2.5    0.252 10     0.063 7      0.040 2      0.012 7.5    0.157 5.5
KNN3.mean   0.110 8      0.230 11     0.161 11     0.248 9      0.083 13     0.053 9      0.012 7.5    0.186 13
KNN10.DT    0.140 9      0.285 14     0.133 5.5    0.314 12     0.066 8      0.047 5      0.018 9.5    0.164 8
KNN10.mean  0.155 10     0.275 13     0.173 13     0.319 13.5   0.126 14     0.040 2      0.020 11     0.205 14
Rand. For.  0.165 11     0.165 1      0.127 2.5    0.219 1      0.069 9.5    0.047 5      0.022 12     0.140 1
KSVM        0.170 12     0.220 9      0.127 2.5    0.319 13.5   0.051 2.5    0.040 2      0.018 9.5    0.144 2

Remaining eight data sets (Parkinsons, Diabete (Pima), Planning relax, Ringnorm, Spambase, Threenorm, Twonorm, Wisconsin breast cancer):

            Parkins.     Diabete      Planning     Ringnorm     Spambase     Threenorm    Twonorm      Wisc.b.c.
            Err   Rank   Err   Rank   Err   Rank   Err   Rank   Err   Rank   Err   Rank   Err   Rank   Err   Rank
RNG.DT      0.047 4.5    0.249 6      0.444 11     0.019 3      0.062 2      0.134 3      0.029 6.5    0.026 3.5
RNG.mean    0.063 8      0.250 7      0.300 8      0.036 6      0.072 5      0.136 4      0.029 6.5    0.028 6
GG.DT       0.121 12.5   0.241 2.5    0.400 9      0.015 1      0.066 3      0.140 7      0.025 1.5    0.029 9
GG.mean     0.174 14     0.241 2.5    0.283 2.5    0.031 4      0.076 6      0.144 10.5   0.026 4      0.029 9
MST.DT      0.053 6      0.254 10.5   0.456 12.5   0.032 5      0.138 10     0.148 12     0.032 10     0.026 3.5
MST.mean    0.047 4.5    0.258 12     0.283 2.5    0.067 11     0.230 14     0.144 10.5   0.033 11     0.026 3.5
KNN1.DT     0.032 2      0.261 13     0.472 14     0.046 8      0.137 9      0.156 14     0.034 12.5   0.029 9
KNN1.mean   0.037 3      0.264 14     0.294 6      0.166 12     0.228 13     0.153 13     0.035 14     0.029 9
KNN3.DT     0.021 1      0.254 10.5   0.456 12.5   0.045 7      0.113 8      0.139 6      0.030 8.5    0.026 3.5
KNN3.mean   0.058 7      0.253 9      0.294 6      0.221 13     0.213 12     0.142 8      0.030 8.5    0.032 12.5
KNN10.DT    0.074 9      0.251 8      0.433 10     0.047 9      0.086 7      0.133 2      0.026 4      0.029 9
KNN10.mean  0.121 12.5   0.246 4      0.283 2.5    0.335 14     0.191 11     0.143 9      0.025 1.5    0.032 12.5
Rand. For.  0.084 10     0.232 1      0.294 6      0.050 10     0.046 1      0.138 5      0.034 12.5   0.024 1
KSVM        0.111 11     0.247 5      0.283 2.5    0.017 2      0.067 4      0.130 1      0.026 4      0.037 14
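The per-data-set ranks above are averaged into the mean ranks reported next. A minimal sketch of that ranking step on the Glass and Sonar columns of the table (error values copied from above); `scipy.stats.rankdata` averages tied ranks, which is where the .5 ranks come from:

```python
import numpy as np
from scipy.stats import rankdata

methods = ["RNG.DT", "RNG.mean", "MST.DT", "KNN1.DT", "Rand. For.", "KSVM"]
err = np.array([[0.100, 0.185],   # RNG.DT:     Glass, Sonar
                [0.105, 0.200],   # RNG.mean
                [0.085, 0.195],   # MST.DT
                [0.080, 0.180],   # KNN1.DT
                [0.165, 0.165],   # Rand. For.
                [0.170, 0.220]])  # KSVM

ranks = np.apply_along_axis(rankdata, 0, err)  # rank the methods per data set
mean_rank = ranks.mean(axis=1)                 # average over data sets
for name, r in sorted(zip(methods, mean_rank), key=lambda t: t[1]):
    print(f"{name:10s} mean rank {r:.2f}")
```

On this 6-method subset the ranks naturally differ from the 14-method tables below; the code only illustrates the procedure.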
Mean rank of all methods:

        RNG           GG            MST           KNN1          KNN3          KNN10         RF     KSVM
        DT     mean   DT     mean   DT     mean   DT     mean   DT     mean   DT     mean
Mean    4.84   6.53   7.53   9.44   6.44   7.84   7.62   9.06   6.31   9.84   8.06   9.84   5.59   6.03
Rank    1      6      7      12     5      9      8      11     4      13.5   10     13.5   2      3

Mean rank of graph methods:

        RNG    GG     MST    KNN1   KNN3   KNN10
Mean    2.47   3.81   2.94   3.59   4.03   4.22
Rank    1      4      2      3      5      6

Conclusion

1 We provided a new approach for using neighborhood structures in ensemble methods.
2 The results show that our method (RNC) is a strong alternative to the most powerful techniques, such as RF and KSVM.
3 Methods based on geometrical neighborhood graphs outperform classic methods such as kNN.

Further work. Several improvements of RNC are possible, such as:
- the choice of the dissimilarity measure between individuals in the graphs;
- how to determine the membership class of an unclassified individual;
- ...

References

[Bre01] L. Breiman. Random forests. Machine Learning, 45(1):5–32, 2001.
[BWHY05] G. Brown, J. Wyatt, R. Harris, and X. Yao. Diversity creation methods: a survey and categorisation. Information Fusion, 6(1):5–20, 2005.
[Sch03] R.E. Schapire. The boosting approach to machine learning: an overview. Lecture Notes in Statistics, Springer, pages 149–172, 2003.
[ZWT02] Z.H. Zhou, J. Wu, and W. Tang. Ensembling neural networks: many could be better than all. Artificial Intelligence, 137(1-2):239–263, 2002.