{ "data_id": "99", "name": "texture", "exact_name": "texture", "version": 1, "version_label": "1", "description": "**Author**: Laboratory of Image Processing and Pattern Recognition (INPG-LTIRF), Grenoble - France. \r\n**Source**: [original](https:\/\/www.elen.ucl.ac.be\/neural-nets\/Research\/Projects\/ELENA\/databases\/REAL\/texture\/) - ELENA project \r\n**Please cite**: \r\n\r\n####1. Summary\r\n\r\nThis database was generated by the Laboratory of Image Processing and Pattern Recognition (INPG-LTIRF) in the development of the Esprit project ELENA No. 6891 and the Esprit working group ATHOS No. 6620.\r\n```\r\n (a) Original source:\r\n\r\n P. Brodatz, \"Textures: A Photographic Album for Artists and Designers\",\r\n Dover Publications, Inc., New York, 1966.\r\n\r\n (b) Creation: Laboratory of Image Processing and Pattern Recognition\r\n\r\n Institut National Polytechnique de Grenoble INPG\r\n Laboratoire de Traitement d'Image et de Reconnaissance de Formes LTIRF\r\n Av. Felix Viallet, 46\r\n F-38031 Grenoble Cedex\r\n France\r\n\r\n (c) Contact: Dr. A. Guerin-Dugue, INPG-LTIRF, guerin@tirf.inpg.fr\r\n```\r\n\r\n####2. Past Usage:\r\n\r\nThis database is for private use at the TIRF laboratory. It was created in order to study texture discrimination with high-order statistics.\r\n\r\n```\r\nA. Guerin-Dugue, C. Aviles-Cruz, \"High Order Statistics from Natural Textured Images\",\r\nIn ATHOS workshop on System Identification and High Order Statistics, Sophia-Antipolis, France, September 1993.\r\n\r\nGuerin-Dugue, A. and others, Deliverable R3-B4-P - Task B4: Benchmarks, Technical report,\r\nElena-NervesII \"Enhanced Learning for Evolutive Neural Architecture\", ESPRIT Basic Research Project Number 6891,\r\nJune 1995.\r\n```\r\n\r\n####3. 
Relevant Information:\r\n\r\nThe aim is to distinguish between 11 different textures (Grass lawn, Pressed calf leather, Handmade paper, Raffia looped to a high pile, Cotton canvas, ...), each pattern (pixel) being characterised by 40 attributes built by the estimation of fourth-order modified moments in four orientations: 0, 45, 90 and 135 degrees.\r\n\r\nA statistical method for the characterisation of natural micro-textures, called \"fourth-order modified moments\" (mm4), was developed based on the extraction of fourth-order moments [Guerin93]; for each texture, this method measures the deviation from a first-order Gauss-Markov process. The features were estimated in four directions to take into account the possible orientations of the textures (0, 45, 90 and 135 degrees). Only correlations between the current pixel, its first neighbourhood and its second neighbourhood are taken into account. This small neighbourhood is adapted to the fine-grain property of the textures.\r\n\r\nThe data set contains 11 classes of 500 instances each; each class refers to a type of texture in the Brodatz album.\r\n\r\nThe database dimension is 40, plus one for the class label. The 40 attributes were built by estimating each of the following ten fourth-order modified moments in the four orientations (0, 45, 90 and 135 degrees): mm4(000), mm4(001), mm4(002), mm4(011), mm4(012), mm4(022), mm4(111), mm4(112), mm4(122) and mm4(222).\r\n\r\nNote: patterns are always sorted by class and are presented in increasing order of their class label in each dataset of the texture database (texture.dat, texture_CR.dat, texture_PCA.dat, texture_DFA.dat).\r\n\r\n####4. 
Class:\r\n\r\nThe class label is a code for the following textures:\r\n```\r\n Class label Texture (Brodatz plate)\r\n 2 Grass lawn (D09) \r\n 3 Pressed calf leather (D24) \r\n 4 Handmade paper (D57) \r\n 6 Raffia looped to a high pile (D84) \r\n 7 Cotton canvas (D77) \r\n 8 Pigskin (D92) \r\n 9 Beach sand (D28) \r\n 10 Beach sand (D29) \r\n 12 Oriental straw cloth (D53) \r\n 13 Oriental straw cloth (D78) \r\n 14 Oriental grass fiber cloth (D79) \r\n```\r\n\r\n####5. Summary Statistics:\r\n\r\nThe table below provides, for each attribute of the database, the range (Min and Max values), the mean value and the standard deviation.\r\n\r\n```\r\nAttribute Min Max Mean Standard \r\n deviation \r\n\r\n 1 -1.4495 0.7741 -1.0983 0.2034\r\n 2 -1.2004 0.3297 -0.5867 0.2055\r\n 3 -1.3099 0.3441 -0.5838 0.3135\r\n 4 -1.1104 0.5878 -0.4046 0.2302\r\n 5 -1.0534 0.4387 -0.3307 0.2360\r\n 6 -1.0029 0.4515 -0.2422 0.2225\r\n 7 -1.2076 0.5246 -0.6026 0.2003\r\n 8 -1.0799 0.3980 -0.4322 0.2210\r\n 9 -1.0570 0.4369 -0.3317 0.2361\r\n 10 -1.2580 0.3546 -0.5978 0.3268\r\n 11 -1.4495 0.7741 -1.0983 0.2034\r\n 12 -1.0831 0.3715 -0.5929 0.2056\r\n 13 -1.1194 0.6347 -0.4019 0.3368\r\n 14 -1.0182 0.1573 -0.6270 0.1390\r\n 15 -0.9435 0.1642 -0.4482 0.1952\r\n 16 -0.9944 0.0357 -0.5763 0.1587\r\n 17 -1.1722 0.0201 -0.7331 0.1955\r\n 18 -1.0174 0.1155 -0.4919 0.2335\r\n 19 -1.0044 0.0833 -0.4727 0.2257\r\n 20 -1.1800 0.4392 -0.4831 0.3484\r\n 21 -1.4495 0.7741 -1.0983 0.2034\r\n 22 -1.2275 0.5963 -0.7363 0.2220\r\n 23 -1.3412 0.4464 -0.7771 0.3290\r\n 24 -1.1774 0.6882 -0.5770 0.2646\r\n 25 -1.1369 0.4098 -0.5085 0.2538\r\n 26 -1.1099 0.3725 -0.4038 0.2515\r\n 27 -1.2393 0.6120 -0.7279 0.2278\r\n 28 -1.1540 0.4221 -0.5863 0.2446\r\n 29 -1.1323 0.3916 -0.5090 0.2526\r\n 30 -1.4224 0.4718 -0.7708 0.3264\r\n 31 -1.4495 0.7741 -1.0983 0.2034\r\n 32 -1.1789 0.5647 -0.6463 0.1890\r\n 33 -1.1473 0.6755 -0.4919 0.3304\r\n 34 -1.1228 0.3132 -0.6435 0.1441\r\n 35 -1.0145 0.3396 -0.4918 0.1922\r\n 36 -1.0298 0.1560 -0.5934 0.1704\r\n 37 -1.2534 0.0899 -0.7795 0.1641\r\n 38 -1.0966 0.1944 -0.5541 0.2111\r\n 39 -1.0765 0.2019 -0.5230 0.2015\r\n 40 -1.2155 0.4647 -0.5677 0.3091\r\n```\r\n\r\nThe attributes range over [-1.45, 0.775]. The database resulting from centering and scaling each attribute of the texture database is on the ftp server in the `REAL\/texture\/texture_CR.dat.Z` file.\r\n\r\n####6. Confusion matrix:\r\n\r\nThe following confusion matrix of the k-NN classifier was obtained with a leave-one-out error counting method on the texture_CR.dat database. k was set to 1 in order to reach the minimum mean error rate: 1.0 +\/- 0.8%.\r\n\r\n```\r\n Class 2 3 4 6 7 8 9 10 12 13 14 \r\n 2 97.0 1.0 0.4 0.0 0.0 0.0 1.6 0.0 0.0 0.0 0.0 \r\n 3 0.2 99.0 0.0 0.0 0.0 0.0 0.4 0.0 0.0 0.0 0.4 \r\n 4 1.0 0.0 98.8 0.0 0.0 0.0 0.2 0.0 0.0 0.0 0.0 \r\n 6 0.0 0.0 0.0 99.4 0.0 0.0 0.0 0.6 0.0 0.0 0.0 \r\n 7 0.0 0.0 0.0 0.0 100.0 0.0 0.0 0.0 0.0 0.0 0.0 \r\n 8 0.0 0.0 0.0 0.0 0.0 98.6 0.0 1.4 0.0 0.0 0.0 \r\n 9 0.4 0.0 0.2 0.0 0.0 0.2 98.8 0.4 0.0 0.0 0.0 \r\n 10 0.0 0.0 0.0 0.0 0.0 1.4 0.0 98.6 0.0 0.0 0.0 \r\n 12 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 100.0 0.0 0.0 \r\n 13 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 99.8 0.2 \r\n 14 0.0 0.4 0.0 0.0 0.0 0.4 0.0 0.0 0.2 0.0 99.0 \r\n```\r\n\r\n####7. Result of the Principal Component Analysis:\r\n\r\nPrincipal Component Analysis (PCA) is a classical method in pattern recognition [Duda73]. PCA reduces the sample dimension in a linear way, giving the best representation in lower dimensions in the sense of keeping the maximum inertia. The best axis for representation is, however, not necessarily the best axis for discrimination. After PCA, features are selected according to the percentage of the initial inertia covered by the different axes, and the number of features is determined by the percentage of initial inertia to keep for the classification process.\r\n\r\nThis selection method has been applied to the texture_CR database. 
When quasi-linear correlations exist between some initial features, these redundant dimensions are removed by PCA, and this preprocessing is then recommended. In such a case, before PCA, the determinant of the data covariance matrix is near zero; the database is thus ill-conditioned for any process that uses this information (the quadratic classifier, for example).\r\n\r\nThe file `texture_PCA.dat.Z` is available for the texture database: it is the projection of the texture_CR database on its principal components, sorted in decreasing order of the related inertia percentage. So, to work with the database projected on its first x principal components, you only have to keep the first x attributes of the texture_PCA.dat database together with the class labels (last attribute).\r\n\r\nThe table below provides the inertia percentage associated with each eigenvalue, the principal component axes being sorted in decreasing order of the associated inertia percentage. 
99.85 percent of the total database inertia will remain if the 20 first principal components are kept.\r\n\r\n```\r\n Eigen Value Inertia Cumulated\r\n value percentage inertia\r\n\r\n 1 30.267500000 75.6687000000 75.6687000000 \r\n 2 3.6512500000 9.1281300000 84.7969000000 \r\n 3 2.2937000000 5.7342400000 90.5311000000 \r\n 4 1.7039700000 4.2599300000 94.7910000000 \r\n 5 0.6716540000 1.6791300000 96.4702000000 \r\n 6 0.5015290000 1.2538200000 97.7240000000 \r\n 7 0.1922830000 0.4807070000 98.2047000000 \r\n 8 0.1561070000 0.3902670000 98.5950000000 \r\n 9 0.1099570000 0.2748920000 98.8699000000 \r\n 10 0.0890891000 0.2227230000 99.0926000000 \r\n 11 0.0656016000 0.1640040000 99.2566000000 \r\n 12 0.0489988000 0.1224970000 99.3791000000 \r\n 13 0.0433819000 0.1084550000 99.4875000000 \r\n 14 0.0345022000 0.0862554000 99.5738000000 \r\n 15 0.0299203000 0.0748007000 99.6486000000 \r\n 16 0.0248857000 0.0622141000 99.7108000000 \r\n 17 0.0167901000 0.0419752000 99.7528000000 \r\n 18 0.0161633000 0.0404083000 99.7932000000 \r\n 19 0.0128898000 0.0322246000 99.8254000000 \r\n 20 0.0113884000 0.0284710000 99.8539000000 \r\n 21 0.0078481400 0.0196204000 99.8735000000 \r\n 22 0.0071527800 0.0178820000 99.8914000000 \r\n 23 0.0067661400 0.0169153000 99.9083000000 \r\n 24 0.0053149500 0.0132874000 99.9216000000 \r\n 25 0.0051102600 0.0127757000 99.9344000000 \r\n 26 0.0047116600 0.0117792000 99.9461000000 \r\n 27 0.0036193700 0.0090484300 99.9552000000 \r\n 28 0.0033222000 0.0083054900 99.9635000000 \r\n 29 0.0030722400 0.0076806100 99.9712000000 \r\n 30 0.0026373300 0.0065933300 99.9778000000 \r\n 31 0.0020996800 0.0052492000 99.9830000000 \r\n 32 0.0019376500 0.0048441200 99.9879000000 \r\n 33 0.0015642300 0.0039105700 99.9918000000 \r\n 34 0.0009679080 0.0024197700 99.9942000000 \r\n 35 0.0009578000 0.0023945000 99.9966000000 \r\n 36 0.0007379780 0.0018449400 99.9984000000 \r\n 37 0.0006280250 0.0015700600 100.000000000\r\n 38 0.0000000040 0.0000000099 100.000000000 \r\n 
39 0.0000000001 0.0000000003 100.000000000 \r\n 40 0.0000000008 0.0000000019 100.000000000 \r\n\r\n```\r\n\r\nThis matrix can be found in the texture_EV.dat file.\r\n\r\nDiscriminant Factorial Analysis (DFA) can be applied to a learning database where each learning sample belongs to a particular class [Duda73]. The number of discriminant features selected by DFA is a function of the number of classes (c) and of the number of input dimensions (d): it is equal to the minimum of d and c-1. In the usual case where d is greater than c, the output dimension is thus fixed to the number of classes minus one, and the discriminant axes are selected so as to maximize the between-class variance and minimize the within-class variance.\r\n\r\nThe discrimination power (the ratio of the projected between-class variance to the projected within-class variance) is not the same for each discriminant axis: it decreases from one axis to the next. So, for a problem with many classes, this preprocessing will not always be efficient, as the last output features will not be very discriminant. This analysis uses the inverse of the global covariance matrix, so the covariance matrix must be well conditioned (for example, a preliminary PCA can be applied to remove linearly correlated dimensions).\r\n\r\nDFA has been applied to the first 18 principal components of the texture_PCA database (i.e. keeping only the first 18 attributes before applying the DFA preprocessing) in order to build the texture_DFA.dat.Z database file, which has 10 dimensions (the texture database having 11 classes). For the texture database, experiments showed that DFA preprocessing is very useful and most of the time improves classifier performance.\r\n\r\n[Duda73] Duda, R.O. 
and Hart, P.E.,Pattern Classification and Scene Analysis, John Wiley & Sons, 1973.\r\n", "format": "ARFF", "uploader": "Rafael Gomes Mantovani", "uploader_id": 64, "visibility": "public", "creator": null, "contributor": null, "date": "2016-07-29 21:03:14", "update_comment": null, "last_update": "2016-07-29 21:03:14", "licence": "Public", "status": "active", "error_message": null, "url": "https:\/\/www.openml.org\/data\/download\/4535764\/phpBDgUyY", "default_target_attribute": "Class", "row_id_attribute": null, "ignore_attribute": null, "runs": 0, "suggest": { "input": [ "texture", "####1. Summary This database was generated by the Laboratory of Image Processing and Pattern Recognition (INPG-LTIRF) in the development of the Esprit project ELENA No. 6891 and the Esprit working group ATHOS No. 6620. ``` (a) Original source: P. Brodatz \"Textures: A Photographic Album for Artists and Designers\", Dover Publications,Inc.,New York, 1966. (b) Creation: Laboratory of Image Processing and Pattern Recognition Institut National Polytechnique de Grenoble INPG Laboratoire de Traitement d " ], "weight": 5 }, "qualities": { "NumberOfInstances": 5500, "NumberOfFeatures": 41, "NumberOfClasses": 11, "NumberOfMissingValues": 0, "NumberOfInstancesWithMissingValues": 0, "NumberOfNumericFeatures": 40, "NumberOfSymbolicFeatures": 1, "AutoCorrelation": 0.9981814875431897, "CfsSubsetEval_DecisionStumpAUC": 0.9625286727272728, "CfsSubsetEval_DecisionStumpErrRate": 0.08872727272727272, "CfsSubsetEval_DecisionStumpKappa": 0.9024, "CfsSubsetEval_NaiveBayesAUC": 0.9625286727272728, "CfsSubsetEval_NaiveBayesErrRate": 0.08872727272727272, "CfsSubsetEval_NaiveBayesKappa": 0.9024, "CfsSubsetEval_kNN1NAUC": 0.9625286727272728, "CfsSubsetEval_kNN1NErrRate": 0.08872727272727272, "CfsSubsetEval_kNN1NKappa": 0.9024, "ClassEntropy": 3.459431618637298, "DecisionStumpAUC": 0.7398999272727274, "DecisionStumpErrRate": 0.8185454545454546, "DecisionStumpKappa": 0.0996, "Dimensionality": 0.007454545454545454, 
"EquivalentNumberOfAtts": null, "J48.00001.AUC": 0.9627766000000001, "J48.00001.ErrRate": 0.08181818181818182, "J48.00001.Kappa": 0.91, "J48.0001.AUC": 0.9627766000000001, "J48.0001.ErrRate": 0.08181818181818182, "J48.0001.Kappa": 0.91, "J48.001.AUC": 0.9627766000000001, "J48.001.ErrRate": 0.08181818181818182, "J48.001.Kappa": 0.91, "MajorityClassPercentage": 9.090909090909092, "MajorityClassSize": 500, "MaxAttributeEntropy": null, "MaxKurtosisOfNumericAtts": 11.448611212294171, "MaxMeansOfNumericAtts": -0.24215909090909102, "MaxMutualInformation": null, "MaxNominalAttDistinctValues": 11, "MaxSkewnessOfNumericAtts": 2.6688506796156806, "MaxStdDevOfNumericAtts": 0.3483659144675927, "MeanAttributeEntropy": null, "MeanKurtosisOfNumericAtts": 1.1981350593397597, "MeanMeansOfNumericAtts": -0.6055166227272727, "MeanMutualInformation": null, "MeanNoiseToSignalRatio": null, "MeanNominalAttDistinctValues": 11, "MeanSkewnessOfNumericAtts": 0.2853136375729942, "MeanStdDevOfNumericAtts": 0.23320130174567713, "MinAttributeEntropy": null, "MinKurtosisOfNumericAtts": -1.080511792253421, "MinMeansOfNumericAtts": -1.0983054545454545, "MinMutualInformation": null, "MinNominalAttDistinctValues": 11, "MinSkewnessOfNumericAtts": -1.140240981713934, "MinStdDevOfNumericAtts": 0.13897688217291798, "MinorityClassPercentage": 9.090909090909092, "MinorityClassSize": 500, "NaiveBayesAUC": 0.9684679636363634, "NaiveBayesErrRate": 0.22618181818181818, "NaiveBayesKappa": 0.7512, "NumberOfBinaryFeatures": 0, "PercentageOfBinaryFeatures": 0, "PercentageOfInstancesWithMissingValues": 0, "PercentageOfMissingValues": 0, "PercentageOfNumericFeatures": 97.5609756097561, "PercentageOfSymbolicFeatures": 2.4390243902439024, "Quartile1AttributeEntropy": null, "Quartile1KurtosisOfNumericAtts": -0.6186434590446399, "Quartile1MeansOfNumericAtts": -0.7075200454545454, "Quartile1MutualInformation": null, "Quartile1SkewnessOfNumericAtts": -0.2864494969993817, "Quartile1StdDevOfNumericAtts": 0.20057235692977396, 
"Quartile2AttributeEntropy": null, "Quartile2KurtosisOfNumericAtts": 0.030664066794594103, "Quartile2MeansOfNumericAtts": -0.5804279999999999, "Quartile2MutualInformation": null, "Quartile2SkewnessOfNumericAtts": 0.027156539944742052, "Quartile2StdDevOfNumericAtts": 0.222246727329651, "Quartile3AttributeEntropy": null, "Quartile3KurtosisOfNumericAtts": 1.086225225943642, "Quartile3MeansOfNumericAtts": -0.48527563636363624, "Quartile3MutualInformation": null, "Quartile3SkewnessOfNumericAtts": 0.5188938465004845, "Quartile3StdDevOfNumericAtts": 0.25351127147986313, "REPTreeDepth1AUC": 0.9744050545454547, "REPTreeDepth1ErrRate": 0.10909090909090909, "REPTreeDepth1Kappa": 0.88, "REPTreeDepth2AUC": 0.9744050545454547, "REPTreeDepth2ErrRate": 0.10909090909090909, "REPTreeDepth2Kappa": 0.88, "REPTreeDepth3AUC": 0.9744050545454547, "REPTreeDepth3ErrRate": 0.10909090909090909, "REPTreeDepth3Kappa": 0.88, "RandomTreeDepth1AUC": 0.9410000000000002, "RandomTreeDepth1ErrRate": 0.10727272727272727, "RandomTreeDepth1Kappa": 0.882, "RandomTreeDepth2AUC": 0.9410000000000002, "RandomTreeDepth2ErrRate": 0.10727272727272727, "RandomTreeDepth2Kappa": 0.882, "RandomTreeDepth3AUC": 0.9410000000000002, "RandomTreeDepth3ErrRate": 0.10727272727272727, "RandomTreeDepth3Kappa": 0.882, "StdvNominalAttDistinctValues": 0, "kNN1NAUC": 0.9911119454545455, "kNN1NErrRate": 0.016181818181818183, "kNN1NKappa": 0.9822 }, "tags": [ { "tag": "study_14", "uploader": "1" }, { "tag": "study_1", "uploader": "0" }, { "tag": "study_141", "uploader": "0" }, { "tag": "study_129", "uploader": "0" }, { "tag": "study_143", "uploader": "0" }, { "tag": "study_234", "uploader": "0" } ], "features": [ { "name": "Class", "index": "40", "type": "nominal", "distinct": "11", "missing": "0", "target": "1", "distr": [ [ "1", "2", "3", "4", "5", "6", "7", "8", "9", "10", "11" ], [ [ "500", "0", "0", "0", "0", "0", "0", "0", "0", "0", "0" ], [ "0", "500", "0", "0", "0", "0", "0", "0", "0", "0", "0" ], [ "0", "0", "500", "0", 
"0", "0", "0", "0", "0", "0", "0" ], [ "0", "0", "0", "500", "0", "0", "0", "0", "0", "0", "0" ], [ "0", "0", "0", "0", "500", "0", "0", "0", "0", "0", "0" ], [ "0", "0", "0", "0", "0", "500", "0", "0", "0", "0", "0" ], [ "0", "0", "0", "0", "0", "0", "500", "0", "0", "0", "0" ], [ "0", "0", "0", "0", "0", "0", "0", "500", "0", "0", "0" ], [ "0", "0", "0", "0", "0", "0", "0", "0", "500", "0", "0" ], [ "0", "0", "0", "0", "0", "0", "0", "0", "0", "500", "0" ], [ "0", "0", "0", "0", "0", "0", "0", "0", "0", "0", "500" ] ] ] }, { "name": "V22", "index": "21", "type": "numeric", "distinct": "990", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V21", "index": "20", "type": "numeric", "distinct": "861", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V23", "index": "22", "type": "numeric", "distinct": "1223", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V24", "index": "23", "type": "numeric", "distinct": "1150", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V25", "index": "24", "type": "numeric", "distinct": "1112", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V26", "index": "25", "type": "numeric", "distinct": "1100", "missing": "0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" }, { "name": "V27", "index": "26", "type": "numeric", "distinct": "1010", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V28", "index": "27", "type": "numeric", "distinct": "1082", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V29", "index": "28", "type": "numeric", "distinct": "1110", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V30", "index": "29", "type": "numeric", "distinct": "1217", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V31", "index": "30", "type": "numeric", 
"distinct": "861", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V32", "index": "31", "type": "numeric", "distinct": "898", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V33", "index": "32", "type": "numeric", "distinct": "1309", "missing": "0", "min": "-1", "max": "1", "mean": "0", "stdev": "0" }, { "name": "V34", "index": "33", "type": "numeric", "distinct": "742", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V35", "index": "34", "type": "numeric", "distinct": "838", "missing": "0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" }, { "name": "V36", "index": "35", "type": "numeric", "distinct": "750", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V37", "index": "36", "type": "numeric", "distinct": "797", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V38", "index": "37", "type": "numeric", "distinct": "921", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V39", "index": "38", "type": "numeric", "distinct": "865", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V40", "index": "39", "type": "numeric", "distinct": "1252", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V11", "index": "10", "type": "numeric", "distinct": "861", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V2", "index": "1", "type": "numeric", "distinct": "979", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V3", "index": "2", "type": "numeric", "distinct": "1199", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V4", "index": "3", "type": "numeric", "distinct": "1072", "missing": "0", "min": "-1", "max": "1", "mean": "0", "stdev": "0" }, { "name": "V5", "index": "4", "type": "numeric", "distinct": "1025", "missing": 
"0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" }, { "name": "V6", "index": "5", "type": "numeric", "distinct": "961", "missing": "0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" }, { "name": "V7", "index": "6", "type": "numeric", "distinct": "965", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V8", "index": "7", "type": "numeric", "distinct": "1003", "missing": "0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" }, { "name": "V9", "index": "8", "type": "numeric", "distinct": "1032", "missing": "0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" }, { "name": "V10", "index": "9", "type": "numeric", "distinct": "1234", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V1", "index": "0", "type": "numeric", "distinct": "861", "missing": "0", "min": "-1", "max": "1", "mean": "-1", "stdev": "0" }, { "name": "V12", "index": "11", "type": "numeric", "distinct": "894", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V13", "index": "12", "type": "numeric", "distinct": "1300", "missing": "0", "min": "-1", "max": "1", "mean": "0", "stdev": "0" }, { "name": "V14", "index": "13", "type": "numeric", "distinct": "696", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V15", "index": "14", "type": "numeric", "distinct": "810", "missing": "0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" }, { "name": "V16", "index": "15", "type": "numeric", "distinct": "727", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V17", "index": "16", "type": "numeric", "distinct": "805", "missing": "0", "min": "-1", "max": "0", "mean": "-1", "stdev": "0" }, { "name": "V18", "index": "17", "type": "numeric", "distinct": "899", "missing": "0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" }, { "name": "V19", "index": "18", "type": "numeric", "distinct": "852", "missing": "0", "min": "-1", "max": "0", "mean": 
"0", "stdev": "0" }, { "name": "V20", "index": "19", "type": "numeric", "distinct": "1282", "missing": "0", "min": "-1", "max": "0", "mean": "0", "stdev": "0" } ], "nr_of_issues": 0, "nr_of_downvotes": 0, "nr_of_likes": 0, "nr_of_downloads": 0, "total_downloads": 0, "reach": 0, "reuse": 11, "impact_of_reuse": 0, "reach_of_reuse": 0, "impact": 11 }