Publication Date: 1 May 2013

This work studies the relevance of time- and frequency-domain features of a scattered Ground Penetrating Radar (GPR) wave. Such waves are used to identify inclusions, such as reinforcement bars and fissures, in concrete structures. The right choice of features is fundamental to designing intelligent systems that support the detection, qualification, and quantification of fissures in concrete. Although the results are expected to extend to other types of materials, this work focuses on characterizing cylindrical inclusions inside a concrete structure, which may be conductive, water-filled, or air-filled. Feature selection is performed on simulated data, both noiseless and corrupted by additive white Gaussian noise. A set of features was extracted, and the selected ones are presented, indicating that time- and frequency-domain features are complementary and relevant. Classification and regression models trained with different numbers of features show that not all available features are needed to achieve satisfactory performance; moreover, reducing the number of features also reduces the computational burden.
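As a rough illustration of the kind of time- and frequency-domain features the abstract refers to, the sketch below extracts a few simple descriptors from a synthetic noisy pulse standing in for a scattered GPR A-scan. This is not the paper's feature set or pipeline; the sampling rate, centre frequency, noise level, and the Ricker-like wavelet are all assumptions chosen for the example.

```python
import numpy as np

def extract_features(trace, fs):
    """Extract a few illustrative time- and frequency-domain features
    from a single 1-D signal (GPR A-scan) sampled at fs Hz."""
    # Time-domain descriptors: peak amplitude, arrival time of the
    # strongest reflection, and total signal energy.
    t_feats = {
        "peak_amplitude": np.max(np.abs(trace)),
        "peak_time": np.argmax(np.abs(trace)) / fs,
        "energy": np.sum(trace ** 2),
    }
    # Frequency-domain descriptors from the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    f_feats = {
        "dominant_freq": freqs[np.argmax(spectrum)],
        "spectral_centroid": np.sum(freqs * spectrum) / np.sum(spectrum),
    }
    return {**t_feats, **f_feats}

# Synthetic example: a Ricker-like wavelet buried in white Gaussian
# noise, as a stand-in for a scattered GPR wave (assumed parameters).
fs = 1e9                         # 1 GHz sampling rate (assumed)
t = np.arange(0, 5e-8, 1.0 / fs)
f0 = 5e7                         # 50 MHz centre frequency (assumed)
arg = (np.pi * f0 * (t - 2.5e-8)) ** 2
pulse = (1 - 2 * arg) * np.exp(-arg)
rng = np.random.default_rng(0)
trace = pulse + 0.05 * rng.standard_normal(t.size)

features = extract_features(trace, fs)
```

The resulting feature vector (or a subset of it chosen by a selection algorithm such as RELIEF) could then feed a classifier or regressor, mirroring the trade-off the abstract describes between feature count and computational burden.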

