Normalization is a systematic approach to decomposing tables in order to eliminate duplication and unwanted data inconsistencies, such as insertion, update, and deletion anomalies.
It is important because it removes a variety of inconsistencies in the data that may hinder data analysis.
These anomalies can occur when data is inserted, updated, or deleted.
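As a rough illustration, the decomposition idea can be sketched in plain Python. This is a minimal sketch, assuming a hypothetical flat "orders" table that repeats each customer's name on every row; splitting it into a customers table and an orders table removes that duplication, so updating a name touches only one row.

```python
# Hypothetical denormalized table: customer details repeat on every order row.
denormalized = [
    {"order_id": 1, "customer_id": 10, "customer_name": "Alice", "item": "pen"},
    {"order_id": 2, "customer_id": 10, "customer_name": "Alice", "item": "ink"},
    {"order_id": 3, "customer_id": 11, "customer_name": "Bob",   "item": "pad"},
]

# Decompose into two tables: customers (one row per customer) and orders
# (which reference customers by key), eliminating the duplicated names.
customers = {}
orders = []
for row in denormalized:
    customers[row["customer_id"]] = {"customer_name": row["customer_name"]}
    orders.append({
        "order_id": row["order_id"],
        "customer_id": row["customer_id"],
        "item": row["item"],
    })

print(customers)
print(orders)
```

With this split, renaming a customer is a single update in `customers`, and deleting an order no longer risks losing the only copy of a customer's details.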
Deep learning is a type of machine learning based on artificial neural networks, in which multiple layers of processing are used to extract progressively higher-level features from data.