Efficient solutions to the complex tasks currently addressed in neurocomputing require systems with sufficient generalization capability. At the same time, the resulting network structure should support an easy interpretation of the network function.
In this paper, we discuss a new feature selection technique, SCGSIR, based on the fast scaled conjugate gradient (SCG) method and sensitivity analysis. In particular, inhibiting the network's sensitivity enables successful pruning of the input neurons.
Experiments performed so far on the binary addition problem and on real-world data obtained from the World Bank yield promising results, with SCGSIR outperforming the reference techniques considered.
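To give a concrete flavor of sensitivity-based input pruning in general (this is a generic illustration, not the SCGSIR algorithm described in the paper), the sketch below estimates each input's mean absolute sensitivity for a small fixed network by finite differences and flags near-zero-sensitivity inputs as candidates for pruning. All names (`mlp`, `input_sensitivities`) and the thresholding rule are hypothetical choices for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    """One-hidden-layer network with tanh units (a stand-in for a trained net)."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

def input_sensitivities(f, X, eps=1e-5):
    """Mean absolute finite-difference sensitivity of f w.r.t. each input feature."""
    n, d = X.shape
    sens = np.zeros(d)
    for j in range(d):
        Xp = X.copy(); Xp[:, j] += eps
        Xm = X.copy(); Xm[:, j] -= eps
        sens[j] = np.mean(np.abs(f(Xp) - f(Xm)) / (2 * eps))
    return sens

# Toy network whose output ignores input 2 entirely (its incoming weights are zero),
# so sensitivity analysis should single that input out as prunable.
W1 = rng.normal(size=(3, 4)); W1[2, :] = 0.0
b1 = rng.normal(size=4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

X = rng.normal(size=(100, 3))
f = lambda x: mlp(x, W1, b1, W2, b2)
sens = input_sensitivities(f, X)

# Flag inputs whose sensitivity is negligible relative to the most sensitive one.
prunable = np.where(sens < 1e-3 * sens.max())[0]
print(prunable)  # → [2]
```

In a sensitivity-driven method such as the one the abstract describes, a criterion of this kind would be combined with training (e.g., a penalty suppressing sensitivity to irrelevant inputs) so that pruning decisions become sharper than for a plainly trained network.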