The hyperparameters of a neural network mainly include the learning rate, cost function, activation function, number of training epochs, batch size, number of network layers, and number of neurons in each layer. The first three mainly affect the convergence rate of the model, the number of epochs and the batch size mainly determine the training time, and the accuracy of the model depends chiefly on the number of layers and neurons.
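To make this grouping concrete, the following is a minimal sketch of how such hyperparameters might be organized by role; all concrete values here are illustrative assumptions, not the settings used in this work.

```python
# Hypothetical hyperparameter grouping mirroring the roles described above.
# Every concrete value is an illustrative assumption.
hyperparameters = {
    # mainly affect the convergence rate of the model
    "learning_rate": 1e-3,
    "loss": "categorical_crossentropy",
    "activation": "relu",
    # mainly determine the training time of the model
    "epochs": 30,
    "batch_size": 64,
    # mainly determine the accuracy (capacity) of the model
    "num_conv_layers": 4,
    "filters_per_layer": [16, 32, 64, 64],
}
```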
In the 1D-CNN model, the window corresponds to the convolution kernel of the convolutional layers and the pooling window of the pooling layers. The first convolutional layer uses a larger kernel, which yields a larger receptive field, analogous to windowing a signal: the wider the window, the higher the frequency resolution. The activation function is ReLU, which does not saturate for positive inputs; compared with other activation functions, it updates the weights faster and reduces the risk of vanishing and exploding gradients. In addition, to improve the generalization and noise robustness of the model, dropout with a rate of 0.6 is applied after the first large-kernel convolutional layer.
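As an illustration, the following Keras sketch builds a 1D CNN with a wide first-layer kernel followed by 0.6 dropout, as described above. The input length, number of classes, filter counts, kernel sizes, stride, and the later layers are assumptions made for the example, not the exact architecture of this work.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_model(input_length=2048, num_classes=10):
    # Sketch only: input length, class count, filter counts and kernel
    # sizes are illustrative assumptions.
    model = tf.keras.Sequential([
        # Wide first-layer kernel -> large receptive field (signal windowing)
        layers.Conv1D(16, kernel_size=64, strides=8, padding="same",
                      activation="relu", input_shape=(input_length, 1)),
        layers.Dropout(0.6),              # regularization / noise robustness
        layers.MaxPooling1D(pool_size=2),
        # Subsequent layers use small kernels
        layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```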
The suitability of the selected parameters can be verified from the loss curves of the training and validation sets, shown in the figure.
As the figure shows, the model begins to converge around the fifth epoch, and the training and validation loss curves essentially coincide after the 20th epoch, with both losses settling at about 0.37. This indicates that the model generalizes well and that the selected parameters are reasonable.
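Loss curves of this kind could be produced from the Keras training history, as in the following sketch that reuses the model from the previous example; the data arrays `x_train` and `y_train`, the validation split, the number of epochs, and the batch size are illustrative assumptions.

```python
import matplotlib.pyplot as plt

# Train and record per-epoch losses; the 20% validation split, 30 epochs
# and batch size of 64 are illustrative assumptions.
model = build_model()
history = model.fit(x_train, y_train,
                    validation_split=0.2,
                    epochs=30, batch_size=64)

# Plot training vs. validation loss to check convergence and overfitting.
plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["val_loss"], label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```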