The structure of a simple backpropagation neural network is
shown below. It consists of an input layer, whose input vector X
holds the design variables; a number of hidden layers; and an
output layer, which produces the vector Y of objective-function
values.
*Figure: Structure of a simple backpropagation neural network (input layer X, hidden layers, output layer Y).*
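
As a minimal sketch of what such a network computes, the forward pass below maps a design-variable vector X to a predicted objective vector Y. It assumes NumPy, tanh activations in the hidden layers, and a linear output layer; the layer sizes, weight initialization, and example values are illustrative and not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layers(sizes):
    """Random weights and biases for consecutive layer sizes, e.g. [3, 8, 8, 2]."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x, layers):
    """Propagate the design-variable vector x through the network."""
    a = x
    for i, (W, b) in enumerate(layers):
        z = a @ W + b
        # tanh in the hidden layers, linear in the output layer
        a = np.tanh(z) if i < len(layers) - 1 else z
    return a  # vector of predicted objective-function values Y

# Example: 3 design variables in, 2 objective values out, 2 hidden layers.
layers = init_layers([3, 8, 8, 2])
X = np.array([0.5, -1.2, 0.3])   # design variables
Y = forward(X, layers)           # predicted objectives
print(Y)
```

Training by backpropagation would then adjust the weights and biases to minimize the error between Y and the true objective values on a set of sampled designs.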