This paper analyzes a modified feedforward multilayer perceptron (FMP) in which each neuron incorporates an autoregressive moving average (ARMA) model. The backpropagation learning algorithm is applied to this network, with stability analyzed via a Lyapunov function. Adaptive learning factors are introduced to enhance the network's performance, and this approach is shown to improve training efficiency while mitigating oscillations. Simulation results confirm that the adaptive learning factor leads to better convergence in neural network training than static learning factors.
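The abstract does not specify the exact adaptation rule, but the core idea of an adaptive learning factor can be illustrated with a common "bold driver" heuristic: grow the step size while the loss decreases, and shrink it (rejecting the step) when the loss increases. The following sketch applies this to a toy quadratic objective; the function names, the update constants `up`/`down`, and the objective itself are illustrative assumptions, not the paper's method.

```python
import numpy as np

def train_adaptive(grad_fn, loss_fn, w, lr=0.1, up=1.05, down=0.5, steps=100):
    """Gradient descent with an adaptive learning factor (bold-driver style):
    increase lr while the loss decreases; on an increase, reject the step
    and cut lr. Illustrative only; the paper's own rule may differ."""
    loss = loss_fn(w)
    for _ in range(steps):
        w_new = w - lr * grad_fn(w)
        loss_new = loss_fn(w_new)
        if loss_new < loss:        # progress: accept the step and speed up
            w, loss = w_new, loss_new
            lr *= up
        else:                      # overshoot/oscillation: back off
            lr *= down
    return w, loss, lr

# Toy objective: loss = ||w||^2, gradient = 2w
w0 = np.array([3.0, -2.0])
w, loss, lr = train_adaptive(lambda w: 2 * w, lambda w: float(w @ w), w0)
```

With a fixed (static) learning factor, the same loop either converges slowly (lr too small) or oscillates and diverges (lr too large); the adaptive factor automatically settles near the largest stable step size, which is the convergence benefit the abstract describes.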