This paper presents a method for optimizing neural network architecture using the Self-Organizing Feature Map (SOFM) algorithm. The approach automates the design of feedforward neural network structures by evaluating candidate configurations, including the number of hidden layers, the number of neurons, and the connections, against performance metrics measured on multiple datasets. Experimental results demonstrate the effectiveness of the proposed approach in minimizing classification error on four distinct datasets.
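As a rough illustration of the underlying idea (not the paper's exact procedure), the sketch below assumes that each candidate feedforward architecture is encoded as a numeric vector (number of hidden layers, number of neurons, number of connections, observed classification error) and that a small SOFM is trained over these vectors so that map regions associated with low error can guide the architecture search; all names, parameters, and the encoding are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    def train_sofm(data, grid=(5, 5), iters=500, lr0=0.5, sigma0=2.0):
        """Train a basic self-organizing feature map on the row vectors in `data`."""
        h, w = grid
        dim = data.shape[1]
        weights = rng.random((h, w, dim))
        # Grid coordinates of every map node, used for neighborhood distances.
        coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
        for t in range(iters):
            x = data[rng.integers(len(data))]
            # Best-matching unit: the node whose weight vector is closest to x.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), (h, w))
            # Exponentially decaying learning rate and neighborhood radius.
            lr = lr0 * np.exp(-t / iters)
            sigma = sigma0 * np.exp(-t / iters)
            grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
            influence = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))[..., None]
            weights += lr * influence * (x - weights)
        return weights

    # Toy candidate architectures (hypothetical data):
    # columns = [hidden_layers, neurons, connections, classification_error].
    candidates = rng.random((40, 4))
    som_weights = train_sofm(candidates)

In this sketch the trained map simply clusters architecture descriptors; selecting new candidates from low-error regions of the map would be one way to use such clusters, under the stated assumptions.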