This paper presents results on Persian handwritten word recognition based on the Mixture of Experts (ME) technique. In the basic form of ME, the problem space is automatically partitioned into several subspaces for the experts, and the outputs of the experts are combined by a gating network. In our proposed model, we use a mixture of multilayer perceptron (MLP) experts with a momentum term in the classification phase. Adding this term has three effects on our system: (a) it increases the convergence rate, (b) it yields the optimum performance of the system, and (c) it helps the training escape local minima on the error surface. We construct three different Mixture of Experts structures. Experimental results for the proposed method show an error-rate reduction of 6.42% compared to the mixture of MLP experts. Comparison with the most closely related methods indicates that the proposed model yields an excellent recognition rate for handwritten word recognition.
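The gated combination described above, together with a momentum-term weight update, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the network sizes, initialization, toy shapes, and the names `MLPExpert`, `MixtureOfExperts`, and `momentum_step` are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MLPExpert:
    """One-hidden-layer MLP expert (illustrative sizes, not the paper's)."""
    def __init__(self, d_in, d_hid, d_out):
        self.W1 = rng.normal(0, 0.1, (d_in, d_hid))
        self.W2 = rng.normal(0, 0.1, (d_hid, d_out))
    def forward(self, x):
        h = np.tanh(x @ self.W1)
        return softmax(h @ self.W2)   # class-probability output

class MixtureOfExperts:
    """Experts blended by a softmax gating network over the same input."""
    def __init__(self, d_in, d_hid, d_out, n_experts):
        self.experts = [MLPExpert(d_in, d_hid, d_out) for _ in range(n_experts)]
        self.Wg = rng.normal(0, 0.1, (d_in, n_experts))   # gating weights
    def forward(self, x):
        g = softmax(x @ self.Wg)                                   # (B, K) gates
        ys = np.stack([e.forward(x) for e in self.experts], axis=1)  # (B, K, C)
        return np.einsum('bk,bkc->bc', g, ys)   # gate-weighted expert outputs

def momentum_step(w, grad, v, lr=0.1, mu=0.9):
    # Classic momentum term: v <- mu*v - lr*grad; w <- w + v.
    # This is the generic update; the paper's learning rates are not given here.
    v[:] = mu * v - lr * grad
    w += v
    return w, v
```

Because each expert emits a probability vector and the gate weights sum to one, the mixture output is itself a valid probability distribution over classes.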