The choice of parameters and the design of the network architecture are important factors affecting the performance of deep neural networks. However, this task still relies heavily on trial and error guided by empirical results. Because there are many design and parameter choices, it is impractical to evaluate every configuration and find the optimal structure. In this paper, we propose a novel method that autonomously and simultaneously optimizes multiple parameters of any given deep neural network by using a modified generative adversarial network (GAN). In our approach, two different models compete and improve each other progressively. To demonstrate its generality, the proposed method was tested on three different neural network architectures and three distinct datasets and applications. The results show that the presented approach can simultaneously and successfully optimize multiple neural network parameters, achieving increased accuracy in all three scenarios.
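As an illustration only, the sketch below shows one way an adversarial-style, alternating propose-and-evaluate loop for multiple network parameters could be structured. The search space, the scoring function, and the preference-update rule are all hypothetical placeholders; the abstract does not specify the paper's actual GAN formulation, so this is not the authors' method, just a toy stand-in for the idea of two components progressively improving each other.

```python
# Toy sketch (assumed, not the paper's algorithm): a "proposer" samples
# hyperparameter configurations and a competing "critic" scores them; the
# proposer's sampling preferences are sharpened by the critic's feedback.
import random

# Hypothetical search space over multiple network parameters.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "hidden_units": [32, 64, 128],
    "dropout": [0.0, 0.25, 0.5],
}

def evaluate_config(config):
    """Stand-in for training a network and returning validation accuracy."""
    # Hypothetical objective with a single peak, plus noise to mimic real runs.
    score = 0.0
    score += 1.0 - abs(config["learning_rate"] - 1e-3) * 100
    score += config["hidden_units"] / 128.0
    score += 1.0 - abs(config["dropout"] - 0.25)
    return score + random.gauss(0, 0.05)

def propose(preferences):
    """Proposer: samples a configuration, biased by learned preferences."""
    return {k: random.choices(vals, weights=preferences[k])[0]
            for k, vals in SEARCH_SPACE.items()}

def adversarial_search(rounds=50):
    # Start with uniform preferences; the critic's scores reshape them.
    prefs = {k: [1.0] * len(v) for k, v in SEARCH_SPACE.items()}
    best_config, best_score = None, float("-inf")
    for _ in range(rounds):
        config = propose(prefs)
        score = evaluate_config(config)  # critic judges the proposal
        if score > best_score:
            best_config, best_score = config, score
        # Reward choices that come close to the current best, penalize the rest,
        # so proposer and critic keep pushing each other.
        for k, vals in SEARCH_SPACE.items():
            idx = vals.index(config[k])
            prefs[k][idx] *= 1.1 if score >= best_score - 0.1 else 0.95
    return best_config, best_score

if __name__ == "__main__":
    config, score = adversarial_search()
    print("best configuration:", config, "score:", round(score, 3))
```

In this toy version the "adversarial" element reduces to a multiplicative preference update driven by the critic's scores; the paper's actual GAN-based formulation would replace both the proposer and the critic with learned models.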