Meta-learning is a step towards artificial general intelligence, and neural architecture search is at its forefront. The field of neural architecture search is dominated by recurrent neural networks and evolutionary algorithms. State-of-the-art evolution-based neural architecture search algorithms evolve only the structure of the neural networks. This thesis proposes an evolution-based neural architecture search method that transfers both structure and knowledge together through the generations. Experiments are conducted to evaluate multi-objective optimization for assessing neural networks through both their knowledge and their structure. A trade-off of transferring knowledge is that bad traits, such as overfitting, are transferred as well. An alternative pattern-based representation is tested to explore how much of the knowledge should be transferred. A comparison between local-search hill climbing and evolution is also conducted to isolate the effect of maintaining a population. The proposed system finds a top-performing architecture in a short time. Transfer learning is shown to increase both the stability and the speed of evolution-based neural architecture search. Optimizing neural networks through multi-objective optimization works well given good objectives. Optimizing on structure yields a much more diverse population than optimizing on knowledge. The population is important, as the choices made during the search are crucial to its success.
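The core idea of transferring knowledge alongside structure can be sketched as a toy evolutionary loop: when a structural mutation changes one layer, the child inherits the parent's trained parameters for every unchanged layer and re-initializes only the mutated one. This is a minimal illustration under assumed simplifications, not the thesis implementation; the layer encoding, the scalar stand-in for trained weights, and the toy fitness function are all assumptions.

```python
import random

random.seed(0)

def random_layer():
    # toy architecture gene: a hidden-layer width from a small range
    return random.choice([8, 16, 32, 64])

def init_weights(width):
    # stand-in for freshly initialized parameters (one scalar per layer here)
    return random.gauss(0.0, 1.0)

def make_individual():
    # an individual carries both structure ("arch") and knowledge ("weights")
    arch = [random_layer() for _ in range(3)]
    return {"arch": arch, "weights": [init_weights(w) for w in arch]}

def fitness(ind):
    # stand-in for validation accuracy after a short training run
    return -abs(sum(ind["weights"]))

def mutate(parent):
    # structural mutation: change one layer's width
    child_arch = list(parent["arch"])
    i = random.randrange(len(child_arch))
    child_arch[i] = random_layer()
    # knowledge transfer: keep parent weights for unchanged layers,
    # re-initialize only the mutated layer
    child_weights = [
        parent["weights"][j] if child_arch[j] == parent["arch"][j]
        else init_weights(child_arch[j])
        for j in range(len(child_arch))
    ]
    return {"arch": child_arch, "weights": child_weights}

def evolve(pop_size=8, generations=10):
    # simple (mu + lambda)-style loop: keep the best half, mutate it
    population = [make_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        population = parents + [mutate(p) for p in parents]
    return max(population, key=fitness)

best = evolve()
print(best["arch"], fitness(best))
```

In a real system the weight copy would be layer-wise tensor inheritance and the fitness would come from training and validating each child, but the inheritance rule in `mutate` is the part that distinguishes structure-and-knowledge transfer from structure-only evolution.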