In my talk I will present an efficient class of tools for optimization and constraint satisfaction problems. These methods are called evolutionary or genetic algorithms, since their basic concepts mimic the evolution of species. Such algorithms were first used to solve discrete problems such as scheduling. Later, modified versions proved efficient for a class of continuous problems where gradient methods and their variants fail, either because the objective function is not differentiable or because it has many local optima. An example of a specific continuous problem will be presented.
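The basic evolutionary loop referred to above can be sketched minimally. The test function, parameter values, and the particular selection, crossover, and mutation operators below are illustrative assumptions, not the speaker's actual method; the Rastrigin function merely stands in for a continuous objective with many local optima.

```python
import math
import random

def rastrigin(x):
    """Rastrigin function: many local optima, global minimum 0 at the origin."""
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def evolve(fitness, dim=2, pop_size=40, generations=300, sigma=0.5, seed=1):
    """Minimize `fitness` with truncation selection, averaging crossover,
    and Gaussian mutation whose step size decays over the generations.
    (All operator choices here are assumptions for illustration.)"""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.12, 5.12) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)       # pick two random parents
            children.append([(ai + bi) / 2.0 + rng.gauss(0.0, sigma)
                             for ai, bi in zip(a, b)])
        pop = parents + children
        sigma *= 0.99                           # anneal the mutation step
    return min(pop, key=fitness)

best = evolve(rastrigin)
```

Note that the loop uses only function evaluations, never gradients, which is why such methods remain applicable when the objective is non-differentiable or riddled with local optima.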
We shall consider weak solutions of nonlinear elliptic boundary value problems and of initial-boundary value problems for semilinear and nonlinear parabolic differential equations with certain nonlocal terms. We shall prove theorems on the number of solutions and find multiple solutions. These statements are based on arguments concerning fixed points of certain real functions and operators, respectively, together with existence and uniqueness theorems for partial differential equations without functional terms.
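As an illustration only (the specific equations are an assumption, not taken from the abstract), a typical elliptic model problem with a nonlocal term, and the reduction to fixed points of a real function that such multiplicity arguments rest on, might read:

```latex
% Illustrative model problem (assumed for exposition): a semilinear
% elliptic equation whose right-hand side depends nonlocally on the
% solution through its integral mean.
\[
  -\Delta u = f\Bigl(u, \int_\Omega u \, dx\Bigr) \ \text{in } \Omega,
  \qquad u = 0 \ \text{on } \partial\Omega .
\]
% Freezing the nonlocal value t = \int_\Omega u\,dx turns this into a
% local problem (without functional terms); if that problem has a
% unique solution u_t, the solutions of the nonlocal problem correspond
% to the fixed points of the real function
\[
  \varphi(t) = \int_\Omega u_t \, dx, \qquad \varphi(t) = t ,
\]
% so counting solutions reduces to counting fixed points of \varphi.
```

The number of solutions of the nonlocal problem is then governed by how many times the graph of the scalar function crosses the diagonal.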