This document summarizes all lectures. I used it to study for the exam and to make my cheat sheet, and I passed the exam with a 9; I hope you will do the same :)
Note: this document makes it easy to build your cheat sheet. Make your cheat sheet, then use this document to study. Good luck!
Lecture 1
A general solution is not the same as a specific solution; hence, a problem is not the same as a problem
instance. Take, for example, the N-queens problem (a problem) versus the 8-queens problem (a problem
instance). The properties of a solver/configuration:
- Constructive method: works by extending an empty solution.
- Iterative improvement method: works by improving a complete solution.
- Blind: uses no heuristics; does not try to be smart in any way.
- Heuristic: tries to maximize improvement via an "educated guess".
- Correct: the end solution is a correct one (is allowed to be a solution); every
configuration the solver can reach is correct.
- Incomplete: at each step in the search trajectory, the configuration is not yet a complete solution.
With a correct but incomplete search trajectory: once the configuration is complete, it is also directly a
correct one.
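A minimal sketch of the two method types, using N-queens (helper names are my own, not from the lecture): the constructive solver extends an empty solution one queen at a time, while the iterative-improvement solver starts from a complete random board and repeatedly repairs it.

```python
import random

def conflicts(board):
    """Number of attacking queen pairs; board[i] = row of the queen in column i."""
    n = len(board)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if board[i] == board[j] or abs(board[i] - board[j]) == j - i
    )

def constructive(n):
    """Constructive method: extend an empty solution column by column (backtracking)."""
    def extend(board):
        if len(board) == n:
            return board
        for row in range(n):
            if conflicts(board + [row]) == 0:          # only correct extensions
                result = extend(board + [row])
                if result is not None:
                    return result
        return None
    return extend([])

def iterative_improvement(n, steps=10_000):
    """Iterative improvement: start from a random complete board and reduce conflicts."""
    board = [random.randrange(n) for _ in range(n)]
    for _ in range(steps):
        if conflicts(board) == 0:
            return board
        col = random.randrange(n)
        # move this column's queen to its least-conflicting row
        board[col] = min(
            range(n),
            key=lambda r: conflicts(board[:col] + [r] + board[col + 1:]),
        )
    return board

print(conflicts(constructive(8)))  # 0: the constructive solver only visits correct configurations
```

Note how the constructive solver's trajectory is correct but incomplete (partial boards never contain an attack), whereas the iterative-improvement solver moves through complete but possibly incorrect configurations.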
The metaphor maps evolution onto problem solving:

Evolution       Problem solving
Environment     Problem
Individual      Candidate solution
Fitness         Quality

Fitness represents the chances of survival and reproduction; quality represents the chance of seeding new
solutions.
Lecture 2 (chapter 1: problems to be solved)
Problems can be classified in different ways: black box model, search problems, optimization vs
constraint satisfaction, NP problems.
- Black box model
The black box model consists of three components: input, model, and output. When one
component is unknown, we get a new problem type.
When the model and the desired output are known, the task is to find the inputs; this is called
optimization. Examples: timetables, design specifications, the travelling salesman problem.
When we have corresponding sets of inputs and outputs and seek a model that delivers correct
output for every known input, this is called modelling. Modelling problems CAN be transformed
into optimization problems. Examples are: evolutionary machine learning, predicting stock
exchange and voice control systems for smart homes.
When we have a given model and wish to know the outputs that arise under different input
conditions, this is called simulation. Often used to answer the “what-if” question in evolving
dynamic environments. Examples include: weather forecast systems, impact analysis of new tax
systems and evolutionary economics/artificial life.
- Search problems
Simulation is different from optimization/modelling. Optimization/modelling problems search
through a huge space of possibilities. The search space is a collection of all objects of interest
including the desired solution. The distinction between (search) problems and problem-solvers is
that (search) problems define search spaces and problem-solvers move through search spaces (to
find a solution).
- Optimization vs constraint satisfaction
Objective function: a way of assigning a value to a possible solution that reflects its quality on a scale.
You can maximize the objective function (e.g., the number of un-checked queens) or minimize the objective
function (e.g., the length of a tour visiting a given set of cities). Constraint: a binary evaluation telling
whether a given requirement holds or not; whether "yes" is the desired answer depends on the constraint.
Combining the two gives the following table:

                 Objective function          No objective function
Constraints      Constrained optimization    Constraint satisfaction
No constraints   Free optimization           No problem

Constraint problems can be transformed into optimization problems.
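A sketch of that transformation (the toy constraints are my own, not from the lecture): count each violated constraint as a penalty, turning the yes/no constraint question into an objective function to minimize. A candidate satisfies all constraints exactly when the objective reaches zero.

```python
# Hypothetical constraint-satisfaction problem over two variables a and b.
constraints = [
    lambda x: x["a"] + x["b"] == 10,   # a + b must equal 10
    lambda x: x["a"] > x["b"],         # a must exceed b
    lambda x: x["b"] >= 2,             # b must be at least 2
]

def objective(candidate):
    """Objective to minimize: the number of violated constraints."""
    return sum(1 for c in constraints if not c(candidate))

print(objective({"a": 3, "b": 7}))  # 1: violates a > b
print(objective({"a": 7, "b": 3}))  # 0: satisfies every constraint, so it is a solution
```

Any optimizer that drives this objective to 0 has thereby solved the original constraint-satisfaction problem.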
- NP problems
So far, the problem type depended on the problem only. Problems can also be classified by
properties of the problem solver. The benefit is that we can define "problem
difficulty", i.e., when a problem is hard to solve. Key notions are:
- Problem size: number of problem variables (dimensionality) and the number of different
values for the problem variables
- Running-time: the number of operations the algorithm takes to terminate, usually measured
worst-case as a function of problem size; it can be polynomial, super-polynomial, or exponential
- Problem reduction: transforming the current problem into another via mapping
The hardness/complexity of a problem can now be classified as:
- Class P: some algorithm can solve the problem in polynomial time (worst-case running-time
for problem size n is less than F(n) for some polynomial formula F)
- Class NP: problem can be solved and any solution can be verified within polynomial time by
some algorithm. Note: P is a subset of NP.
- Class NP-complete: problem belongs to class NP and any other problem in NP can be
reduced to this problem by an algorithm running in polynomial time.
- Class NP-hard: problem is at least as hard as any other problem in NP-complete but solution
cannot necessarily be verified in polynomial time.
P is different from NP-hard. It is unknown whether P is different from NP. If P = NP, then P = NP = NP-complete.
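The NP definition can be illustrated with subset-sum (my own example, not from the lecture): finding a subset that hits a target may require trying all 2^n subsets, but verifying a proposed solution (a "certificate") takes only polynomial, here linear, time.

```python
from itertools import combinations

def verify(numbers, subset, target):
    """Polynomial-time (O(n)) check that a certificate solves the instance."""
    return set(subset) <= set(numbers) and sum(subset) == target

def solve(numbers, target):
    """Brute-force search: the worst case tries all 2^n subsets (super-polynomial)."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 34, 4, 12, 5, 2]
cert = solve(nums, 9)
print(cert)                   # [4, 5]
print(verify(nums, cert, 9))  # True
```

This asymmetry is exactly what class NP captures: cheap verification, with no known guarantee of cheap solving.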
Lecture 3 (chapter 2: The origins)
If evolution can develop intelligence, then artificial evolution can develop artificial intelligence.
Evolutionary algorithms can be used as heuristic problem solvers. A robust problem-solving
technique is needed: algorithms that work on a wide range of problems without much problem-specific
adjustment.
Turing (1948) already proposed "genetical/evolutionary search". Bremermann (1962): optimization
through evolution and recombination. Rechenberg (1964): introduces evolution strategies. L. Fogel,
Owens and Walsh (1965): introduce evolutionary programming. Holland (1975): introduces genetic
algorithms. Koza (1992): genetic programming.
Darwinian evolution, pillars:
Survival of the fittest arises because all environments have finite resources and life forms have basic
instincts/lifecycles geared towards reproduction (therefore some kind of selection is inevitable). Those
individuals that compete for the resources most effectively have an increased chance of reproduction.
Fitness in natural evolution is a derived, secondary measure, i.e., we (humans) assign a high fitness
to individuals with many offspring.
Diversity drives change. Phenotypic traits: Physical and behavioral differences that affect response to
environment. Partly determined by inheritance, partly by factors during development (nature vs
nurture). Unique to each individual, partly as a result of random changes. If phenotypic traits lead to
higher chances of reproduction, they can be inherited. Then they will tend to increase in subsequent
generations, leading to new combinations of traits.