Questions
- An important operation in evolutionary algorithms is recombination. Which of the following recombination (i.e., crossover) operators can we use for integers?
- single arithmetic
- subtree swapping
- uniform crossover
- whole arithmetic
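For reference, a minimal Python sketch (the function names are my own, not from any particular library) contrasting uniform crossover, which keeps integer genes integer, with whole arithmetic recombination, which in general produces real-valued offspring:

```python
import random

def uniform_crossover(parent1, parent2, p_swap=0.5):
    """Uniform crossover: each gene is swapped between the parents with
    probability p_swap, so integer genes stay integers."""
    child1, child2 = [], []
    for g1, g2 in zip(parent1, parent2):
        if random.random() < p_swap:
            g1, g2 = g2, g1
        child1.append(g1)
        child2.append(g2)
    return child1, child2

def whole_arithmetic_crossover(parent1, parent2, alpha=0.5):
    """Whole arithmetic recombination: a weighted average of the parents.
    The offspring are generally *not* integer vectors."""
    child1 = [alpha * g1 + (1 - alpha) * g2 for g1, g2 in zip(parent1, parent2)]
    child2 = [alpha * g2 + (1 - alpha) * g1 for g1, g2 in zip(parent1, parent2)]
    return child1, child2

p1, p2 = [1, 7, 3, 9], [4, 2, 8, 5]
print(uniform_crossover(p1, p2))          # integer offspring
print(whole_arithmetic_crossover(p1, p2)) # real-valued offspring, e.g. 2.5, 4.5, ...
```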
- What is backpropagation (in a nutshell) in the context of neural networks?
- Gradient descent + chain rule
	- Propagating backwards through a network
- Neuroevolutionary algorithm
- Belief propagation
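To make the "gradient descent + chain rule" option concrete, here is a minimal sketch of one backpropagation step for a single sigmoid unit; the data, weights, and learning rate are arbitrary illustrative values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy input, target, and parameters (chosen arbitrarily for illustration).
x = np.array([0.5, -1.0])        # input
y = 1.0                          # target
w = np.array([0.1, -0.2])        # weights
b = 0.0                          # bias

# Forward pass
z = w @ x + b
a = sigmoid(z)
loss = 0.5 * (a - y) ** 2

# Backward pass: chain rule, dL/dw = dL/da * da/dz * dz/dw
dL_da = a - y
da_dz = a * (1 - a)              # derivative of the sigmoid
dL_dz = dL_da * da_dz
dL_dw = dL_dz * x
dL_db = dL_dz

# Gradient-descent update
lr = 0.1
w -= lr * dL_dw
b -= lr * dL_db
```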
- In convolutional neural networks, using a pooling layer results in:
	- increasing the size of a feature map
	- adding zeros to the input
	- processing input in a recurrent manner
	- reducing the size of a feature map
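A small illustration (assuming 2x2 max pooling with stride 2, one common choice) of what a pooling layer does to the size of a feature map:

```python
import numpy as np

def max_pool_2x2(feature_map):
    """2x2 max pooling with stride 2: each output cell is the maximum of a
    2x2 block, so an HxW feature map becomes (H//2)x(W//2)."""
    h, w = feature_map.shape
    cropped = feature_map[:h - h % 2, :w - w % 2]
    return cropped.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

fmap = np.arange(16).reshape(4, 4)   # 4x4 feature map
print(max_pool_2x2(fmap).shape)      # (2, 2) -- the feature map shrinks
```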
- Can we train a neural network if it is non-differentiable?
- No.
- Yes, by using, e.g., an evolutionary algorithm.
- Yes, by using, e.g., convex optimization.
- Yes, by using, e.g., backpropagation.
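As an illustration of the evolutionary-algorithm option, a minimal (1+1)-style sketch that trains a model with a non-differentiable step activation; the data, mutation scale, and iteration count are arbitrary and for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def step(z):
    # Non-differentiable activation: gradient-based training is not applicable.
    return (z > 0).astype(float)

def fitness(w, X, y):
    # Negative number of misclassified points (higher is better).
    return -np.sum(step(X @ w) != y)

# Toy, linearly separable data (illustrative only).
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Minimal (1+1) evolution strategy: mutate, keep the mutant if it is not worse.
w = rng.normal(size=2)
for _ in range(200):
    candidate = w + rng.normal(scale=0.1, size=2)
    if fitness(candidate, X, y) >= fitness(w, X, y):
        w = candidate

print("misclassified:", -fitness(w, X, y))
```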
- Let us assume a neural network with multiple layers and a sigmoid function as the activation function. Such a neural network could suffer from a problem called:
- vanishing gradient problem
- credit assignment problem
- vanishing assignment problem
- credit gradient problem
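A short numerical illustration of why deep sigmoid networks can suffer from vanishing gradients: the sigmoid derivative is at most 0.25, so the chain-rule product accumulated across many layers shrinks exponentially with depth:

```python
def sigmoid(z):
    import math
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1 - s)   # at most 0.25, attained at z = 0

# The gradient reaching the first layer of an L-layer network contains a
# product of L activation derivatives; with sigmoids each factor is <= 0.25.
for depth in (1, 5, 10, 20):
    print(depth, sigmoid_prime(0.0) ** depth)
# depth 20: 0.25**20 ~ 9e-13 -- the gradient has effectively vanished.
```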
- Let us assume a random variable $x$ that is distributed according to $p(x \mid w)$, where $w$ denotes the parameters. We observe $N$ data points $\{x_1, \ldots, x_N\}$, and we assume that all observations are i.i.d. We want to find the parameters $w$. What do we call the objective function in this optimization problem?
- the fitness function
- the indicator function
- the likelihood function
- the survival function
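For reference, under the i.i.d. assumption the function being optimized factorizes over the observations (this uses only the symbols already defined in the question):

$$
L(w) = p(x_1, \ldots, x_N \mid w) = \prod_{n=1}^{N} p(x_n \mid w),
\qquad
\log L(w) = \sum_{n=1}^{N} \log p(x_n \mid w).
$$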
- Which of the terms below best describes the process of visiting regions of the search space that have not been visited before?
- Exploration
- Genetic algorithm
- Exploitation
- Metaheuristic
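A tiny sketch contrasting the two notions from the options above: sampling a random point anywhere in the search space corresponds to exploration (visiting new regions), while perturbing the best solution found so far corresponds to exploitation. The objective function and the 0.2 exploration probability are arbitrary illustrative choices:

```python
import math
import random

def objective(x):
    # Illustrative multimodal function on [-10, 10].
    return math.sin(3 * x) - 0.1 * x ** 2

random.seed(0)
best_x = random.uniform(-10, 10)
best_f = objective(best_x)

for _ in range(1000):
    if random.random() < 0.2:
        # Exploration: jump to a random, possibly unvisited region of the space.
        x = random.uniform(-10, 10)
    else:
        # Exploitation: refine the best solution found so far.
        x = best_x + random.gauss(0, 0.1)
    f = objective(x)
    if f > best_f:
        best_x, best_f = x, f

print(best_x, best_f)
```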