Very simple video showing annealing in metallurgy / jewelry making 👇

Annealing in metallurgy can also take this form

Remember how with **HillClimbing** and **TabuSearch** we looked at all the neighbors and picked the best one? Simulated annealing does something similar, but with some probability it is willing to accept a worse solution. If we accept a worse solution at the right moment, we can later reach a better one, i.e. escape the local minimum/maximum.

Before we dive into the algorithm, let me briefly mention when simulated annealing is used. It is an ideal candidate when we need to optimize several parameters of the same type at once (vectors). A classic example is the traveling salesman problem, or optimizing fuel consumption along a route. If you want to try this algorithm, I again recommend codingame.com:

- The traveling salesman problem.
- Landing on Mars.
- Bot programming - autonomous vehicle races.
- and more.

I personally use this algorithm when **I want to reach at least a satisfactory solution extremely quickly.** If I want the solution to be as accurate as possible, I turn to genetic algorithms, which we will cover later.

As with other optimization algorithms, we start by creating a random solution. Its optimization, however, consists of lowering the temperature a predefined number of times (n); based on this temperature and the generated neighbor, we either accept or reject that neighbor. A simplified form of the algorithm works like this:

1. **solution** = random solution
2. **temperature** = initial temperature
3. repeat **n** times:

    a. **neighbor_solution** = neighbor generated from **solution**

    b. is **neighbor_solution** better than **solution**?

    i. true ⇒ **solution** = **neighbor_solution**

    ii. false ⇒ **solution** = **neighbor_solution**, but only with a probability that depends on the **temperature**

    c. lower the **temperature**

4. Return the best **solution** found so far
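The steps above can be sketched in Python like this. This is a minimal sketch of my own, not the article's GitHub code; the helpers `random_solution`, `neighbor_of`, and `score` are placeholders, and I assume a minimization problem (lower score is better):

```python
import math
import random

def simulated_annealing(random_solution, neighbor_of, score,
                        initial_temperature, cooling, n):
    # Steps 1 & 2: random starting solution and initial temperature
    solution = random_solution()
    temperature = initial_temperature
    best = solution

    for _ in range(n):                        # step 3: repeat n times
        neighbor = neighbor_of(solution)      # 3a: generate a neighbor
        delta = score(neighbor) - score(solution)
        # 3b: a better neighbor is always accepted; a worse one only
        # with probability e^(-ΔE / t)
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            solution = neighbor
        if score(solution) < score(best):     # remember the best solution
            best = solution
        temperature = cooling(temperature)    # 3c: lower the temperature
    return best                               # step 4
```

For example, minimizing `x * x` over the integers with a ±1 neighborhood and geometric cooling converges to (or near) zero within a few thousand iterations.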

But when is the temperature high enough for us to accept the **neighbor_solution** (point 3b)? This is where the classic formula, derived from observing annealing, comes into play. We accept **neighbor_solution** with probability:

*P(∆E) = e^{−∆E/t}*

Where ∆E is the score difference between the neighboring solution and the original solution (for a minimization problem, a positive ∆E means the neighbor is worse). Although the formula looks a little scary, implementing it is actually simple.
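The acceptance test can be written as a small function. A sketch assuming minimization (positive ∆E = worse neighbor):

```python
import math
import random

def accept(delta_e: float, temperature: float) -> bool:
    """Metropolis-style acceptance: always take an improvement,
    take a worse neighbor with probability e^(-ΔE / t)."""
    if delta_e <= 0:          # neighbor is better or equal -> accept
        return True
    if temperature <= 0:      # fully cooled -> behave greedily
        return False
    return random.random() < math.exp(-delta_e / temperature)
```

Note that at a high temperature the probability approaches 1 even for bad moves, while at a low temperature it approaches 0, so the algorithm gradually turns into plain hill climbing.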

We can reduce the temperature in different ways (point 3c). You can probably think of several ways, but let's write them down:

- Linear temperature reduction: T' = T - α
- Geometric temperature reduction: T' = T · α
- Slow temperature reduction: T' = T/(1+βT)
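The three cooling schedules are one-liners. The default values of α and β below are illustrative choices of mine, not recommendations from the article:

```python
def linear_cooling(t: float, alpha: float = 0.5) -> float:
    """T' = T - α: temperature drops by a constant each step."""
    return t - alpha

def geometric_cooling(t: float, alpha: float = 0.95) -> float:
    """T' = T * α: temperature drops by a constant factor (0 < α < 1)."""
    return t * alpha

def slow_cooling(t: float, beta: float = 0.001) -> float:
    """T' = T / (1 + β*T): the higher the temperature, the faster it falls."""
    return t / (1 + beta * t)
```

Geometric cooling is the one I see most often in practice, since it never goes negative and spends more iterations at low temperatures.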

When I studied this algorithm, it became clear that many people have different ideas about how to use it. There are basically 2 main variants:

- We generate a neighbor with a small change and accept it with a certain probability.
- The higher the temperature, the more significant the changes we are allowed to make to the neighbor.

Furthermore, implementations can vary in when the temperature changes: some change it in every iteration, others let the algorithm cool down only every n iterations.

There is simply no universal rule for setting the initial temperature, nor for how fast to let it cool down (the value of α).

Let's see again how the eight queens problem can be solved with this algorithm. If you are not yet familiar with this problem, see the article dedicated to TabuSearch.

1. We create a random **solution** (1).

2. We start the main cycle (3) **n** times.

3. In each iteration of the cycle we compute a new **temperature** (3c). Although I wrote about gradually lowering the temperature, a lower and lower value can also be derived directly from the iteration number.

4. We generate a new **neighbor_solution** (3a). In our case, I move one random queen to a random row in the same column.

5. If we pass the acceptance function (3b), we save the new neighbor as the current solution.
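These steps can be sketched as follows. This is my own compact version, not the program from GitHub; the board is a list where the index is the column and the value is the row, and the temperature is derived from the iteration number:

```python
import math
import random

def conflicts(board):
    """Number of attacking queen pairs (same row or same diagonal;
    columns are conflict-free by construction)."""
    n = len(board)
    return sum(
        1
        for a in range(n)
        for b in range(a + 1, n)
        if board[a] == board[b] or abs(board[a] - board[b]) == abs(a - b)
    )

def solve_queens(n, iterations=50_000):
    board = [random.randrange(n) for _ in range(n)]    # 1. random solution
    for i in range(iterations):                        # 2. main cycle
        if conflicts(board) == 0:
            return board                               # solved early
        temperature = 1.0 - i / iterations             # 3. temperature from iteration
        neighbor = board[:]                            # 4. move one random queen
        neighbor[random.randrange(n)] = random.randrange(n)
        delta = conflicts(neighbor) - conflicts(board)
        # 5. acceptance function: better (or equal), or worse with e^(-ΔE/t)
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            board = neighbor
    return board                                       # may still have conflicts
```

For 8 queens this usually finds a conflict-free board well within the iteration budget; a simple restart loop covers the occasional unlucky run.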

**You can find the entire program on GitHub**

Unlike TabuSearch, simulated annealing found a solution for 20 queens, and found it in just 30 milliseconds; for 40 queens in 40-200 ms. For 50 queens I managed to find a solution in 200-800 ms. The biggest chessboard I solved was 80x80 (~10 s). Such results are unimaginable with the previous algorithms.

Solution "8, 13, 3, 17, 7, 10, 4, 0, 18, 15, 6, 11, 19, 12, 16, 2, 5, 1, 9, 14" found in 3028 iterations and 30ms

I have already mentioned the traveling salesman problem. It deserves a separate article, so I will not cover it here. Using simulated annealing, my solution looks like this:

For 50 cities, simulated annealing finds a satisfactory solution within a few seconds.

❗️ Attention! The video is significantly slowed down 🙂

So what? Are you starting to feel like we're getting closer to slightly smarter programs? The genetic algorithm awaits us next time. One of my favorites!

Do you need help with your digital product? Contact us!

hello@goodrequest.com
