Challenge: sum of squares. Let's break 299
Challenge: sum of squares. Let's break 299 - pier4r - 01-18-2018 09:32 PM

Some of you may have seen the Numberphile video about the "sum of squares" problem. The problem is simple. One has the following sequence: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15. The objective is to rearrange it so that every two adjacent numbers add up to the square of an integer, using all of the numbers. Example: 1, 3, 6, 10, 15 (every adjacent pair sums to a square, but it fails because it doesn't use all the numbers).

Don't read below if you want to give it a try.

Code:

Then in the video they said they tested all the sequences up to 299. From 25 to 299 they found a way, and likely there will always be a way. The point is: using only real calculators (surely someone has already run some programs up to one million on a PC/tablet/smartphone), could we break 299? I guess the 50g, Prime, and DM42 have good chances to break 299 if programmed in a clever way, given enough time.
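To make the rules concrete (and without touching the spoiler above), here is a minimal checker sketch in C++. It is not code from the thread; it simply tests the two conditions on the failing example from the post: the arrangement must use each of 1..n exactly once, and every adjacent pair must sum to a perfect square.

Code:
#include <cmath>
#include <cstdio>
#include <vector>

// Returns true if x is a perfect square.
static bool is_square(int x) {
    int r = (int) std::lround(std::sqrt((double) x));
    return r * r == x;
}

// Returns true if seq is a valid answer for 1..n: a permutation of 1..n
// in which every adjacent pair sums to a perfect square.
static bool is_valid(const std::vector<int> &seq, int n) {
    if ((int) seq.size() != n) return false;               // must use all n numbers
    std::vector<bool> seen(n + 1, false);
    for (int v : seq) {
        if (v < 1 || v > n || seen[v]) return false;        // not a permutation of 1..n
        seen[v] = true;
    }
    for (size_t i = 0; i + 1 < seq.size(); i++)
        if (!is_square(seq[i] + seq[i + 1])) return false;  // adjacent sum not a square
    return true;
}

int main() {
    // The failing example from the post: adjacent sums are all squares,
    // but the sequence doesn't use all of 1..15.
    std::vector<int> attempt = {1, 3, 6, 10, 15};
    std::printf("valid for n = 15? %s\n", is_valid(attempt, 15) ? "yes" : "no");
    return 0;
}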
RE: Challenge: sum of squares. Let's break 299 - TheKaneB - 01-18-2018 10:58 PM

That's a tricky one! I bet the real challenge would be to find an efficient algorithm that cuts down the number of trials. Trying every possible permutation of 300 numbers doesn't look like a clever way to find a good sequence quickly.

RE: Challenge: sum of squares. Let's break 299 - Didier Lachieze - 01-19-2018 12:08 AM

I have a program that finds the solution for the sequence 1 to 15 in 2 seconds on my Prime, but then it takes 45 seconds for the sequence 1 to 25. So it needs a significant optimization to reach 299. Solution for 1 to 25 is:

Code: spoiler

RE: Challenge: sum of squares. Let's break 299 - Paul Dale - 01-19-2018 04:29 AM

If you follow the comments and links from the Numberphile video, it looks like the maximum has been pushed up very significantly, assuming the implementation was correct and addressed the problem in question, etc. Barring an inspired approach, the problem comes down to finding a Hamilton path in a graph. This is an NP-complete problem.

Pauli

RE: Challenge: sum of squares. Let's break 299 - Thomas Okken - 01-19-2018 04:29 AM

(01-18-2018 09:32 PM)pier4r Wrote: Then in the video they said they tested all the sequences up to 299. From 25 to 299 they found a way, and likely there will always be a way.

It doesn't say so in so many words, but that made me think that there were no solutions below 25, and that is not the case:

n = 15: 8 1 15 10 6 3 13 12 4 5 11 14 2 7 9
n = 16: 8 1 15 10 6 3 13 12 4 5 11 14 2 7 9 16
n = 23: 2 23 13 12 4 21 15 10 6 19 17 8 1 3 22 14 11 5 20 16 9 7 18

I cheated and wrote a simple brute-force backtracking algorithm in C++:

Code: #include <stdio.h>

In a nutshell: the algorithm would have to be a lot smarter to make a dent in this problem. It starts to get painfully slow in the 60s -- on a 1.3 GHz i5, not a speed demon by today's standards, but still, imagine how this would fare on a calculator. The case n = 60 took 6,619,547,574 iterations...

RE: Challenge: sum of squares. Let's break 299 - Paul Dale - 01-19-2018 06:31 AM

Instead of a brute-force backtracking algorithm, build a graph with the numbers as nodes and edges where they sum to a square. Then find a Hamilton walk with a backtracking search.

Pauli

RE: Challenge: sum of squares. Let's break 299 - Thomas Okken - 01-19-2018 06:48 AM

(01-19-2018 06:31 AM)Paul Dale Wrote: Instead of a brute-force backtracking algorithm, build a graph with the numbers as nodes and edges where they sum to a square. Then find a Hamilton walk with a backtracking search.

Why would that perform any better than my algorithm? For each number, I'm already finding potential successors in O(1) time, and I'm eliminating already-visited numbers with an array lookup. Using a graph *might* be more efficient by some constant, but it won't do anything for the overall time complexity.
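Thomas's C++ program is only partially reproduced in this printable version. The following is a rough sketch of the kind of brute-force backtracking search he describes (not his actual code): candidate successors of the current value v are generated directly as s*s - v for each square in range, and already-visited numbers are skipped with a boolean array.

Code:
#include <cstdio>
#include <vector>

static int n;                        // we search arrangements of 1..n
static std::vector<int> path;        // current partial arrangement
static std::vector<bool> used;       // used[v] == true if v is already in path
static long long calls;              // number of recursive calls for this n

// Try to extend the current partial path to a full arrangement of 1..n.
static bool extend() {
    calls++;
    if ((int) path.size() == n) return true;                // all numbers placed
    int v = path.back();
    // Successors w of v satisfy v + w = s*s with 1 <= w <= n and w unused.
    for (int s = 2; s * s <= v + n; s++) {
        int w = s * s - v;
        if (w < 1 || w > n || w == v || used[w]) continue;
        used[w] = true;
        path.push_back(w);
        if (extend()) return true;
        path.pop_back();                                     // backtrack
        used[w] = false;
    }
    return false;
}

int main() {
    for (n = 2; n <= 40; n++) {                              // small range for illustration
        calls = 0;
        bool found = false;
        for (int start = 1; start <= n && !found; start++) {
            path.assign(1, start);
            used.assign(n + 1, false);
            used[start] = true;
            found = extend();
        }
        if (found) {
            std::printf("n = %d (%lld calls):", n, calls);
            for (int v : path) std::printf(" %d", v);
            std::printf("\n");
        } else {
            std::printf("n = %d: no solution (%lld calls)\n", n, calls);
        }
    }
    return 0;
}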
RE: Challenge: sum of squares. Let's break 299 - Didier Lachieze - 01-19-2018 12:15 PM

(01-19-2018 12:08 AM)Didier Lachieze Wrote: I have a program that finds the solution for the sequence 1 to 15 in 2 seconds on my Prime, but then it takes 45 seconds for the sequence 1 to 25.

I have improved my program and now it takes 2.5 seconds for the sequence 1 to 25; however, it's still far from handling 299. Here are some timings in seconds for different values from 25 to 52 on the Virtual Calculator:

[attachment=5570]

RE: Challenge: sum of squares. Let's break 299 - pier4r - 01-19-2018 12:40 PM

Interesting; some are larger than others even with fewer nodes (or am I reading it wrongly?).

RE: Challenge: sum of squares. Let's break 299 - Didier Lachieze - 01-19-2018 12:47 PM

Yes, it depends on how fast it finds a solution while exploring the possible ones. So even if there is a global trend of timings increasing with the length of the sequence, some timings can be shorter than the previous one(s).

RE: Challenge: sum of squares. Let's break 299 - Didier Lachieze - 01-19-2018 04:26 PM

A few more timings, up to 64:

[attachment=5571] [attachment=5572]

RE: Challenge: sum of squares. Let's break 299 - Thomas Okken - 01-19-2018 04:33 PM

(01-19-2018 04:26 PM)Didier Lachieze Wrote: A few more timings, up to 64:

Are those timings in seconds? May we see your code?

RE: Challenge: sum of squares. Let's break 299 - Didier Lachieze - 01-19-2018 04:56 PM

Yes, the timings are in seconds on the HP Prime Virtual Calculator, so it would take significantly more time on the physical Prime. Here is my code (sorry, there are no comments). I'm using a recursive function to go through the different paths to find the solution. This may not be the most effective way of doing it, but it's quite simple, and with recursion the backtracking is "built-in". SoS(n) is the main function, which you call with the number of terms of the sequence. rS(l) is the recursive function where most of the time is spent. tSoS(n) is what I use to time the SoS function; it stores the result in L2. To optimize the execution, at the beginning of SoS(n) I build a list of n sublists, each sublist being the list of numbers which, added to the sublist index, give a square.

Code: rS(l)

Some results (reversed from what my program returns, to start with the lowest number):

Code: 57:
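Didier's PPL listing is likewise not included in the printable version. As a rough C++ illustration of the table he describes building at the start of SoS(n) (the names and layout here are guesses, not his code): for each index 1..n, the sublist of numbers which, added to that index, give a perfect square. The recursive search then only ever walks these sublists, which is what keeps the backtracking cheap.

Code:
#include <cstdio>
#include <vector>

int main() {
    const int n = 15;                                      // length of the sequence 1..n
    // partners[i] lists every j in 1..n (j != i) such that i + j is a perfect square.
    std::vector<std::vector<int>> partners(n + 1);
    for (int i = 1; i <= n; i++)
        for (int s = 2; s * s <= i + n; s++) {             // candidate squares s*s
            int j = s * s - i;
            if (j >= 1 && j <= n && j != i) partners[i].push_back(j);
        }
    // Print the table; for n = 15 the first line is " 1: 3 8 15".
    for (int i = 1; i <= n; i++) {
        std::printf("%2d:", i);
        for (int j : partners[i]) std::printf(" %d", j);
        std::printf("\n");
    }
    return 0;
}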
RE: Challenge: sum of squares. Let's break 299 - Thomas Okken - 01-20-2018 05:43 PM

(01-19-2018 04:29 AM)Paul Dale Wrote: If you follow the comments and links from the Numberphile video, it looks like the maximum has been pushed up very significantly, assuming the implementation was correct and addressed the problem in question, etc.

What's NP-complete is the problem of finding Hamilton paths in graphs in general. For certain specific classes of graphs, that doesn't necessarily apply, like graphs where no node connects to more than two edges. The lady in the video, who wrote the program that checked all cases up to 299, must have found something to exploit, because with the apparently roughly exponential way the CPU time in my algorithm increases, it wouldn't get to 299 in a million years, even if I parallelized the heck out of it and ran it on the GPU clusters at work. (I should let it run a bit longer than I have so far to see how the CPU usage evolves, though.)

RE: Challenge: sum of squares. Let's break 299 - ggauny@live.fr - 01-20-2018 06:15 PM

Hi,

May I ask for the link to the video? Thanks.

RE: Challenge: sum of squares. Let's break 299 - John Keith - 01-20-2018 08:19 PM

(01-20-2018 05:43 PM)Thomas Okken Wrote: What's NP-complete is the problem of finding Hamilton paths in graphs in general. For certain specific classes of graphs, that doesn't necessarily apply, like graphs where no node connects to more than two edges. The lady in the video, who wrote the program that checked all cases up to 299, must have found something to exploit, because with the apparently roughly exponential way the CPU time in my algorithm increases, it wouldn't get to 299 in a million years, even if I parallelized the heck out of it and ran it on the GPU clusters at work.

A straightforward backtracking algorithm will eventually bog down no matter how much computational power you throw at it; that's just the nature of combinatorial problems. I would think that Simulated Annealing or one of its many variants would be more likely to succeed. The objective function would simply be the number of pair sums that are perfect squares.

John

RE: Challenge: sum of squares. Let's break 299 - Thomas Okken - 01-20-2018 09:08 PM

(01-20-2018 06:15 PM)ggauny@live.fr Wrote: May I ask for the link to the video?

Part 1: http://youtu.be/G1m7goLCJDY
Part 2: http://youtu.be/7_ph5djCCnM

RE: Challenge: sum of squares. Let's break 299 - Paul Dale - 01-20-2018 10:13 PM

(01-20-2018 05:43 PM)Thomas Okken Wrote: The lady in the video, who wrote the program that checked all cases up to 299, must have found something to exploit, because with the apparently roughly exponential way the CPU time in my algorithm increases, it wouldn't get to 299 in a million years, even if I parallelized the heck out of it and ran it on the GPU clusters at work.

I think there was mention of trying to link the new node directly into the previous graph. Starting the search from there feels like it would be a win. It kind of makes sense that much of the graph will often be unchanged with the addition of a new node. Still, exponential-time problems always become infeasible.

Pauli

RE: Challenge: sum of squares. Let's break 299 - Thomas Okken - 01-20-2018 10:31 PM

(01-20-2018 10:13 PM)Paul Dale Wrote: I think there was mention of trying to link the new node directly into the previous graph. Starting the search from there feels like it would be a win. It kind of makes sense that much of the graph will often be unchanged with the addition of a new node. Still, exponential-time problems always become infeasible.

Or take it a step further and keep track of all the paths found so far, so that if extending the previous one doesn't work, you can try even earlier ones as well. I think something like that may be necessary (barring any deeper insights), because it strikes me in those videos how often two successive paths appear to be completely different. (Bonus question: how does the number of possible paths evolve with rising n?)
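For what it's worth, here is a minimal sketch of the simulated-annealing idea John Keith suggests above, using his objective: the number of adjacent pair sums that are perfect squares (n - 1 means a full solution). The move (swapping two random positions), the temperature schedule, and all parameters are illustrative assumptions, not from the thread, and there is no guarantee this reaches a full solution for large n.

Code:
#include <cmath>
#include <cstdio>
#include <random>
#include <utility>
#include <vector>

static bool is_square(int x) {
    int r = (int) std::lround(std::sqrt((double) x));
    return r * r == x;
}

// Objective to maximize: number of adjacent pairs that sum to a perfect square.
static int score(const std::vector<int> &p) {
    int good = 0;
    for (size_t i = 0; i + 1 < p.size(); i++)
        if (is_square(p[i] + p[i + 1])) good++;
    return good;
}

int main() {
    const int n = 50;
    std::mt19937 rng(12345);
    std::vector<int> p(n);
    for (int i = 0; i < n; i++) p[i] = i + 1;               // start from 1..n in order

    std::uniform_int_distribution<int> pick(0, n - 1);
    std::uniform_real_distribution<double> unif(0.0, 1.0);
    int cur = score(p), best = cur;
    double temp = 2.0;

    for (long iter = 0; iter < 2000000 && best < n - 1; iter++) {
        int a = pick(rng), b = pick(rng);
        std::swap(p[a], p[b]);                              // propose: swap two positions
        int next = score(p);                                // recomputed from scratch here
        if (next >= cur || unif(rng) < std::exp((next - cur) / temp)) {
            cur = next;                                     // accept the move
            if (cur > best) best = cur;
        } else {
            std::swap(p[a], p[b]);                          // reject: undo the swap
        }
        temp *= 0.999999;                                   // slow geometric cooling
    }

    std::printf("best objective: %d of %d\n", best, n - 1);
    if (cur == n - 1) {                                     // full solution found
        for (int v : p) std::printf("%d ", v);
        std::printf("\n");
    }
    return 0;
}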
RE: Challenge: sum of squares. Let's break 299 - Thomas Okken - 01-21-2018 05:41 AM

I let my code run for over twelve hours, by which time it had gotten to 82. Here's how the CPU time progressed, in a log plot of N vs. CPU time in seconds:

[attachment=5580]

It does indeed look exponential, as expected. I fed the data to Free42 STAT, removing the times of 0.000 seconds, and got an exponential fit with a correlation coefficient of 0.96 and an exponent of 0.281. Extrapolated run time for N = 300: 4.78e22 years.

The connectedness of the graph of pairs of numbers that sum to squares grows pretty slowly. When you add N to the graph for 1 through N-1, the number of edges added is floor(sqrt(2N-1)) - ceil(sqrt(N+1)) + 1, so, asymptotically, each successive N adds (sqrt(2)-1)*sqrt(N) edges. That's slow, but still fast enough that dumb backtracking isn't useful.
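A quick sanity check of the edge-count formula in the last post (a sketch, not from the thread): count directly how many k < N make k + N a perfect square, and compare with floor(sqrt(2N-1)) - ceil(sqrt(N+1)) + 1, i.e. the number of squares between N+1 and 2N-1.

Code:
#include <cmath>
#include <cstdio>

static bool is_square(int x) {
    int r = (int) std::lround(std::sqrt((double) x));
    return r * r == x;
}

int main() {
    for (int N = 2; N <= 300; N++) {
        // Direct count: new edges (k, N) with k < N and k + N a perfect square.
        int direct = 0;
        for (int k = 1; k < N; k++)
            if (is_square(k + N)) direct++;
        // Closed form from the post: squares in the interval [N+1, 2N-1].
        int formula = (int) std::floor(std::sqrt(2.0 * N - 1.0))
                    - (int) std::ceil(std::sqrt(N + 1.0)) + 1;
        if (direct != formula)
            std::printf("mismatch at N = %d: %d vs %d\n", N, direct, formula);
    }
    std::printf("done; asymptotically each new N adds about (sqrt(2)-1)*sqrt(N) edges\n");
    return 0;
}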