24

The N-Queens Problem:

Given an N by N chess board, the problem is to find the different permutations in which N queens can be placed on the board without any two threatening each other.

My question is:
What is the maximum value of N for which a program can calculate the answer in a reasonable amount of time? Or, what is the largest N we have seen so far?

Here is my program in CLP(FD) (Prolog):

% generate(L,N): constrain every element of L to the domain 1..N.
generate([],_).
generate([H|T],N) :-
   H in 1..N,
   generate(T,N).

% lenlist(L,N): N is the length of list L (an accumulator version of length/2).
lenlist(L,N) :-
   lenlist(L,0,N).

lenlist([],N,N).
lenlist([_|T],P,N) :-
   P1 is P+1,
   lenlist(T,P1,N).

% queens(N,L): L is a list of N row positions, the I-th element being the
% row of the queen in column I, such that no two queens attack each other.
queens(N,L) :-
   generate(L,N), lenlist(L,N),
   safe(L),
   !,
   labeling([ffc],L).

% notattack(X,Xs): the queen in row X attacks none of the queens whose rows
% are in Xs, where the first queen in Xs is one column to the right, the
% second two columns to the right, and so on.
notattack(X,Xs) :-
   notattack(X,Xs,1).

notattack(_,[],_).
notattack(X,[Y|Ys],N) :-
   X #\= Y,
   X #\= Y - N,
   X #\= Y + N,
   N1 is N + 1,
   notattack(X,Ys,N1).

% safe(L): no queen in L attacks any other queen in L.
safe([]).
safe([F|T]) :-
   notattack(F,T),
   safe(T).

This program works just fine, but the time it takes keeps on increasing with N. Here is a sample execution:

?- queens(4,L).

L = [2, 4, 1, 3] ;

L = [3, 1, 4, 2] ;

No

This means you place the 4 queens at Row 2 in Column 1, Row 4 in Column 2, Row 1 in Column 3, and Row 3 in Column 4 (on a 4 by 4 chess board).

Now let's see how this program performs (time taken to calculate the first permutation):
For N=4,5,...,10 it computes within a second.
For N=11-30 it takes between 1 and 3 seconds.
For N=40-50 it still calculates within a minute.
At N=60 it goes out of global stack (the search space being enormous).

This was a past homework problem. (The original problem was just to code N-Queens.)

I am also interested in seeing alternate implementations in other languages (which perform better than my implementation), or suggestions if there is room for improvement in my algorithm/program.

    Wikipedia has a bunch of algorithms, and indicates that at least one of them can solve the 1,000,000-queens problem in ~50 steps. http://en.wikipedia.org/wiki/Eight_queens_puzzle#The_eight_queens_puzzle_as_an_exercise_in_algorithm_design – Carl Norum Dec 07 '09 at 23:13
  • I've done very little with Prolog, and a long time ago. But I recall that although Prolog is a prescriptive (?) rather than imperative language, it's possible to order and otherwise set up the rules in such a way as to nudge processing in a preferred and hopefully better performing direction. I'm not up to it, but I'd guess your solution could be streamlined this way to achieve much better performance. – Carl Smotricz Dec 07 '09 at 23:21
  • I guess my program works similarly to the animation shown in Wikipedia. It places one queen and then strikes off the positions which that queen will attack, and so on. But the 1 million queens problem in less than 50 steps, that's insane. I think this Wikipedia article is not entirely correct. Even using constraints I don't think it can be solved in 50 steps –  Dec 07 '09 at 23:29
  • Also, it may be trivial, but I could use the inbuilt length/2 to reduce the code size. Still, the performance would be the same –  Dec 07 '09 at 23:32
  • 1
    Well, the 50-step implementation doesn't do a complete depth-first search like your algorithm is doing. If you read the article you'll get some more information. In particular, it requires an initial setup that is 'reasonably good'. – Carl Norum Dec 07 '09 at 23:35
  • @Carl Smotricz My program uses the constraints effectively. It is not blindly calculating all the permutations and testing them; it uses constraints to find possible future permutations intelligently. What you are saying is true: it can be fine-tuned by placing cuts, but it won't improve the performance significantly. –  Dec 07 '09 at 23:42
  • @Carl Norum Well, yes, you are right. But this algorithm assumes that you have a good initial configuration, plus it does not guarantee a solution as it might get stuck in a local minimum. Anyway, it's a good alternative. –  Dec 07 '09 at 23:46
  • 1
    http://www.math.utah.edu/~alfeld/queens/queens.html shows the actual complexity we are dealing with. Check out the number of steps involved as N increases –  Dec 08 '09 at 09:31
  • You may as well use `length/2`. I can't comment on CLPFD, but I believe it's common to implement built-ins in heavily optimized lower level code, rather than in prolog itself. So you may as well decrease your code size and take advantage of possible optimizations that may have already been done for you. – nedned Dec 08 '09 at 13:07
  • The cut is misplaced, also the fact in `generate/2` is incorrect. – false Jul 22 '15 at 08:58
  • Although this is an old thread, it is very interesting. It will be good if one can describe how this code is working, preferably by adding comments to the code. – rnso Sep 11 '17 at 14:27

9 Answers

12

This discussion conflates three different computational problems: (1) Finding a solution to the N queens problem, (2) Listing all solutions for some fixed N, and (3) counting all of the solutions for some fixed N. The first problem looks tricky at first for a size of board such as N=8. However, as Wikipedia suggests, in some key ways it is easy when N is large. The queens on a large board don't communicate all that much. Except for memory constraints, a heuristic repair algorithm has an easier and easier job as N increases.
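
For a concrete picture of what "heuristic repair" means here, below is a minimal Python sketch of the min-conflicts idea: start from a random permutation and repeatedly move a conflicted queen to a least-conflicted row in its column. The function name and step limit are my own choices for illustration, not taken from any particular paper.

import random

def min_conflicts_queens(n, max_steps=100000):
    # queens[c] is the row of the queen in column c; a random permutation
    # already has distinct rows, so only diagonal conflicts exist at first.
    queens = list(range(n))
    random.shuffle(queens)

    def conflicts(col, row):
        # Number of queens in other columns attacking square (row, col).
        return sum(1 for c in range(n)
                   if c != col and (queens[c] == row or
                                    abs(queens[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, queens[c]) > 0]
        if not conflicted:
            return queens          # no queen is attacked: done
        col = random.choice(conflicted)
        counts = [conflicts(col, r) for r in range(n)]
        best = min(counts)
        # Move the queen to a minimum-conflict row, breaking ties at random.
        queens[col] = random.choice([r for r in range(n) if counts[r] == best])
    return None                    # give up after max_steps repair steps

This naive version recomputes conflict counts in O(n) per query; a serious implementation keeps per-row and per-diagonal counters, which is what makes very large boards tractable.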

Listing every solution is a different matter. That can probably be done with a good dynamic programming code up to a size that is large enough that there is no point in reading the output.

The most interesting version of the question is to count the solutions. The state of the art is summarized in a fabulous reference known as The Encyclopedia of Integer Sequences. It has been computed up to N=26. I would guess that that also uses dynamic programming, but unlike the case of listing every solution, the algorithmic problem is much deeper and open to further advances.
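
The record counts rely on far more sophisticated methods, but as a baseline it may help to see what the straightforward exhaustive counter looks like. The sketch below is the standard bitmask backtracking routine, not the technique behind the N=26 result:

def count_solutions(n):
    # cols, left and right are bitmasks of attacked columns and diagonals
    # for the current row; a set bit means "this square is attacked".
    full = (1 << n) - 1

    def go(cols, left, right):
        if cols == full:
            return 1
        total = 0
        free = full & ~(cols | left | right)
        while free:
            bit = free & -free          # lowest free square in this row
            free -= bit
            total += go(cols | bit,
                        ((left | bit) << 1) & full,
                        (right | bit) >> 1)
        return total

    return go(0, 0, 0)

For example, count_solutions(8) returns 92; the running time grows so quickly that this approach becomes hopeless long before N=26.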

Greg Kuperberg
10

Loren Pechtel said: "Now for some real insanity: 29 took 9 seconds. 30 took almost 6 minutes!"

This fascinating lack of predictability in backtrack-complexity for different board sizes was the part of this puzzle that most interested me. For years I've been building a list of the 'counts' of algorithm steps needed to find the first solution for each board size - using the simple and well known depth-first algorithm, in a recursive C++ function.

Here's a list of all those 'counts' for boards up to N=49 ... minus N=46 and N=48 which are still work-in-progress:

http://queens.cspea.co.uk/csp-q-allplaced.html

(I've got that listed in the Encyclopedia of Integer Sequences (OEIS) as A140450)

That page includes a link to a list of the matching first solutions.

(My list of First Solutions is OEIS Sequence number A141843)

I don't primarily record how much processing time each solution demands, but rather I record how many failed queen-placements were needed prior to discovery of each board's algorithmically-first solution. Of course the rate of queen placements depends on CPU performance, but given a quick test-run on a particular CPU and a particular board size, it's an easy matter to calculate how long it took to solve one of these 'found' solutions.

For example, on an Intel Pentium D 3.4GHz CPU, using a single CPU thread -

  • For N=35 my program 'placed' 24 million queens per second and took just 6 minutes to find the first solution.
  • For N=47 my program 'placed' 20.5 million queens per second and took 199 days.

My current 2.8GHz i7-860 is thrashing through about 28.6 million queens per second, trying to find the first solution for N=48. So far it has taken over 550 days (theoretically, if it had never been interrupted) to UNsuccessfully place 1,369,331,731,000,000 (and rapidly climbing) queens.

My web site doesn't (yet) show any C++ code, but I do give a link on that web page to my simple illustration of each one of the 15 algorithm steps needed to solve the N=5 board.

It's a delicious puzzle indeed!

CSPea
5

Which Prolog system are you using? For example, with recent versions of SWI-Prolog, you can readily find solutions for N=80 and N=100 within fractions of a second, using your original code. Many other Prolog systems will be much faster than that.

The N-queens problem is even featured in one of the online examples of SWI-Prolog, available as CLP(FD) queens in SWISH.

Example with 100 queens:

?- time((n_queens(100, Qs), labeling([ff], Qs))).
Qs = [1, 3, 5, 57, 59 | ...] .
2,984,158 inferences, 0.299 CPU in 0.299 seconds (100% CPU, 9964202 Lips)

SWISH also shows you nice images of solutions.

Here is an animated GIF showing the complete solution process for N=40 queens with SWI-Prolog:

[Animated GIF: CLP(FD) solution process for 40 queens]

mat
  • 1
    ... The complete solution process? I think it shows only the labeling process for the first solution – false Mar 31 '16 at 18:18
4

A short solution presented by Raymond Hettinger at PyCon: Easy AI in Python

#!/usr/bin/env python
from itertools import permutations
n = 12
cols = range(n)
for vec in permutations(cols):
    # vec[i] is the row of the queen in column i; a permutation already
    # rules out row and column clashes, so only the diagonals need checking:
    # vec[i] + i and vec[i] - i must be distinct across all columns.
    if (n == len(set(vec[i] + i for i in cols))
          == len(set(vec[i] - i for i in cols))):
        print vec

Computing all permutations is not scalable, though (O(n!)).

miku
  • Amazingly short! Any word on execution times? – Carl Smotricz Dec 07 '09 at 23:17
  • 6
    I don't know Python, but it looks like you are using a generate-and-test strategy, in which you just generate a permutation and check whether it satisfies the constraints. But the catch is that even for a small N like N=60 the number of permutations is 60!, which is a huge number –  Dec 07 '09 at 23:23
  • yes, this solution is less about scalability, more about solving the classic 8x8 problem in a few lines .. – miku Dec 07 '09 at 23:28
  • But you could use constraints to reduce your search space. For example, if you place a queen, then strike off the positions that this queen will attack, and then place the other queens similarly (like what I have done in my program). Anyway, amazingly short implementation :) –  Dec 07 '09 at 23:33
  • The reason for selecting this as the answer is that it's close to what I asked. Though I am still eager to see better implementations. –  Dec 14 '09 at 02:06
  • 3
    It's amazingly short because it includes the Permutations class. I could write an even shorter one that would just say: print queens(12). Now, that would be amazing ;) – Milan Babuškov Nov 10 '10 at 11:08
4

As to the largest N solved by computers, there are references in the literature in which a solution for N around 3*10^6 has been found using a conflict repair algorithm (i.e. local search). See for example the classical paper of [Sosic and Gu].

As to exact solving with backtracking, there exist some clever branching heuristics which achieve correct configurations with almost no backtracking. These heuristics can also be used to find the first k solutions to the problem: after finding an initial correct configuration, the search backtracks to find other valid configurations in the vicinity.

References for these almost-perfect heuristics are [Kale 90] and [San Segundo 2011].

chesslover
  • Not 100% backtrack-free, but almost. For example, in [San Segundo 2011] 0 backtracks were required to find a first valid configuration for 992 cases out of 997 possible values of N from 4 up to 1000. For N=15, 6 backtracks were required, 8 for N=14, and 7 in total for N={8, 11, 13} – chesslover Jul 31 '14 at 16:18
  • Yes, I was referring to the first solution as well. The branching heuristic was perfect (required no backtracking, so that answers your question) for 992 cases out of 997 values of N (N ranging from 4 to 1000). But for N={8, 11, 13, 14, 15} some small amount of backtracking was required. I am not aware of any branching heuristic which is perfect for all N. – chesslover Jul 31 '14 at 18:37
  • 2
    "[Explicit solutions exist for placing n queens on an n × n board for all n ≥ 4, requiring no combinatorial search whatsoever](http://en.wikipedia.org/wiki/Eight_queens_puzzle#Explicit_solutions)." Thus: simply order variables such that the first attempt will be a solution. However, should those variables be additionally constrained such that this very solution is excluded, one will get very bad performance. – false Aug 01 '14 at 08:17
  • ... it is for this reason that the first-k solutions with k > 1 might be more interesting. – false Aug 01 '14 at 08:26
  • Agreed, but your question was related to the first solution for CONSTRUCTIVE algorithms. As to the first k solutions (where no closed-form solution is known, as you pointed out), there is also interesting info in the San Segundo paper. In particular, for k=100, the majority of backtrack counts are around 200, rising to approximately 1200 for small values of N. – chesslover Aug 01 '14 at 10:01
  • Isn't this now more about what exactly is meant by constructive? For each explicit solution, there is a trivial constructive algorithm. – false Aug 01 '14 at 10:41
4

What is the maximum value of N for which a program can calculate the answer in a reasonable amount of time? Or, what is the largest N we have seen so far?

There is no limit. That is, checking for the validity of a solution is more costly than constructing one solution plus seven symmetrical ones.

See Wikipedia: "Explicit solutions exist for placing n queens on an n × n board for all n ≥ 4, requiring no combinatorial search whatsoever."
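
To make the comparison concrete, here is a sketch of what an O(N) validity check looks like, assuming the solution is given as a list where queens[r] is the column of the queen in row r (that representation is my choice for illustration):

def is_valid(queens):
    # Rows are distinct by construction (one entry per row), so a placement
    # is valid iff the columns and both diagonal families are all distinct.
    n = len(queens)
    return (len(set(queens)) == n
            and len({queens[r] + r for r in range(n)}) == n
            and len({queens[r] - r for r in range(n)}) == n)

Both this check and the explicit constructions referenced above are linear in N, so neither involves any search.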

false
  • @SterlingVix: That is a misunderstanding on your side: Once you are able to construct a single solution, simply order the variables to be labeled in exactly that order. So a corresponding approach that finds all solutions is very close. – false Sep 07 '16 at 00:06
1

I dragged out an old Delphi program that counted the number of solutions for any given board size, did a quick modification to make it stop after one hit and I'm seeing an odd pattern in the data:

The first board that took over 1 second to solve was n = 20. 21 solved in 62 milliseconds, though. (Note: this is based on Delphi's Now, not any high-precision timer.) 22 took 10 seconds, not to be repeated until 28.

I don't know how good the optimization is, as this was originally a highly optimized routine from back when the rules of optimization were very different. I did do one thing very different from most implementations, though: it has no board. Rather, I'm tracking which columns and diagonals are attacked and adding one queen per row. This means 3 array lookups per cell tested and no multiplication at all. (As I said, from when the rules were very different.)
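
For readers who want to see the idea, here is a minimal Python sketch of that board-free bookkeeping: one boolean array per column and per diagonal direction, one queen added per row. The names and structure are mine; the original is a heavily optimized Delphi routine.

def first_solution(n):
    # No board at all: track attacked columns and the two diagonal families.
    cols = [False] * n
    diag_up = [False] * (2 * n - 1)      # indexed by row + col
    diag_down = [False] * (2 * n - 1)    # indexed by row - col + n - 1
    queens = []                          # queens[row] = chosen column

    def place(row):
        if row == n:
            return True
        for col in range(n):
            if cols[col] or diag_up[row + col] or diag_down[row - col + n - 1]:
                continue                 # attacked square: 3 array lookups, no multiplication
            cols[col] = diag_up[row + col] = diag_down[row - col + n - 1] = True
            queens.append(col)
            if place(row + 1):
                return True
            queens.pop()                 # backtrack
            cols[col] = diag_up[row + col] = diag_down[row - col + n - 1] = False
        return False

    return queens if place(0) else None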

Now for some real insanity: 29 took 9 seconds. 30 took almost 6 minutes!

Loren Pechtel
  • It actually doesn't seem all that interesting. It has to do with the amount of backtracking required by your traversal pattern to get to the first solution. I bet if he fiddled around with his traversal ordering, those numbers would change a bunch again. – Mike Jun 04 '10 at 18:22
1

Actually, a constrained random walk (generate and test) like what bakore outlined is the way to go if you just need a handful of solutions, because these can be generated rapidly. I did this for a class when I was 20 or 21 and published the solution in the Journal of Pascal, Ada & Modula-2, March 1987, "The Queens Problem Revisited". I just dusted off the code from that article today (and it is very inefficient code) and, after fixing a couple of problems, have been generating N=26 ... N=60 solutions.
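
For illustration only, here is a rough Python sketch of a constrained random walk of this general kind: place queens row by row in randomly chosen unattacked columns, and restart from scratch whenever a row has no candidate left. This is not the algorithm from the 1987 article, just the general generate-and-test idea.

import random

def random_walk_queens(n, max_restarts=100000):
    for _ in range(max_restarts):
        cols, up, down = set(), set(), set()
        queens = []                      # queens[row] = chosen column
        for row in range(n):
            candidates = [c for c in range(n)
                          if c not in cols and (row + c) not in up
                          and (row - c) not in down]
            if not candidates:
                break                    # dead end: restart with a fresh board
            c = random.choice(candidates)
            queens.append(c)
            cols.add(c)
            up.add(row + c)
            down.add(row - c)
        else:
            return queens                # all n rows placed without conflict
    return None

In this naive form the restart rate tends to grow with N; practical versions (including the conflict-repair methods mentioned elsewhere in this thread) fix up a nearly complete placement instead of throwing it away.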

Kirt Undercoffer
  • Here is a link to the N=60 solution generated: http://www.kirtundercoffer.com/publications/q60-1.txt This is a 60 x 60 text file with no spaces between the empty spaces (denoted by 1) and the queens (denoted by 9). I have not exhaustively checked this result but I have confidence based on all the previous results that it is correct. – Kirt Undercoffer Jan 14 '11 at 03:34
  • Here's an N=80 solution: http://www.kirtundercoffer.com/publications/q80.txt also generated using a random walk. – Kirt Undercoffer Jan 14 '11 at 19:14
  • 2
    I also generated an N=100 queens solution overnight; you can see it at kirtundercoffer.com/publications/q100.txt. This solution was also generated using a random walk. One can read on the Net, and certainly hear in lectures, that this sort of approach won't generate solutions even for small N, but this is incorrect. Of course it won't generate all solutions, but in fact a random walk approach might even be preferred over backtracking where N is large, as backtracking will be memory intensive. – Kirt Undercoffer Jan 14 '11 at 19:18
0

If you only want one solution, it can be constructed directly in linear time O(N), with no search. My code in Python:

import numpy as np

n = int(raw_input("Enter n: "))

rs = np.zeros(n, dtype=np.int64)          # rs[k] = row (0-based) of the queen in column k
board = np.zeros((n, n), dtype=np.int64)

k = 0
if n % 6 == 2:
    # n = 2 (mod 6): even rows 2,4,... first, then 3, 1, then odd rows from 7, finally 5.
    for i in range(2, n + 1, 2):
        rs[k] = i - 1
        k += 1
    rs[k] = 3 - 1
    k += 1
    rs[k] = 1 - 1
    k += 1
    for i in range(7, n + 1, 2):
        rs[k] = i - 1
        k += 1
    rs[k] = 5 - 1
elif n % 6 == 3:
    # n = 3 (mod 6): row 4, then even rows from 6, then 2, odd rows from 5, finally 1 and 3.
    rs[k] = 4 - 1
    k += 1
    for i in range(6, n + 1, 2):
        rs[k] = i - 1
        k += 1
    rs[k] = 2 - 1
    k += 1
    for i in range(5, n + 1, 2):
        rs[k] = i - 1
        k += 1
    rs[k] = 1 - 1
    k += 1
    rs[k] = 3 - 1
else:
    # All other cases: even rows 2,4,... first, then odd rows 1,3,...
    for i in range(2, n + 1, 2):
        rs[k] = i - 1
        k += 1
    for i in range(1, n + 1, 2):
        rs[k] = i - 1
        k += 1

# Mark the queens on the board (1 = queen, 0 = empty) and print it.
for i in range(n):
    board[rs[i]][i] = 1

print "\n"
for i in range(n):
    for j in range(n):
        print board[i][j],
    print

Here, however, printing takes O(N^2) time. Also, Python being a slower language, anyone can try implementing it in other languages like C/C++ or Java; but even with Python it will get the first solution for n=1000 within 1 or 2 seconds.