While trying to explain recursive algorithms to generate permutations in lexicographic order, I provided this simple algorithm:
def permute(done, remaining):
    if not remaining:
        print done
        return
    sorted_rem = sorted(remaining)
    l = len(sorted_rem)
    for i in xrange(0, l):
        c = sorted_rem[i]
        # Move c to the done portion.
        done.append(c)
        remaining.remove(c)
        # Permute the remaining elements.
        permute(done, remaining)
        # Put c back.
        remaining.append(c)
        # Remove it from done.
        del done[-1]

def main():
    permute([], [1, 2, 3, 4])

if __name__ == "__main__":
    main()
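For a quick sanity check, the same procedure can be compared against itertools.permutations, which yields tuples in lexicographic order when its input is sorted. The helper below is my own restatement of the algorithm above (it collects results into a list instead of printing, so the output can be checked):

```python
from itertools import permutations

def permute_collect(done, remaining, out):
    # Same recursion as above, but appending each finished
    # permutation to `out` instead of printing it.
    if not remaining:
        out.append(tuple(done))
        return
    for c in sorted(remaining):
        done.append(c)
        remaining.remove(c)
        permute_collect(done, remaining, out)
        remaining.append(c)
        done.pop()

out = []
permute_collect([], [1, 2, 3, 4], out)
# itertools.permutations is lexicographic for sorted input,
# so the two sequences should match exactly.
assert out == list(permutations([1, 2, 3, 4]))
```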
First question: It seems a bit wasteful to me, and I wonder what time complexity it really has, given a Pythonic implementation like this?
Note that the optimal time complexity is O(n * n!), since we need to print n! permutations of size n. I am guessing that because of the sorted call (which I presume is O(n log n) in Python), an additional log n factor will be added (which I suppose is practically negligible for the n we can use this program for).
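One way to make the counting concrete: the number of recursive calls is the sum of n!/k! for k from 0 to n, which is bounded by e * n!. The snippet below is my own instrumentation (not part of the original program) that checks this closed form empirically for small n:

```python
import math

def count_calls(n):
    # Count recursive invocations of the permute() algorithm
    # for an n-element input.
    calls = 0
    def permute(done, remaining):
        nonlocal calls
        calls += 1
        if not remaining:
            return
        for c in sorted(remaining):
            done.append(c)
            remaining.remove(c)
            permute(done, remaining)
            remaining.append(c)
            del done[-1]
    permute([], list(range(n)))
    return calls

for n in range(1, 7):
    # Closed form: sum_{k=0}^{n} n!/k!  (at most e * n!).
    expected = sum(math.factorial(n) // math.factorial(k)
                   for k in range(n + 1))
    assert count_calls(n) == expected
```

Since e * n! is O(n!), the recursion itself contributes only a constant factor over the n! leaves; the per-call sorting and the O(n) cost of printing each permutation are what determine the final bound.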
The second part of the question is to optimize it a bit.
Second question: Assuming that we can implement sorted in O(n) time, and append, remove and del[-1] in O(1) time, what would be the resulting time complexity?
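For reference, one way to actually realize those assumptions is to sort the input once up front and track membership with a boolean "used" array, so each call iterates over a fixed candidate order with O(1) take/put-back operations. The sketch below is my own illustration of that idea (the name permute_fast and the used array are not from the original post):

```python
def permute_fast(items):
    # Sort once; after this the candidate order is fixed,
    # so no per-call sorted() is needed.
    items = sorted(items)
    n = len(items)
    used = [False] * n
    done = []
    out = []

    def rec():
        if len(done) == n:
            out.append(list(done))
            return
        for i in range(n):
            if used[i]:
                continue
            used[i] = True          # O(1) "remove"
            done.append(items[i])   # O(1) append
            rec()
            done.pop()              # O(1) del done[-1]
            used[i] = False         # O(1) "put back"

    rec()
    return out
```

Each call still scans all n flags, so with about e * n! calls the total work is O(n * n!), matching the optimal bound set by the cost of printing the output.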