
I'm working on a university scheduling problem and using a simple genetic algorithm for it. It actually works well and improves the objective function value from roughly 0% to 90% within the first hour. But then the process slows down dramatically and it takes days to reach the best solution. I've seen in a lot of papers that it is reasonable to mix other algorithms with a genetic one. Could you please give me some advice on which algorithm can be combined with a genetic one, and on how that algorithm can be applied to speed up the solving process? The main question is: how can any heuristic be applied to such a complex-structured problem? I have no idea how, for instance, greedy heuristics could be applied here.

Thanks to everyone in advance! Really appreciate your help!


Problem description:

  1. I have:

    • an array of ScheduleSlot objects
    • an array of Lesson objects
  2. I do:

    • Standard two-point crossover
    • Mutation (move a random lesson to a random position)
    • Rough selection (only the n best individuals survive into the next population)
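
For reference, here is a minimal Python sketch of that loop. The flat representation (a schedule as a list indexed by slot, holding a lesson or None), the function names and the way the population is refilled are my own assumptions for illustration, not the actual code:

```python
import random

def two_point_crossover(parent_a, parent_b):
    """Standard two-point crossover: swap the segment between two random cut points."""
    i, j = sorted(random.sample(range(len(parent_a) + 1), 2))
    return (parent_a[:i] + parent_b[i:j] + parent_a[j:],
            parent_b[:i] + parent_a[i:j] + parent_b[j:])

def mutate(schedule):
    """Move a random lesson to a random slot (implemented as a swap so no lesson is lost)."""
    schedule = schedule[:]
    src, dst = random.randrange(len(schedule)), random.randrange(len(schedule))
    schedule[src], schedule[dst] = schedule[dst], schedule[src]
    return schedule

def next_generation(population, fitness, n_best):
    """Rough (truncation) selection: keep only the n best individuals,
    then refill the population with crossed-over, mutated children."""
    survivors = sorted(population, key=fitness, reverse=True)[:n_best]
    children = []
    while len(survivors) + len(children) < len(population):
        a, b = random.sample(survivors, 2)
        children.extend(mutate(c) for c in two_point_crossover(a, b))
    return survivors + children[:len(population) - len(survivors)]
```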

Additional information for @Dougal and @izomorphius:
I'm trying to construct a university schedule that has no breaks between lessons, no overlaps, and no geographically distributed lessons for groups and professors.
The fitness function is really simple: fitness = -1000*numberOfOverlaps - 1000*numberOfDistrebutedLessons - 20*numberOfBreaks (or something like that; we can simply change the coefficients in front of the variables). A sketch of it follows the list below.
At the very beginning I generate my individuals by just placing lessons in a random room, on a random day and at a random time.
Mutation and crossover, as described above, are really trivial:

  1. Crossover - take two parent schedules, randomly choose the crossover point and range, and just exchange those parts of the parent schedules, generating two child schedules.
  2. Mutation - take a child schedule and move n random lessons to random positions.
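
Here is the promised sketch of such a fitness function in Python. The lesson representation (a dict with group, professor, building, day and slot), the exact counting rules and the example data are assumptions made for illustration; they would need to be adapted to the real ScheduleSlot/Lesson structures:

```python
from collections import defaultdict

def fitness(lessons, w_overlap=1000, w_distributed=1000, w_break=20):
    """fitness = -w_overlap*overlaps - w_distributed*distributedLessons - w_break*breaks.
    Each lesson is assumed to be a dict with keys group, professor, building, day, slot."""
    by_person_day = defaultdict(list)        # (person, day) -> lessons that person has that day
    for lesson in lessons:
        for person in (("group", lesson["group"]), ("prof", lesson["professor"])):
            by_person_day[(person, lesson["day"])].append(lesson)

    overlaps = distributed = breaks = 0
    for day_lessons in by_person_day.values():
        slots = sorted(l["slot"] for l in day_lessons)
        overlaps += len(slots) - len(set(slots))                    # same person, same slot, twice
        breaks += (slots[-1] - slots[0] + 1) - len(set(slots))      # empty slots inside the day
        distributed += len({l["building"] for l in day_lessons}) - 1  # extra buildings that day

    return -w_overlap * overlaps - w_distributed * distributed - w_break * breaks

# Example: one group with lessons in slots 1 and 3 on the same day -> one break -> fitness -20.
print(fitness([{"group": "A1", "professor": "Ivanov", "building": "Main", "day": 0, "slot": 1},
               {"group": "A1", "professor": "Petrov", "building": "Main", "day": 0, "slot": 3}]))
```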
mr.nothing
  • This isn't something I've done before or know whether it is standard, but you could consider running the GA for a while, getting out a population with decent fitness, and then using those points as the start points for a beam search or something else that uses standard local search techniques. – Danica Apr 27 '12 at 12:47
  • I have successfully mixed genetic algorithms with dynamic programming, neural networks and even some data mining algorithms. But without knowing more about what you're trying to do, what your fitness function is, how you create individuals and how you mutate them, I don't think I can give you any advice on how to apply those to your problem. – Ivaylo Strandjev Apr 27 '12 at 13:03
  • @izomorphius, please find the additional information in the updated question. Thanks in advance for your help. – mr.nothing Apr 29 '12 at 16:09
  • @Dougal, please find the additional information in the updated question. Thanks in advance for your help. – mr.nothing Apr 29 '12 at 16:10
  • You could try to add simulated annealing (probably respecting the top N individuals) if the fitness fails to improve after X iterations: increase the heat, and shake things up. – wildplasser Apr 29 '12 at 16:41 (a sketch of this follows the comments)
  • You say "*..no breaks between lessons, overlaps and geographically distributed lessons for groups and professors.*" I know what a professor is, but what's a "*group*" in this context? – RBarryYoung Apr 29 '12 at 17:20
  • @RBarryYoung, sorry, the meaning of "group" here is a group of students. That's specific to Russia and other ex-USSR countries, I guess. – mr.nothing Apr 30 '12 at 09:57
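
Following up on wildplasser's simulated-annealing suggestion, here is a minimal sketch of what the "reheat and shake" step could look like. Everything here (the geometric cooling schedule, the default parameter values, the neighbour callback, which could simply be the GA's own mutation) is an assumption for illustration:

```python
import math, random

def anneal(schedule, fitness, neighbour, steps=20000, t_start=50.0, t_end=0.5):
    """Simulated-annealing 'shake': start hot enough to accept bad moves, then cool down.
    `neighbour` should return a slightly modified copy of a schedule (the GA's mutation works)."""
    best = current = schedule
    best_f = current_f = fitness(schedule)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)          # geometric cooling schedule
        candidate = neighbour(current)
        candidate_f = fitness(candidate)
        # Always accept improvements; accept worsenings with probability exp(delta / t).
        if candidate_f >= current_f or random.random() < math.exp((candidate_f - current_f) / t):
            current, current_f = candidate, candidate_f
            if current_f > best_f:
                best, best_f = current, current_f
    return best
```

As the comment suggests, this would only be triggered for the top N individuals once the best fitness has failed to improve for some number of generations.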

2 Answers


My initial observation: you have chosen the coefficients in front of numberOfOverlaps, numberOfDistrebutedLessons and numberOfBreaks somewhat arbitrarily. My experience is that such hand-picked choices are usually not the best ones, and you would do better to let the computer choose them. I propose writing a second algorithm to tune them - it could be a neural network, a second genetic algorithm or a hill climber. The idea is to measure how good a result you get after a certain amount of time and to optimize the choice of these three values.
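
A minimal Python sketch of the hill-climbing variant of this idea. run_ga is a hypothetical callback that runs the GA for a fixed time budget with the given weights and reports the raw violation counts of its best individual; the step size, budget and iteration count are arbitrary:

```python
import random

def tune_weights(run_ga, iterations=30, step=0.2):
    """Hill climbing over the three penalty weights.
    `run_ga(weights, budget_seconds)` is assumed to run the GA with those weights for a fixed
    time budget and return the raw violation counts (overlaps, distributed, breaks)."""
    weights = [1000.0, 1000.0, 20.0]          # start from the hand-picked values
    best_score = evaluate(run_ga, weights)
    for _ in range(iterations):
        candidate = weights[:]
        i = random.randrange(3)
        candidate[i] *= random.choice((1 - step, 1 + step))   # nudge one weight up or down
        score = evaluate(run_ga, candidate)
        if score > best_score:                # keep the change only if the GA ends up doing better
            weights, best_score = candidate, score
    return weights

def evaluate(run_ga, weights, budget_seconds=60, repeats=3):
    """Score a weight vector by the unweighted violations the GA reaches in a fixed budget,
    averaged over a few runs because the GA is stochastic."""
    totals = [sum(run_ga(weights, budget_seconds)) for _ in range(repeats)]
    return -sum(totals) / repeats
```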

Another idea: after getting the result, you may try to brute-force optimize it. What I mean is the following: for the original problem, the "silly" solution would be a backtracking search that checks all possibilities, usually implemented as a DFS. That on its own would be far too slow, but starting from the GA's result you may try a depth-first search with iterative deepening, or simply a depth-restricted DFS.
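
For example, a depth-restricted DFS over local moves, with an iterative-deepening wrapper, could look roughly like the sketch below. The moves callback (which should yield modified copies of a schedule, e.g. "move one lesson to another slot") is an assumption, and its branching would have to be kept small for this to be practical:

```python
def polish(schedule, fitness, moves, max_depth=3):
    """Depth-restricted DFS: explore every sequence of up to `max_depth` local moves
    and return the best schedule seen. `moves(s)` should yield modified copies of s."""
    best, best_f = schedule, fitness(schedule)

    def dfs(current, depth):
        nonlocal best, best_f
        f = fitness(current)
        if f > best_f:
            best, best_f = current, f
        if depth == max_depth:
            return
        for nxt in moves(current):
            dfs(nxt, depth + 1)

    dfs(schedule, 0)
    return best

def iterative_deepening_polish(schedule, fitness, moves, max_depth=4):
    """Iterative deepening: rerun the depth-restricted DFS with a growing depth limit."""
    for depth in range(1, max_depth + 1):
        schedule = polish(schedule, fitness, moves, max_depth=depth)
    return schedule
```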

Ivaylo Strandjev
  • Totally agree. You can't learn something by smashing your head into a wall, and in the same way a GA can't learn the rules when there is such a ridiculously big difference between the positive and negative fitness function terms. – Agnius Vasiliauskas Apr 30 '12 at 09:25
  • @0x69, I'm not sure that it is necessary to search for the coefficients. By setting them ourselves we are just trying to express which criteria are more essential for us and which are less. Anyway, I'm going to try brute force on the best solution I get from the GA. I hope it will bring good results. Thanks, everybody, I'll report back with some results. – mr.nothing Apr 30 '12 at 10:11
  • @izomorphius, well, I wrote a brute-force optimizer that runs after the GA, and as I expected, the most difficult problem was removing gaps between lessons for groups and for professors simultaneously; it works pretty slowly. I really thought that a hybrid GA could be more productive. – mr.nothing May 02 '12 at 18:53

For many problems, I find that a Lamarckian-style GA works well, combining a local search into the GA algorithm.

For your case, I would try to introduce a partial systematic search as the local search. There are two obvious ways to do this, and you should probably try both.

  1. Alternate GA iterations with local search iterations. For your local search you could, for example, brute-force all the lessons assigned on a single day while leaving everything else unchanged. Another possibility is to move a randomly selected lesson to every free slot and keep the best resulting placement (see the sketch after this list). The key is to minimise the cost of the brute-force search while still having a chance to find local improvements.

  2. Add a new operator alongside mutation and crossover that performs your local search. (You might find that the mutation operator is less useful in the hybrid scheme, so just replacing that could be viable.)
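
As an illustration of point 1, here is a sketch of the "move one lesson to every free slot" local search as a Lamarckian operator. The dict-based schedule representation (slot -> lesson or None) is an assumption for illustration, not part of GAUL or any specific framework:

```python
import random

def improve_one_lesson(schedule, fitness):
    """Lamarckian local-search operator: pick a random lesson, try it in every free slot,
    and write the best placement back into the individual.
    `schedule` is assumed to map slot -> lesson (or None for an empty slot)."""
    occupied = [s for s, lesson in schedule.items() if lesson is not None]
    free = [s for s, lesson in schedule.items() if lesson is None]
    if not occupied or not free:
        return schedule

    src = random.choice(occupied)
    lesson = schedule[src]
    best, best_f = schedule, fitness(schedule)
    for dst in free:
        candidate = dict(schedule)
        candidate[src], candidate[dst] = None, lesson     # move the lesson into the free slot
        f = fitness(candidate)
        if f > best_f:
            best, best_f = candidate, f
    return best
```

Used as scheme 2, this operator would simply be applied to offspring alongside (or instead of) the plain random mutation.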

In essence, you will be combining the global exploration of the GA with an efficient local search. Several GA frameworks include features to assist in this combination. For example, GAUL implements the alternate scheme 1 above, with either the full population or just the new offspring at each iteration.

Stewart