GenOpt® is an optimization program for the minimization of a cost function that is evaluated by an external simulation program, such as EnergyPlus, TRNSYS, Dymola, IDA-ICE or DOE-2. You may find more information at:
http://simulationresearch.lbl.gov/GO/
This page includes information on, and the source code of, optimization algorithm implementations that I have built for use in research projects. Two evolutionary-algorithm packages are coupled with GenOpt: ECJ and JGAP. You may find more information about ECJ and JGAP at:
http://cs.gmu.edu/~eclab/projects/ecj/
The Hooke-Jeeves pattern search has also been modified to run as a synchronously parallel version. I intend to post details on the algorithms' operation, together with tutorial examples, in a separate post.
1) The first and probably the most useful implementation is the simple genetic algorithm. The Java Genetic Algorithms Package (JGAP) is coupled with GenOpt to provide the typical genetic algorithm operations (selection of the best chromosomes, mutation and crossover). This version has been used in many of my research studies and it is robust and mature. I have also modified some of JGAP's functionality to make more efficient use of multiple threads, and the initial population can include one suggested solution. The major disadvantage of this implementation is that it cannot handle negative values of the objective function. I have modified JGAP to handle them, but the risk of breaking other JGAP functionality, together with backward-compatibility issues, kept that change out of the base package. You therefore have to work with the current version of JGAP (3.6.2), so make sure that the objective function never returns negative values. You may download the compiled files of the simple genetic algorithm implementation here:
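For readers who want to see how the coupling looks from the JGAP side, here is a minimal sketch of a fitness wrapper built on JGAP's FitnessFunction class. The simulateCost() method is a placeholder of my own for whatever launches the external simulation, and the 1/(1+cost) mapping is just one possible way of keeping the fitness non-negative; it is not the exact code used in the GenOpt coupling.

    import org.jgap.FitnessFunction;
    import org.jgap.IChromosome;

    // Minimal sketch of a JGAP fitness wrapper around an external cost evaluation.
    // simulateCost() is a placeholder, not part of GenOpt or JGAP.
    public class SimulationFitness extends FitnessFunction {

        @Override
        protected double evaluate(IChromosome candidate) {
            double cost = simulateCost(candidate);  // value returned by the simulation (to be minimized)
            // JGAP treats larger fitness values as better and expects them to be
            // non-negative, which is why the objective must not return negative values.
            return 1.0 / (1.0 + cost);              // simple non-negative mapping, assumes cost >= 0
        }

        private double simulateCost(IChromosome candidate) {
            // Placeholder: decode the chromosome, write the simulation input,
            // run the external program and read back the objective value.
            return 0.0;
        }
    }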
2) The second algorithm is the modified version of Hooke-Jeeves. This is a pattern search algorithm that does not normally use multithreading. An asynchronously parallel version would improve efficiency even more, but the current version of GenOpt does not allow such an architecture, so I built a synchronously parallel version instead: it searches all directions in parallel threads before it updates its current position. I have used it on problems where the number of design variables is about twice the number of available computer threads, and it ran faster, especially when the starting point was near the optimum. It should perform even better on problems where the number of threads exceeds the number of design variables. This version replaces the Hooke-Jeeves implementation currently shipped with GenOpt, and you choose the desired operation via a keyword in the command file. You can use this version with the hybrid Particle Swarm algorithm or with the genetic algorithm. You may download the compiled file of the synchronously parallel version of Hooke-Jeeves here:
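To illustrate what "synchronously parallel" means here, the following is a rough sketch of a single exploratory step, written from scratch for this post rather than taken from GenOpt's source: every coordinate perturbation is submitted to a thread pool, the step waits for all evaluations to finish, and only then is the base point updated. The CostFunction interface stands in for the call to the external simulation.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Rough sketch of one synchronously parallel exploratory step of a
    // Hooke-Jeeves-type pattern search (illustration only, not GenOpt's code).
    public class ParallelExploratoryStep {

        interface CostFunction {
            double cost(double[] x);   // stands in for the external simulation call
        }

        static double[] explore(double[] base, double step, CostFunction f, int threads)
                throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            List<double[]> points = new ArrayList<>();
            List<Future<Double>> costs = new ArrayList<>();

            // Submit +step and -step trial points in every coordinate direction.
            for (int i = 0; i < base.length; i++) {
                for (double sign : new double[] { +1.0, -1.0 }) {
                    double[] trial = base.clone();
                    trial[i] += sign * step;
                    Callable<Double> task = () -> f.cost(trial);
                    points.add(trial);
                    costs.add(pool.submit(task));
                }
            }

            // Synchronous barrier: wait for every evaluation before moving.
            double bestCost = f.cost(base);
            double[] best = base;
            for (int k = 0; k < points.size(); k++) {
                double c = costs.get(k).get();
                if (c < bestCost) {
                    bestCost = c;
                    best = points.get(k);
                }
            }
            pool.shutdown();
            return best;
        }
    }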
3) The next algorithm is the obvious combination of the genetic algorithm with the modified Hooke-Jeeves. This hybrid operation is equivalent to the existing PSO + Hooke-Jeeves implementation: the genetic algorithm performs a global search for a near-optimal point, and the result is used as the starting point for Hooke-Jeeves refinement. Discrete variables are fixed at constant values, since Hooke-Jeeves works only on continuous variables (a rough sketch of this flow follows the download link). You may download the compiled file of the hybrid genetic algorithm with Hooke-Jeeves here:
GeneticAlgorithmJGAPGPSHJ.class
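For clarity, the hybrid operation boils down to something like the following fragment. The method and class names are hypothetical and only illustrate the order of operations; they are not GenOpt classes or methods.

    // Hypothetical flow sketch of the hybrid operation (not GenOpt's API):
    double[] gaBest = runGeneticAlgorithm(problem);                  // global search with the JGAP GA
    Problem continuousOnly = fixDiscreteVariables(problem, gaBest);  // freeze discrete variables at the GA result
    double[] refined = runHookeJeeves(continuousOnly, gaBest);       // local refinement of the continuous variables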
4) The ECJ coupling provides two more algorithms. The first is another simple genetic algorithm. It is still in beta (my coupling code, not ECJ itself) and will probably stay that way, as the JGAP implementation is good enough. The second is an interesting implementation of NSGA-II, a multiobjective genetic algorithm, so GenOpt is able to handle multiobjective (or multi-criteria) problems as well! This is a beta version too; a public version will be available soon, but it will remain a beta because I intend to reimplement NSGA-II from scratch in a new optimization package. Let me know if you need any help applying multiobjective algorithms to a research project.
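As a quick reminder of what "multiobjective" means in this context, NSGA-II ranks candidate solutions by Pareto dominance rather than by a single cost value. The small helper below only illustrates the dominance test for minimization problems; it is not code from ECJ or from my coupling.

    // Didactic helper: Pareto dominance for minimization problems.
    public class Pareto {
        // Returns true if objective vector a dominates b: a is no worse in every
        // objective and strictly better in at least one.
        static boolean dominates(double[] a, double[] b) {
            boolean strictlyBetter = false;
            for (int i = 0; i < a.length; i++) {
                if (a[i] > b[i]) {
                    return false;          // worse in one objective: no dominance
                }
                if (a[i] < b[i]) {
                    strictlyBetter = true;
                }
            }
            return strictlyBetter;
        }
    }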
Java programmers will find their way easily (build a project from GenOpt's source code, add the source files to the algorithms folder and add a reference to the JGAP jar). A very rough description of how to make the compiled files work is: a) get GenOpt's jar file and add the .class files to the algorithms folder inside it (a jar is a compressed archive, like a zip); b) add a lib folder next to genopt.jar containing "jgap3.6.2.jar"; c) add a line to the MANIFEST.MF file so that the class path includes JGAP: "Class-Path: lib/jgap3.6.2.jar".
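Assuming the file names above, the resulting layout and manifest entry look roughly like this (exact folder names may differ between GenOpt versions):

    genopt.jar           (the algorithm .class files go into the algorithms folder inside it)
    lib/jgap3.6.2.jar

    and in META-INF/MANIFEST.MF inside genopt.jar:
    Class-Path: lib/jgap3.6.2.jar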
Use the above information at your own risk! I am not responsible for any losses resulting from the use of these algorithms. These algorithms cannot guarantee finding the optimal solution to a problem, and the quality of the result also depends on the problem setup.
I am interested in participating in research projects that use optimization algorithms to improve building design and control efficiency.