Publication type: Proceedings Article
Publication date: 2016-07-20
Abstract
Hyper-heuristics have been used widely to solve optimisation problems, often single-objective and discrete in nature. Herein, we extend a recently proposed selection hyper-heuristic to the multi-objective domain and use it to optimise continuous problems. The MOSSHH algorithm operates as a hidden Markov model, using transition probabilities to determine which low-level heuristic or sequence of heuristics should be applied next. By incorporating dominance into the transition probability update rule and maintaining an elite archive of solutions, MOSSHH generates solutions to multi-objective problems that are competitive with bespoke multi-objective algorithms. When applied to test problems, it is able to find good approximations to the true Pareto front, and yields information about the type of low-level heuristics that it uses to solve the problem.
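The sketch below illustrates the kind of mechanism the abstract describes: a transition-probability matrix over low-level heuristics that acts as the hidden Markov model, a dominance-based reinforcement of transitions, and an elite archive of non-dominated solutions. It is a minimal, hypothetical Python rendering of that idea, not the authors' MOSSHH implementation; the class name, the `reward` parameter, and the specific renormalising update rule are illustrative assumptions.

```python
import random


def dominates(a, b):
    """Pareto dominance for minimisation: a dominates b if it is no worse
    in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


class MOSelectionHyperHeuristic:
    """Illustrative MOSSHH-style selection loop (hypothetical names/parameters).

    The heuristic applied at step t is sampled from the transition-probability
    row of the heuristic applied at step t-1. Transitions that produce
    dominating solutions are reinforced; an external archive keeps the
    non-dominated solutions found so far.
    """

    def __init__(self, heuristics, objectives, reward=0.1):
        self.heuristics = heuristics      # list of callables: solution -> solution
        self.objectives = objectives      # callable: solution -> tuple of objective values
        self.reward = reward              # reinforcement added to successful transitions
        n = len(heuristics)
        self.trans = [[1.0 / n] * n for _ in range(n)]  # uniform initial transition matrix
        self.archive = []                 # elite archive of (solution, objectives)

    def _next_heuristic(self, current):
        # Sample the index of the next heuristic from the current row.
        return random.choices(range(len(self.heuristics)), weights=self.trans[current])[0]

    def _update_archive(self, sol, objs):
        # Keep only mutually non-dominated solutions.
        if any(dominates(a_objs, objs) for _, a_objs in self.archive):
            return
        self.archive = [(s, o) for s, o in self.archive if not dominates(objs, o)]
        self.archive.append((sol, objs))

    def run(self, initial, iterations=1000):
        sol, objs = initial, self.objectives(initial)
        self._update_archive(sol, objs)
        current = random.randrange(len(self.heuristics))
        for _ in range(iterations):
            nxt = self._next_heuristic(current)
            cand = self.heuristics[nxt](sol)
            cand_objs = self.objectives(cand)
            if dominates(cand_objs, objs):
                # Reinforce the transition that led to a dominating solution,
                # then renormalise the row so it remains a probability distribution.
                row = self.trans[current]
                row[nxt] += self.reward
                total = sum(row)
                self.trans[current] = [p / total for p in row]
                sol, objs = cand, cand_objs
            self._update_archive(cand, cand_objs)
            current = nxt
        return self.archive
```

After a run, inspecting `self.trans` is one way to recover the kind of information the abstract mentions about which low-level heuristics (and heuristic sequences) the search favours; the paper's own acceptance and update details may differ from this sketch.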
Top-30 Journals
- Lecture Notes in Computer Science: 2 publications (12.5%)
- Information Sciences: 1 publication (6.25%)
- IEEE Computational Intelligence Magazine: 1 publication (6.25%)
- IEEE Transactions on Automation Science and Engineering: 1 publication (6.25%)
- Springer Optimization and Its Applications: 1 publication (6.25%)
- IEEE Transactions on Systems, Man, and Cybernetics: Systems: 1 publication (6.25%)
- Expert Systems with Applications: 1 publication (6.25%)
- Studies in Computational Intelligence: 1 publication (6.25%)
- Communications in Computer and Information Science: 1 publication (6.25%)
- IEEE Access: 1 publication (6.25%)
Publishers
- Institute of Electrical and Electronics Engineers (IEEE): 6 publications (37.5%)
- Springer Nature: 5 publications (31.25%)
- Elsevier: 2 publications (12.5%)
- Association for Computing Machinery (ACM): 1 publication (6.25%)
- IntechOpen: 1 publication (6.25%)
- We do not take into account publications without a DOI.
- Statistics recalculated weekly.
Metrics
Total citations: 16
Citations from 2024: 3 (18.75%)