<?xml version="1.0" encoding="ISO-8859-1"?>
<metadatalist>
	<metadata ReferenceType="Journal Article">
		<site>plutao.sid.inpe.br 800</site>
		<holdercode>{isadg {BR SPINPE} ibi 8JMKD3MGPCW/3DT298S}</holdercode>
		<identifier>8JMKD3MGP3W/3P5TDP2</identifier>
		<repository>sid.inpe.br/plutao/2017/06.21.20.42</repository>
		<lastupdate>2017:06.23.12.17.27 dpi.inpe.br/plutao@80/2008/08.19.15.01 administrator</lastupdate>
		<metadatarepository>sid.inpe.br/plutao/2017/06.21.20.42.17</metadatarepository>
		<metadatalastupdate>2021:01.02.22.16.59 sid.inpe.br/bibdigital@80/2006/04.07.15.50 administrator</metadatalastupdate>
		<doi>10.1155/2017/8042436</doi>
		<issn>2356-752X</issn>
		<label>lattes: 8920905542032636 1 BarbosaSenn:2017:ApCoDe</label>
		<citationkey>BarbosaSenn:2017:ApCoDe</citationkey>
		<title>Improving the fine-tuning of metaheuristics: an approach combining design of experiments and racing algorithms</title>
		<year>2017</year>
		<typeofwork>journal article</typeofwork>
		<secondarytype>PRE PI</secondarytype>
		<numberoffiles>1</numberoffiles>
		<size>2241 KiB</size>
		<author>Barbosa, Eduardo Batista de Moraes,</author>
		<author>Senne, Edson Luiz França,</author>
		<group>DIDOP-CGCPT-INPE-MCTIC-GOV-BR</group>
		<affiliation>Instituto Nacional de Pesquisas Espaciais (INPE)</affiliation>
		<affiliation>Universidade Estadual Paulista (UNESP)</affiliation>
		<electronicmailaddress>eduardo.barbosa@inpe.br</electronicmailaddress>
		<journal>Journal of Optimization</journal>
		<volume>2017</volume>
		<pages>1-7</pages>
		<transferableflag>1</transferableflag>
		<contenttype>External Contribution</contenttype>
		<versiontype>publisher</versiontype>
		<keywords>Metaheuristics, Fine-tuning, Combinatorial optimization, Nonparametric statistics.</keywords>
		<abstract>Usually, metaheuristic algorithms are adapted to a large set of problems by applying a few modifications to parameters for each specific case. However, this flexibility demands a huge effort to correctly tune such parameters. Therefore, the tuning of metaheuristics arises as one of the most important challenges in the context of research of these algorithms. Thus, this paper aims to present a methodology combining Statistical and Artificial Intelligence methods in the fine-tuning of metaheuristics. The key idea is a heuristic method, called Heuristic Oriented Racing Algorithm (HORA), which explores a search space of parameters looking for candidate configurations close to a promising alternative. To confirm the validity of this approach, we present a case study for fine-tuning two distinct metaheuristics: Simulated Annealing (SA) and Genetic Algorithm (GA), in order to solve the classical traveling salesman problem. The results are compared considering the same metaheuristics tuned through a racing method. Broadly, the proposed approach proved to be effective in terms of the overall time of the tuning process. Our results reveal that metaheuristics tuned by means of HORA achieve, with much less computational effort, similar results compared to the case when they are tuned by the other fine-tuning approach.</abstract>
		<area>MET</area>
		<language>en</language>
		<targetfile>barbosa_improving.pdf</targetfile>
		<usergroup>lattes</usergroup>
		<readergroup>administrator</readergroup>
		<readergroup>lattes</readergroup>
		<visibility>shown</visibility>
		<readpermission>deny from all and allow from 150.163</readpermission>
		<documentstage>not transferred</documentstage>
		<nexthigherunit>8JMKD3MGPCW/43SQKNE</nexthigherunit>
		<dissemination>WEBSCI; PORTALCAPES.</dissemination>
		<hostcollection>dpi.inpe.br/plutao@80/2008/08.19.15.01</hostcollection>
		<username>simone</username>
		<lasthostcollection>dpi.inpe.br/plutao@80/2008/08.19.15.01</lasthostcollection>
		<url>http://plutao.sid.inpe.br/rep-/sid.inpe.br/plutao/2017/06.21.20.42</url>
	</metadata>
</metadatalist>