
Draft:Fixed set search

From Wikipedia, the free encyclopedia

The fixed set search (FSS) is a metaheuristic that has been effectively applied to various combinatorial optimization problems, including the traveling salesman problem (TSP)[1], machine scheduling[2], clique partitioning[3], the covering location with interconnected facilities problem[4] and the minimum weighted vertex cover problem[5]. Additionally, the FSS has been adapted to bi-objective optimization[5] and to a matheuristic setting[6]. As a population-based metaheuristic integrating a local search, FSS is particularly effective for problems where local search methods excel. FSS enhances the greedy randomized adaptive search procedure (GRASP) metaheuristic by adding a learning mechanism to it. GRASP iteratively generates solutions using a randomized greedy algorithm and refines each of them with a local search.

The FSS is inspired by the observation that high-quality solutions often share common elements. The core strategy of FSS is to identify these shared elements, termed the "fixed set", and incorporate them into newly generated solutions, focusing the computational effort on completing these partial solutions by "filling in the gaps". This concept draws on earlier metaheuristic strategies such as chunking[7] and vocabulary building[8]. The idea of viewing a solution as a set of elements has also been used in the ant colony optimization and worm optimization[9] metaheuristics. Note that matheuristic techniques, such as kernel search[10] and heuristic concentration[11], employ a similar strategy of generating numerous solutions to identify common elements of the high-quality ones.

Method outline


The FSS algorithm begins by initializing a population of solutions using GRASP, then employs a learning mechanism to generate fixed sets. These fixed sets are iteratively utilized in an enhanced randomized greedy algorithm, which incorporates a predefined selection of elements into each newly generated solution. Each new solution undergoes a local search to improve its quality. Apart from implementing the corresponding GRASP algorithm, several building blocks are necessary for FSS implementation.


The initial building blocks of the FSS are the same as for the GRASP. The first is a greedy algorithm that can be randomized. The second is a local search that can be applied to solutions generated in this way. The greedy algorithm must additionally be adapted so that it can generate solutions containing a preselected set of elements. A straightforward example for the minimum vertex cover problem is the following: rather than starting with an empty set as the initial partial solution, the greedy algorithm begins with the predetermined fixed set.
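As an illustration, such an adapted constructive step for minimum vertex cover could be sketched as follows. The function name, the restricted-candidate-list size, and the degree-based greedy criterion are illustrative choices, not part of the original description:

```python
import random

def greedy_vertex_cover(edges, fixed_set=frozenset(), rcl_size=3):
    """Randomized greedy for minimum vertex cover.

    Starts from the preselected fixed set instead of the empty set,
    then repeatedly covers remaining edges by picking a vertex at
    random from the rcl_size highest-degree candidates (the RCL).
    """
    cover = set(fixed_set)
    uncovered = {e for e in edges if not (set(e) & cover)}
    while uncovered:
        # Count how many uncovered edges each candidate vertex would cover.
        degree = {}
        for u, v in uncovered:
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
        # Restricted candidate list: the rcl_size best vertices.
        rcl = sorted(degree, key=degree.get, reverse=True)[:rcl_size]
        chosen = random.choice(rcl)
        cover.add(chosen)
        uncovered = {e for e in uncovered if chosen not in e}
    return cover
```

Calling it with an empty fixed set recovers the plain GRASP constructive step.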

Ground set


The next step is to use a ground set based formulation of the combinatorial optimization problem of interest. More precisely, any solution of the problem must be representable as a subset of a ground set of elements G. It should be noted that this type of formulation is possible for any combinatorial optimization problem[12]. For example, in the case of the TSP defined on a graph, the ground set is the set of all edges; in the case of the minimum vertex cover problem, it is the set of all vertices of the graph.
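The ground-set view can be made concrete in a few lines (the graphs below are arbitrary toy instances):

```python
# Ground-set view: a solution is a subset of a fixed ground set.
# Minimum vertex cover: the ground set is the vertex set of the graph.
ground_set = frozenset(range(1, 6))   # vertices 1..5
solution = frozenset({2, 4})          # a candidate cover
assert solution <= ground_set         # any solution is a subset

# TSP on a graph: the ground set is the set of all edges, and a tour
# is represented by the subset of edges it uses.
tsp_ground_set = frozenset({(1, 2), (2, 3), (3, 1), (1, 3)})
tour = frozenset({(1, 2), (2, 3), (3, 1)})
assert tour <= tsp_ground_set
```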

Generation of fixed sets


The third building block is a method for generating multiple fixed sets from a population of solutions P. This method should generate sets containing elements that frequently occur in high-quality solutions, and it should be able to control the size of the fixed set F. When a fixed set is integrated into the randomized greedy algorithm, it must be capable of producing a feasible solution of equal or superior quality to those in the current population of solutions P. In the following text, a method for generating such fixed sets is given.

Let P_n represent the set of the n best (based on the objective function) solutions generated in previous steps. A base solution B is a randomly selected solution from these n best solutions. If a fixed set F satisfies F ⊆ B, it can potentially generate a feasible solution of at least the same quality as B, and F can include any number of elements of B. The objective is for F to contain elements that frequently appear across a collection of high-quality solutions. To that end, let P_kn be a set of k randomly selected solutions from the n best ones, P_kn ⊆ P_n.

Using these building blocks it is possible to define a function Fix(B, P_kn, Size) for generating a fixed set that consists of the Size elements of the base solution B that most frequently occur in the set of test solutions P_kn. Let us define the function C(e, S), for an element e and a solution S, which is equal to 1 if e ∈ S and 0 otherwise. We can define a function that counts the number of occurrences of an element e in the solutions of the set P_kn using the function C as follows:

O(e, P_kn) = Σ_{S ∈ P_kn} C(e, S).

Now, we can define Fix(B, P_kn, Size) as the set of Size elements of B that have the largest value of O(e, P_kn). A graphical illustration of the method for generating fixed sets can be seen in the following figure.
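A minimal Python sketch of this fixed-set generation (the function and variable names follow the text; the implementation details are illustrative):

```python
from collections import Counter

def fix(base, test_solutions, size):
    """Fix(B, P_kn, Size): return the `size` elements of the base
    solution B that occur most frequently in the test solutions P_kn."""
    # O(e, P_kn): number of test solutions containing each element e.
    occurrences = Counter()
    for solution in test_solutions:
        occurrences.update(solution)
    # Rank the elements of B by occurrence count and keep the top `size`.
    ranked = sorted(base, key=lambda e: occurrences[e], reverse=True)
    return frozenset(ranked[:size])
```

For example, with base solution {1, 2, 3, 4} and test solutions [{1, 2}, {1, 3}, {1, 2}], the two most frequent base elements are 1 and 2.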

A graphical illustration of the method for generating fixed sets for the minimum vertex cover problem. It is assumed that the base solution is selected from the n best solutions (P_n), k test solutions (P_kn) are used, and the size of the fixed set is Size. On the left side, each solution corresponds to the blue colored vertices. In the base solution B, the values in the blue vertices are equal to the number of occurrences of these vertices in the set of test solutions (the values of O(e, P_kn)). On the right side is the selected fixed set F (dashed blue vertices), equal to the Size elements that occur most frequently in the test set of solutions P_kn.

Learning mechanism


The learning mechanism in the FSS is facilitated through fixed sets. To achieve this, the randomized greedy algorithm from the GRASP is adapted to incorporate preselected elements that must be included in newly generated solutions. We denote the solution generated with such a randomized greedy algorithm with a fixed (preselected) set of elements F as RGF(F).

In FSS, akin to GRASP, solutions are iteratively generated and subjected to local search. Initially, a population of solutions P is created by executing N iterations of the GRASP algorithm. This population is then used to randomly generate a fixed set F of size Size, following the method outlined in the preceding section. F is utilized to produce a new solution S = RGF(F), upon which local search is applied. The population P is expanded with these newly generated locally optimal solutions. This cycle continues until no new best solutions are discovered for an extended period, signifying stagnation. At this point, the size of the fixed set may be increased if stagnation persists. If the maximum allowed size is reached, the fixed set size resets to the minimum allowed value. This iterative process continues until a stopping criterion is met.

An essential aspect of the algorithm is defining the array of permissible fixed set sizes, tied to the fixed part of the solution. In the original implementation, this array is given by a set of coefficients Coefs: the size of the fixed set used at the j-th level is equal to Coefs[j]·|B|, proportional to the size of the base solution B.
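As a concrete illustration (the coefficient values below are only an example; the original papers tune them per problem), the fixed set sizes derived from a base solution of 50 elements would be:

```python
# Illustrative coefficient array defining the levels of fixed set sizes.
coefs = (0.1, 0.2, 0.4, 0.8)
base_size = 50                                # |B|, size of the base solution
sizes = [int(c * base_size) for c in coefs]   # size at each level
print(sizes)                                  # [5, 10, 20, 40]
```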

Algorithm


The FSS uses the following set of parameters:

  • the number of iterations N performed by the GRASP to generate the initial population of solutions
  • the number of best solutions n, defining P_n, considered in generating the fixed set
  • the number of test solutions k, defining P_kn, used when generating the fixed set
  • the number of iterations (Stag) without a change to P_n after which the FSS is considered stagnant
  • parameters related to the randomization of the greedy algorithm

The FSS is best understood by observing its pseudocode.

Initialize Coefs
Coef = Coefs[1]
Generate initial population P using GRASP(N)
while (not termination condition) do
    Set P_n to the n best solutions in P
    Set P_kn to k random elements of P_n
    Set B to a random solution in P_n
    F = Fix(B, P_kn, Coef * |B|)
    S = RGF(F)
    Apply local search to S
    P = P ∪ {S}
    if no change to P_n in Stag iterations then
        Coef = Coefs.Next
    end if
end while

The first step involves initializing the array of fixed set sizes Coefs and setting the current fixed set coefficient, Coef, to its minimum value. The initialization stage proceeds by generating an initial population P through N iterations of the basic GRASP algorithm.

Each iteration within the main loop comprises several steps. Initially, a random set of test solutions, P_kn, is formed by selecting k elements of P_n, and a random base solution, B, is chosen from the set P_n. Subsequently, the function Fix(B, P_kn, Coef·|B|) is employed to create a fixed set, F. Using F, a new solution, S = RGF(F), is generated utilizing the randomized greedy algorithm with preselected elements, followed by applying a local search to S. We then assess whether S is the new best solution and include it in the set of generated solutions, P. If stagnation is detected, Coef is incremented to the next value in Coefs, with Coefs.Next being the next larger element in the array. If Coef has already reached its maximum, we reset it to the smallest element in Coefs. This process continues until a termination criterion is met.
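A generic driver following this loop could be sketched in Python as follows. The helper functions (rgf, local_search, fix, objective) and all parameter defaults are illustrative placeholders, and stagnation is detected here via the best solution rather than via changes to P_n, a simplification of the described scheme:

```python
import random

def fixed_set_search(rgf, local_search, fix, objective,
                     N=20, n=10, k=4, stag=50,
                     coefs=(0.1, 0.2, 0.4, 0.8), max_iter=500):
    """Sketch of the FSS main loop (minimization).

    rgf(fixed_set)      -> new solution containing fixed_set (randomized greedy)
    local_search(sol)   -> locally improved solution
    fix(B, tests, size) -> `size` elements of B frequent in the test solutions
    objective(sol)      -> value to minimize
    """
    # Initial population: N plain GRASP iterations (empty fixed set).
    population = [local_search(rgf(frozenset())) for _ in range(N)]
    best = min(population, key=objective)
    coef_idx, since_improvement = 0, 0
    for _ in range(max_iter):
        p_n = sorted(population, key=objective)[:n]   # n best solutions P_n
        tests = random.sample(p_n, min(k, len(p_n)))  # k test solutions P_kn
        base = random.choice(p_n)                     # base solution B
        size = int(coefs[coef_idx] * len(base))
        fixed = fix(base, tests, size)                # fixed set F
        sol = local_search(rgf(fixed))                # S = RGF(F), then improve
        population.append(sol)
        if objective(sol) < objective(best):
            best, since_improvement = sol, 0
        else:
            since_improvement += 1
        if since_improvement >= stag:
            # Stagnation: move to the next fixed set size; wrapping around
            # resets Coef to the smallest value once the maximum is reached.
            coef_idx = (coef_idx + 1) % len(coefs)
            since_improvement = 0
    return best
```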


References

  1. ^ Jovanovic, Raka; Tuba, Milan; Voß, Stefan (2019), Blesa Aguilera, Maria J.; Blum, Christian; Gambini Santos, Haroldo; Pinacho-Davidson, Pedro (eds.), "Fixed Set Search Applied to the Traveling Salesman Problem", Hybrid Metaheuristics, vol. 11299, Cham: Springer International Publishing, pp. 63–77, arXiv:1809.04942, doi:10.1007/978-3-030-05983-5_5, ISBN 978-3-030-05982-8, retrieved 2024-04-19
  2. ^ Jovanovic, Raka; Voß, Stefan (2021-10-01). "Fixed set search application for minimizing the makespan on unrelated parallel machines with sequence-dependent setup times". Applied Soft Computing. 110: 107521. doi:10.1016/j.asoc.2021.107521. ISSN 1568-4946.
  3. ^ Jovanovic, Raka; Sanfilippo, Antonio P.; Voß, Stefan (2023-08-16). "Fixed set search applied to the clique partitioning problem". European Journal of Operational Research. 309 (1): 65–81. doi:10.1016/j.ejor.2023.01.044. ISSN 0377-2217.
  4. ^ Lozano-Osorio, Isaac; Sánchez-Oro, Jesús; Martínez-Gavara, Anna; López-Sánchez, Ana D.; Duarte, Abraham (2023). "An Efficient Fixed Set Search for the Covering Location with Interconnected Facilities Problem". In Di Gaspero, Luca; Festa, Paola; Nakib, Amir; Pavone, Mario (eds.). Metaheuristics. Lecture Notes in Computer Science. Vol. 13838. Cham: Springer International Publishing. pp. 485–490. doi:10.1007/978-3-031-26504-4_37. ISBN 978-3-031-26504-4.
  5. ^ a b Jovanovic, Raka; Sanfilippo, Antonio P.; Voß, Stefan (2022-08-01). "Fixed set search applied to the multi-objective minimum weighted vertex cover problem". Journal of Heuristics. 28 (4): 481–508. doi:10.1007/s10732-022-09499-z. ISSN 1572-9397.
  6. ^ Jovanovic, Raka; Voß, Stefan (2024-02-12). "Matheuristic fixed set search applied to the multidimensional knapsack problem and the knapsack problem with forfeit sets". OR Spectrum. 46 (4): 1329–1365. doi:10.1007/s00291-024-00746-2. ISSN 1436-6304.
  7. ^ Woodruff, David L (1998-04-16). "Proposals for chunking and tabu search". European Journal of Operational Research. 106 (2): 585–598. doi:10.1016/S0377-2217(97)00293-2. ISSN 0377-2217.
  8. ^ Sondergeld, Lutz; Voß, Stefan (1999), Voß, Stefan; Martello, Silvano; Osman, Ibrahim H.; Roucairol, Catherine (eds.), "Cooperative Intelligent Search Using Adaptive Memory Techniques", Meta-Heuristics: Advances and Trends in Local Search Paradigms for Optimization, Boston, MA: Springer US, pp. 297–312, doi:10.1007/978-1-4615-5775-3_21, ISBN 978-1-4615-5775-3, retrieved 2024-05-10
  9. ^ Arnaout, Jean-Paul (2020-02-01). "A worm optimization algorithm to minimize the makespan on unrelated parallel machines with sequence-dependent setup times". Annals of Operations Research. 285 (1): 273–293. doi:10.1007/s10479-019-03138-w. ISSN 1572-9338.
  10. ^ Angelelli, Enrico; Mansini, Renata; Grazia Speranza, M. (2010-11-01). "Kernel search: A general heuristic for the multi-dimensional knapsack problem". Computers & Operations Research. Metaheuristics for Logistics and Vehicle Routing. 37 (11): 2017–2026. doi:10.1016/j.cor.2010.02.002. ISSN 0305-0548.
  11. ^ Rosing, K. E.; ReVelle, C. S. (1997-02-16). "Heuristic concentration: Two stage solution construction". European Journal of Operational Research. 97 (1): 75–86. doi:10.1016/S0377-2217(96)00100-2. ISSN 0377-2217.
  12. ^ Dror, Moshe; Orlin, James B. (January 2008). "Combinatorial Optimization with Explicit Delineation of the Ground Set by a Collection of Subsets". SIAM Journal on Discrete Mathematics. 21 (4): 1019–1034. doi:10.1137/050636589. ISSN 0895-4801.