golden_section

GoldenSection

Bases: Search

A search class that performs the golden-section search on a single variable.

Golden-section search is good at finding minimal or maximal values of a unimodal function. Each search step reduces the search range by a constant factor: the golden ratio. More details are available at: https://en.wikipedia.org/wiki/Golden-section_search.

```python
search = GoldenSection(score_fn=lambda search_idx, n: (n - 3)**2, x_min=0, x_max=6, max_iter=10, best_mode="min")
search.fit()
print(search.get_best_parameters())  # {"n": 3, "search_idx": 2}
```
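For readers without FastEstimator installed, the same search can be sketched in plain Python. The `golden_section` helper below is a hypothetical stand-in that mirrors the library's `_fit` loop, not part of the library itself:

```python
import math

def golden_section(score_fn, x_min, x_max, max_iter, integer=True, best_mode="min"):
    # Inverse golden ratio and its square: each step shrinks the range by ~0.618.
    invphi = (math.sqrt(5) - 1) / 2
    invphi2 = (3 - math.sqrt(5)) / 2
    conv = int if integer else float
    better = (lambda p, q: p > q) if best_mode == "max" else (lambda p, q: p < q)
    a, b, h = x_min, x_max, x_max - x_min
    c, d = conv(a + invphi2 * h), conv(a + invphi * h)
    yc, yd = score_fn(c), score_fn(d)
    results = {c: yc, d: yd}
    for _ in range(max_iter):
        if better(yc, yd):
            b, d, yd = d, c, yc
            h *= invphi
            c = conv(a + invphi2 * h)
            yc = score_fn(c)
            results[c] = yc
        else:
            a, c, yc = c, d, yd
            h *= invphi
            d = conv(a + invphi * h)
            yd = score_fn(d)
            results[d] = yd
    # Return the evaluated point with the best score seen so far.
    pick = max if best_mode == "max" else min
    return pick(results, key=results.get)

best_n = golden_section(lambda n: (n - 3) ** 2, x_min=0, x_max=6, max_iter=10, best_mode="min")
print(best_n)  # 3
```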

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `score_fn` | `Callable[[int, Union[int, float]], float]` | Objective function that measures search fitness. One of its arguments must be 'search_idx', which will be automatically provided by the search routine. This can help with file saving / logging during the search. The other argument should be the variable to be searched over. | *required* |
| `x_min` | `Union[int, float]` | Lower limit (inclusive) of the search space. | *required* |
| `x_max` | `Union[int, float]` | Upper limit (inclusive) of the search space. | *required* |
| `max_iter` | `int` | Maximum number of iterations to run. The range at a given iteration i is 0.618**i * (x_max - x_min). Note that the scoring function will always be evaluated twice before any iterations begin. | *required* |
| `integer` | `bool` | Whether the optimized variable is a discrete integer. | `True` |
| `best_mode` | `str` | Whether maximal or minimal fitness is desired. Must be either 'min' or 'max'. | `'max'` |
| `name` | `str` | The name of the search instance. This is used for saving and loading purposes. | `'golden_section_search'` |
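As a sanity check on the `max_iter` behaviour described above, the remaining search range after each iteration can be computed directly (plain Python, independent of the library):

```python
import math

x_min, x_max, max_iter = 0, 6, 10
invphi = (math.sqrt(5) - 1) / 2  # ~0.618, the per-iteration shrink factor

for i in range(max_iter + 1):
    # Range remaining after i iterations: invphi**i * (x_max - x_min)
    print(i, invphi ** i * (x_max - x_min))

# The score function is evaluated twice up front, then once per iteration,
# so at most max_iter + 2 scoring calls are made (possibly fewer in integer
# mode, where rounding can revisit points).
total_evals = max_iter + 2
```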

Raises:

| Type | Description |
| --- | --- |
| `AssertionError` | If `score_fn`, `x_min`, `x_max`, or `max_iter` are invalid. |

Source code in fastestimator/fastestimator/search/golden_section.py
class GoldenSection(Search):
    """A search class that performs the golden-section search on a single variable.

    Golden-section search is good at finding minimal or maximal values of a unimodal function. Each search step reduces
    the search range by a constant factor: the golden ratio. More details are available at:
    https://en.wikipedia.org/wiki/Golden-section_search.

    ```python
    search = GoldenSection(score_fn=lambda search_idx, n: (n - 3)**2, x_min=0, x_max=6, max_iter=10, best_mode="min")
    search.fit()
    print(search.get_best_parameters()) # {"n": 3, "search_idx": 2}
    ```

    Args:
        score_fn: Objective function that measures search fitness. One of its arguments must be 'search_idx' which will
            be automatically provided by the search routine. This can help with file saving / logging during the search.
            The other argument should be the variable to be searched over.
        x_min: Lower limit (inclusive) of the search space.
        x_max: Upper limit (inclusive) of the search space.
        max_iter: Maximum number of iterations to run. The range at a given iteration i is 0.618**i * (x_max - x_min).
            Note that the scoring function will always be evaluated twice before any iterations begin.
        integer: Whether the optimized variable is a discrete integer.
        best_mode: Whether maximal or minimal fitness is desired. Must be either 'min' or 'max'.
        name: The name of the search instance. This is used for saving and loading purposes.

    Raises:
        AssertionError: If `score_fn`, `x_min`, `x_max`, or `max_iter` are invalid.
    """
    def __init__(self,
                 score_fn: Callable[[int, Union[int, float]], float],
                 x_min: Union[int, float],
                 x_max: Union[int, float],
                 max_iter: int,
                 integer: bool = True,
                 best_mode: str = "max",
                 name: str = "golden_section_search"):
        super().__init__(score_fn=score_fn, best_mode=best_mode, name=name)
        assert x_min < x_max, "x_min must be smaller than x_max"
        if integer:
            assert isinstance(x_min, int) and isinstance(x_max, int), \
                "x_min and x_max must be integers when searching in integer mode"
        args = set(inspect.signature(score_fn).parameters.keys()) - {'search_idx'}
        assert len(args) == 1, "the score function should only contain one argument other than 'search_idx'"
        assert max_iter > 0, "max_iter should be greater than 0"
        self.x_min = x_min
        self.x_max = x_max
        self.max_iter = max_iter
        self.integer = integer
        self.arg_name = args.pop()

    def _convert(self, x: Union[float, int]) -> Union[int, float]:
        return int(x) if self.integer else x

    def _is_better(self, a: float, b: float) -> bool:
        return a > b if self.best_mode == "max" else a < b

    def _fit(self):
        a, b, h = self.x_min, self.x_max, self.x_max - self.x_min
        invphi, invphi2 = (math.sqrt(5) - 1) / 2, (3 - math.sqrt(5)) / 2
        c, d = self._convert(a + invphi2 * h), self._convert(a + invphi * h)
        yc = self.evaluate(**{self.arg_name: c})
        yd = self.evaluate(**{self.arg_name: d})
        for _ in range(self.max_iter):
            if self._is_better(yc, yd):
                b, d, yd = d, c, yc
                h = invphi * h
                c = self._convert(a + invphi2 * h)
                yc = self.evaluate(**{self.arg_name: c})
            else:
                a, c, yc = c, d, yd
                h = invphi * h
                d = self._convert(a + invphi * h)
                yd = self.evaluate(**{self.arg_name: d})
        best_results = self.get_best_results()
        print("FastEstimator-Search: Golden Section Search Finished, best parameters: {}, best score: {}".format(
            best_results[0], best_results[1]))
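The constants in `_fit` rely on two identities of the inverse golden ratio, which are what allow each iteration to reuse one previously scored point and evaluate only one new point. They can be verified numerically:

```python
import math

invphi = (math.sqrt(5) - 1) / 2   # 1/phi, ~0.618
invphi2 = (3 - math.sqrt(5)) / 2  # 1/phi**2, ~0.382

# Identity 1: invphi2 is the square of invphi, so shrinking h by invphi
# turns the old invphi point into the new invphi2 point (or vice versa).
assert abs(invphi2 - invphi ** 2) < 1e-12

# Identity 2: invphi + invphi2 == 1, so the interior points
# a + invphi2*h and a + invphi*h sit symmetrically inside [a, b].
assert abs(invphi + invphi2 - 1) < 1e-12
```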