Search Neighborhood

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 267 Experts worldwide ranked by ideXlab platform

J. Jaime Gómez-hernández - One of the best experts on this subject based on the ideXlab platform.

  • One Step at a Time: The Origins of Sequential Simulation and Beyond
    Mathematical Geosciences, 2021
    Co-Authors: J. Jaime Gómez-hernández, R. Mohan Srivastava
    Abstract:

    In the mid-1980s, still in his early 40s, André Journel was already recognized as one of the giants of geostatistics. Many of the contributions from his new research program at Stanford University had centered around the indicator methods that he developed: indicator kriging and multiple indicator kriging. But when his second crop of graduate students arrived at Stanford, indicator methods still lacked an approach to conditional simulation that was not tainted by what André called the ‘Gaussian disease’; early indicator simulations went through the tortuous path of converting all indicators to Gaussian variables, running a turning bands simulation, and truncating the resulting multi-Gaussian realizations. When he conceived of sequential indicator simulation (SIS), even André likely did not recognize the generality of an approach to simulation that tackled the simulation task one step at a time. The early enthusiasm for SIS was its ability, in its multiple-indicator form, to cure the Gaussian disease and to build realizations in which spatial continuity did not deteriorate in the extreme values. Much of Stanford’s work in the 1980s focused on petroleum geostatistics, where extreme values (the high-permeability fracture zones and the low-permeability shale barriers) have much stronger anisotropy, and much longer ranges of correlation in the maximum continuity direction, than mid-range values. With multi-Gaussian simulations necessarily imparting weaker continuity to the extremes, SIS was an important breakthrough. The generality of the sequential approach was soon recognized, first through its analogy with multi-variate unconditional simulation achieved using the lower triangular matrix of an LU decomposition of the covariance matrix as the multiplier of random normal deviates. Modifying LU simulation so that it became conditional gave rise to sequential Gaussian simulation (SGS), an algorithm that shared much in common with SIS. With nagging implementation details like the sequential path and the search neighborhood being common to both methods, improvements in either SIS or SGS often became improvements to the other. Almost half of the contributors to this Special Issue became students of André in the classes of 1984–1988, and several are the pioneers of SIS and SGS. Others who studied later with André explored and developed the first multipoint statistics simulation procedures, which are based on the same concept that underlies sequential simulation. Among his many significant intellectual accomplishments, one of the cornerstones of André Journel’s legacy was sequential simulation, built one step at a time.
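The LU route to unconditional simulation described above (the lower-triangular factor of the covariance matrix multiplied by independent standard normal deviates) can be sketched in a few lines. The function name, grid, and covariance model below are our own illustration, not code from the paper:

```python
import numpy as np

def lu_unconditional_simulation(cov, n_realizations=1, rng=None):
    """Unconditional multi-Gaussian simulation via the lower-triangular
    (Cholesky/LU) factor of the covariance matrix: fields = (L @ z).T,
    where z holds independent standard normal deviates."""
    rng = np.random.default_rng(rng)
    L = np.linalg.cholesky(cov)  # cov = L @ L.T, with L lower triangular
    z = rng.standard_normal((cov.shape[0], n_realizations))
    return (L @ z).T  # each row is one realization with covariance `cov`

# Illustration on a 1-D grid with an exponential covariance (range ~10)
x = np.arange(50.0)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)
fields = lu_unconditional_simulation(cov, n_realizations=5, rng=0)
```

Making this conditional (and drawing one node at a time rather than all at once) is exactly the step that turns LU simulation into sequential Gaussian simulation.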

  • Inverse sequential simulation: Performance and implementation details
    Advances in Water Resources, 2015
    Co-Authors: J. Jaime Gómez-hernández
    Abstract:

    For good groundwater flow and solute transport numerical modeling, it is important to characterize the formation properties. In this paper, we analyze the performance and important implementation details of a new approach for stochastic inverse modeling called inverse sequential simulation (iSS). This approach is capable of characterizing conductivity fields with heterogeneity patterns that are difficult to capture by standard multiGaussian-based inverse approaches. The method is based on the multivariate sequential simulation principle, but the covariances and cross-covariances used to compute the local conditional probability distributions by simple co-kriging are derived from an ensemble of conductivity and piezometric head fields, in a manner similar to how the experimental covariances are computed in ensemble Kalman filtering. A sensitivity analysis is performed on a synthetic aquifer regarding the number of members of the ensemble of realizations, the number of conditioning data, the number of piezometers at which piezometric heads are observed, and the number of nodes retained within the search neighborhood when computing the local conditional probabilities. The results show the importance of having sufficiently large values for all of these parameters if the algorithm is to properly characterize hydraulic conductivity fields with clear non-multiGaussian features.
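The ensemble-based estimation of covariances borrowed from ensemble Kalman filtering can be sketched as follows; the variable names and the synthetic ensemble are our own illustration of the idea, not the paper's code:

```python
import numpy as np

def ensemble_cross_covariance(K_ens, h_ens):
    """Experimental cross-covariance between parameter and state ensembles,
    computed EnKF-style from centered ensemble anomalies.

    K_ens : (ne, nk) ensemble of (log-)conductivity fields, one member per row.
    h_ens : (ne, nh) piezometric heads simulated for the same members.
    Returns the (nk, nh) cross-covariance matrix used in the co-kriging system.
    """
    Kc = K_ens - K_ens.mean(axis=0)  # center each ensemble on its mean
    hc = h_ens - h_ens.mean(axis=0)
    return Kc.T @ hc / (K_ens.shape[0] - 1)

# Example with a synthetic 200-member ensemble (purely illustrative)
rng = np.random.default_rng(0)
K_ens = rng.standard_normal((200, 30))
h_ens = 0.5 * K_ens[:, :10] + 0.1 * rng.standard_normal((200, 10))
C_Kh = ensemble_cross_covariance(K_ens, h_ens)
```

The sensitivity to ensemble size mentioned in the abstract shows up directly here: the smaller `ne` is, the noisier these experimental covariances become.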

Ramiro Varela - One of the best experts on this subject based on the ideXlab platform.

  • Lateness minimization with tabu search for job shop scheduling problem with sequence dependent setup times
    Journal of Intelligent Manufacturing, 2013
    Co-Authors: Miguel A. González, Camino R. Vela, Inés González-Rodríguez, Ramiro Varela
    Abstract:

    We tackle the job shop scheduling problem with sequence-dependent setup times and maximum lateness minimization by means of a tabu search algorithm. We start by defining a disjunctive model for this problem, which allows us to study some of its properties. Using these properties we define a new local search neighborhood structure, which is then incorporated into the proposed tabu search algorithm. To assess the performance of this algorithm, we present the results of an extensive experimental study, including an analysis of the tabu search algorithm under different running conditions and a comparison with state-of-the-art algorithms. The experiments are performed across two sets of conventional benchmarks with 960 and 17 instances respectively. The results demonstrate that the proposed tabu search algorithm is superior to the state-of-the-art methods in both quality and stability. In particular, our algorithm establishes new best solutions for 817 of the 960 instances of the first set and reaches the best known solutions in 16 of the 17 instances of the second set.
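As a hedged illustration of the tabu search template that such algorithms build on (the move structure, cost function, and tenure here are toy stand-ins, not the paper's specialized setup-time neighborhood):

```python
import random

def tabu_search(initial, neighbors, cost, tenure=10, max_iters=500, rng=None):
    """Generic tabu search skeleton.

    neighbors(s) yields (move, new_state) pairs.  A move stays tabu for
    `tenure` iterations after being used, unless it beats the best cost
    found so far (the usual aspiration criterion).
    """
    rng = rng or random.Random(0)
    current, best = initial, initial
    best_cost = cost(initial)
    tabu = {}  # move -> iteration index until which it is forbidden
    for it in range(max_iters):
        candidates = []
        for move, state in neighbors(current):
            c = cost(state)
            if tabu.get(move, -1) > it and c >= best_cost:
                continue  # tabu, and the aspiration criterion fails
            candidates.append((c, rng.random(), move, state))
        if not candidates:
            break
        c, _, move, current = min(candidates)  # best admissible neighbor
        tabu[move] = it + tenure
        if c < best_cost:
            best, best_cost = current, c
    return best, best_cost

# Toy usage: minimize (x - 7)^2 over the integers with +/-1 moves
best, best_cost = tabu_search(
    0,
    lambda x: [((x, x + 1), x + 1), ((x, x - 1), x - 1)],
    lambda x: (x - 7) ** 2,
)
```

In the job shop setting, a "move" would instead reverse a critical-path arc in the disjunctive graph, and the cost would be the maximum lateness of the resulting schedule.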

  • CAEPIA - Weighted tardiness minimization in job shops with setup times by hybrid genetic algorithm
    Advances in Artificial Intelligence, 2011
    Co-Authors: Miguel A. González, Camino R. Vela, Ramiro Varela
    Abstract:

    In this paper we confront weighted tardiness minimization in the job shop scheduling problem with sequence-dependent setup times. We start by extending an existing disjunctive graph model used for makespan minimization to represent the weighted tardiness problem. Using this representation, we adapt a local search neighborhood originally defined for makespan minimization. The proposed neighborhood structure is used in a genetic algorithm hybridized with a simple tabu search method. This algorithm is quite competitive with state-of-the-art methods in solving problem instances from several datasets of both classical JSP and JSP with setup times.

  • Tabu Search and Genetic Algorithm for Scheduling with Total Flow Time Minimization
    2010
    Co-Authors: Miguel A. González, Camino R. Vela, Ramiro Varela
    Abstract:

    In this paper we confront the job shop scheduling problem with total flow time minimization. We start by extending the disjunctive graph model used for makespan minimization to represent the version of the problem with total flow time minimization. Using this representation, we adapt local search neighborhood structures originally defined for makespan minimization. The proposed neighborhood structures are used in a genetic algorithm hybridized with a simple tabu search method, outperforming state-of-the-art methods in solving problem instances from several datasets.

Julián M. Ortiz - One of the best experts on this subject based on the ideXlab platform.

  • Adapting a texture synthesis algorithm for conditional multiple point geostatistical simulation
    Stochastic Environmental Research and Risk Assessment, 2011
    Co-Authors: Álvaro Parra, Julián M. Ortiz
    Abstract:

    Computer vision provides several tools for analyzing and simulating textures. The principles of these techniques are similar to those in multiple-point geostatistics, namely, the reproduction of patterns and consistency in the results from a perceptual point of view, thus ensuring the reproduction of long-range connectivity. The only difference between these techniques and geostatistical simulation accounting for multiple-point statistics is that conditioning is not an issue in computer vision. We present a solution to the problem of conditioning simulated fields while simultaneously honoring multiple-point (pattern) statistics. The proposal is based on a texture synthesis algorithm where a fixed search (causal) pattern is used. Conditioning is achieved by adding a non-causal search neighborhood that modifies the conditional distribution from which the simulated category is drawn, depending on the conditioning information. Results show an excellent reproduction of the features from the training image, while respecting the conditioning information. Some issues related to the data structure and to computational efficiency are discussed.

Ivan Serina - One of the best experts on this subject based on the ideXlab platform.

  • An Empirical Analysis of Some Heuristic Features for Planning through Local Search and Action Graphs
    Fundamenta Informaticae, 2011
    Co-Authors: Alfonso Gerevini, Alessandro Saetti, Ivan Serina
    Abstract:

    Planning through local search and action graphs is a powerful approach to fully-automated planning which is implemented in the well-known LPG planner. The approach is based on a stochastic local search procedure exploring a space of partial plans and several heuristic features with different possible options. In this paper, we experimentally analyze the most important of them, with the goal of understanding and evaluating their impact on the performance of LPG, and of identifying default settings that work well on a large class of problems. In particular, we analyze several heuristic techniques for (a) evaluating the search neighborhood, (b) defining/restricting the search neighborhood, (c) selecting the next plan flaw to handle, (d) setting the “noise” parameter randomizing the search, and (e) computing reachability information that can be exploited by the heuristic functions used to evaluate the neighborhood elements. Some of these techniques were introduced in previous work on LPG, while others are new. Additional experimental results indicate that the current version of LPG using the identified best heuristic techniques as the default settings is competitive with the winner of the last (2008) International Planning Competition.
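The noise mechanism in (d) follows the Walksat idea of occasionally taking a random step instead of the greedy one. A minimal sketch of that scheme (our own simplification with toy "flaws", not LPG's actual code):

```python
import random

def noisy_local_search(initial, neighbors, cost, noise=0.1, max_steps=1000, rng=None):
    """Walksat-style randomized local search: with probability `noise` take a
    random neighbor (to escape local minima), otherwise the best neighbor."""
    rng = rng or random.Random(42)
    current = initial
    best, best_cost = initial, cost(initial)
    for _ in range(max_steps):
        if best_cost == 0:  # nothing left to repair
            break
        nbrs = list(neighbors(current))
        if not nbrs:
            break
        current = rng.choice(nbrs) if rng.random() < noise else min(nbrs, key=cost)
        c = cost(current)
        if c < best_cost:
            best, best_cost = current, c
    return best, best_cost

# Toy usage: "flaws" are set bits, and a neighbor flips one bit
def flip_neighbors(s):
    return [s[:i] + (1 - s[i],) + s[i + 1:] for i in range(len(s))]

plan, flaws = noisy_local_search((1, 1, 1, 1), flip_neighbors, sum, noise=0.1)
```

In LPG the state is a partial plan, the cost counts its remaining inconsistencies, and the neighborhood consists of the plan modifications that repair the selected flaw.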

  • An approach to efficient planning with numerical fluents and multi-criteria plan quality
    Artificial Intelligence, 2008
    Co-Authors: Alfonso Gerevini, Alessandro Saetti, Ivan Serina
    Abstract:

    Dealing with numerical information is practically important in many real-world planning domains where the executability of an action can depend on certain numerical conditions, and the action effects can consume or renew some critical continuous resources, which in PDDL can be represented by numerical fluents. When a planning problem involves numerical fluents, the quality of the solutions can be expressed by an objective function that can take different plan quality criteria into account. We propose an incremental approach to automated planning with numerical fluents and multi-criteria objective functions for PDDL numerical planning problems. The techniques in this paper significantly extend the framework of planning with action graphs and local search implemented in the LPG planner. We define the numerical action graph (NA-graph) representation for numerical plans and we propose some new local search techniques using this representation, including a heuristic search neighborhood for NA-graphs, a heuristic evaluation function based on relaxed numerical plans, and an incremental method for plan quality optimization based on particular search restarts. Moreover, we analyze our approach through an extensive experimental study aimed at evaluating the importance of some specific techniques for the performance of the approach, and at analyzing its effectiveness in terms of fast computation of a valid plan and quality of the best plan that can be generated within a given CPU-time limit. Overall, the results show that our planner performs quite well compared to other state-of-the-art planners handling numerical fluents.

  • An empirical analysis of some heuristic features for local search in LPG
    International Conference on Automated Planning and Scheduling, 2004
    Co-Authors: Alfonso Gerevini, Alessandro Saetti, Ivan Serina
    Abstract:

    LPG is a planner that performed very well in the last International Planning Competition (2002). The system is based on a stochastic local search procedure, and it incorporates several heuristic features. In this paper we experimentally analyze the most important of them with the goal of understanding and evaluating their impact on the performance of the planner. In particular, we examine three heuristic functions for evaluating the search neighborhood and some settings of the "noise" parameter, which randomizes the next search step for escaping from local minima. Moreover, we present and analyze additional heuristic techniques for restricting the search neighborhood and for selecting the next inconsistency to handle. The experimental results show that the use of such techniques significantly improves the performance of the planner.

Barrett W. Thomas - One of the best experts on this subject based on the ideXlab platform.

  • Network design for time-constrained delivery using subgraphs
    Computational Management Science, 2012
    Co-Authors: Hui Chen, Ann Melissa Campbell, Barrett W. Thomas
    Abstract:

    Delivery companies are offering an increasing number of time-definite services. Yet, little research has been done that explores the design of delivery networks that can support these types of services. In this paper, we explore such design problems for networks with a specified number of edges \(B > n-1\), where \(n\) is the number of nodes in the problem. We present a two-phase heuristic solution approach that first constructs a network and then improves the network via local search. For the improvement phase, we extend neighborhood structures that have proven effective for tree-structured solutions and also identify a new search neighborhood that takes advantage of specific features of subgraph solutions. We present a computational analysis of our solution approach as well as managerial insights.
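A hedged sketch of what a construction phase with an edge budget \(B > n-1\) might look like (this is our own reading of the setting: a spanning tree plus the cheapest extra edges; the paper's actual construction heuristic may differ):

```python
import itertools

def build_subgraph(nodes, dist, budget_B):
    """Construction-phase sketch: a minimum spanning tree (n-1 edges) plus
    the cheapest remaining edges until the budget of B edges is spent.
    `dist` must be a symmetric distance function."""
    # Prim's algorithm for the minimum spanning tree
    in_tree = {nodes[0]}
    edges = set()
    while len(in_tree) < len(nodes):
        u, v = min(((u, v) for u in in_tree for v in nodes if v not in in_tree),
                   key=lambda e: dist(*e))
        edges.add(frozenset((u, v)))
        in_tree.add(v)
    # Spend the remaining B - (n-1) budget on the shortest missing edges
    missing = sorted((frozenset(e) for e in itertools.combinations(nodes, 2)
                      if frozenset(e) not in edges), key=lambda e: dist(*e))
    edges.update(missing[:budget_B - (len(nodes) - 1)])
    return edges

# Toy usage: four points on a line, budget of one extra edge beyond the tree
nodes = [0, 1, 2, 5]
edges = build_subgraph(nodes, lambda u, v: abs(u - v), budget_B=4)
```

The improvement phase would then apply local search moves (edge exchanges) to such a subgraph, which is where the new search neighborhood of the paper comes in.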
