External Agent


The Experts below are selected from a list of 37,086 Experts worldwide, ranked by the ideXlab platform

Richard S Zemel - One of the best experts on this subject based on the ideXlab platform.

  • Neural Guided Constraint Logic Programming for Program Synthesis
    Neural Information Processing Systems, 2018
    Co-Authors: Lisa Zhang, Gregory Rosenblatt, William E Byrd, Matthew Might, Ethan Fetaya, Renjie Liao, Raquel Urtasun, Richard S Zemel
    Abstract:

    Synthesizing programs using example input/outputs is a classic problem in artificial intelligence. We present a method for solving Programming By Example problems by using a neural model to guide the search of a constraint logic programming system called miniKanren. Internally, miniKanren represents a PBE problem as recursive constraints imposed by the provided examples. We present a Recurrent Neural Network model and a Gated Graph Neural Network model, both of which use these constraints as input to score candidate programs. We further present a transparent version of miniKanren that can be driven by an External Agent, suitable for use by other researchers. We show that our neural-guided approach using constraints can synthesize programs faster in many cases, and has the potential to generalize to larger problems.
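
The search procedure the abstract describes can be sketched as a best-first loop in which an external agent (here a scoring function standing in for the neural model) repeatedly chooses which candidate the search expands next. Everything below is a toy illustration under assumed names: the problem (growing the string "ab" one character at a time), the `expand` and `score` functions, and the greedy policy are stand-ins, not the actual neuralkanren interface.

```python
def expand(candidate):
    """Refine a partial program; here, extend a string by one character."""
    return [candidate + ch for ch in "ab"]

def score(candidate, target="ab"):
    """Stand-in for the neural model: reward prefixes that match the target,
    with a small penalty per character to discourage long dead ends."""
    matches = sum(1 for c, t in zip(candidate, target) if c == t)
    return matches - 0.1 * len(candidate)

def guided_search(target="ab", max_depth=4, max_steps=50):
    """Best-first search driven by the scorer: the agent picks the most
    promising frontier candidate, checks it, and expands it otherwise."""
    frontier = [""]  # root partial program
    for _ in range(max_steps):
        if not frontier:
            break
        best = max(frontier, key=score)  # the external agent's choice
        frontier.remove(best)
        if best == target:
            return best
        if len(best) < max_depth:
            frontier.extend(expand(best))
    return None
```

With a trained model in place of the heuristic `score`, the same loop prioritizes branches the model believes lead to a program consistent with all examples.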

  • Neural Guided Constraint Logic Programming for Program Synthesis
    arXiv: Learning, 2018
    Co-Authors: Lisa Zhang, Gregory Rosenblatt, William E Byrd, Matthew Might, Ethan Fetaya, Renjie Liao, Raquel Urtasun, Richard S Zemel
    Abstract:

    Synthesizing programs using example input/outputs is a classic problem in artificial intelligence. We present a method for solving Programming By Example (PBE) problems by using a neural model to guide the search of a constraint logic programming system called miniKanren. Crucially, the neural model uses miniKanren's internal representation as input; miniKanren represents a PBE problem as recursive constraints imposed by the provided examples. We explore Recurrent Neural Network and Graph Neural Network models. We contribute a modified miniKanren, drivable by an External Agent, available at https://github.com/xuexue/neuralkanren. We show that our neural-guided approach using constraints can synthesize programs faster in many cases, and importantly, can generalize to larger problems.
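
The abstract's key point is that the model consumes miniKanren's internal constraint representation rather than raw input/output pairs. As a toy illustration, a recursive constraint tree can be linearized into a token sequence for an RNN (a graph model would instead consume the tree's nodes and edges directly). The tuple-based format and the symbols `conj`, `==`, and `apply` below are invented for illustration; the real miniKanren constraint representation differs.

```python
# A hypothetical constraint tree: the candidate program must map both
# provided example inputs to their outputs (conjunction of two equalities).
constraint = ("conj",
              ("==", "out1", ("apply", "prog", "in1")),
              ("==", "out2", ("apply", "prog", "in2")))

def linearize(tree):
    """Flatten a constraint tree into a parenthesized token sequence,
    suitable as input to a sequence model."""
    if isinstance(tree, tuple):
        tokens = ["("]
        for child in tree:
            tokens.extend(linearize(child))
        tokens.append(")")
        return tokens
    return [tree]  # leaf symbol
```

Tokens produced this way would then be embedded and fed to the scoring network, so the model sees the same structure the solver reasons over.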

Lisa Zhang, Raquel Urtasun, Renjie Liao, and Ethan Fetaya - also among the best experts on this subject based on the ideXlab platform (same publications as listed above).