
Gradient Descent over Metagrammars for Syntax-Guided Synthesis. (arXiv:2007.06677v1 [cs.SE])


The performance of a syntax-guided synthesis algorithm is highly dependent on
the provision of a good syntactic template, or grammar. Providing such a
template is often left to the user to do manually, though in the absence of
such a grammar, state-of-the-art solvers will fall back on their own default
grammar, which depends on the signature of the target program to be
synthesized. In this work, we speculate that this default grammar can be
improved upon substantially. We build sets of rules, or metagrammars, for
constructing grammars, and perform a gradient descent over these metagrammars,
aiming to find a metagrammar that solves more benchmarks and solves them faster
on average. We show that the resulting metagrammar enables CVC4 to solve 26%
more benchmarks than the default grammar within a 300s time-out, and that
metagrammars learnt from tens of benchmarks generalize to hundreds of
benchmarks.
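The search the abstract describes can be sketched as a minimal optimization loop. This is a hypothetical illustration, not the paper's implementation: it assumes a metagrammar is represented as a weight vector over candidate grammar rules, and that a black-box `run_benchmarks` callback reports how many benchmarks the induced grammar solves and how fast. Because such an objective is not differentiable, the sketch estimates a gradient by finite differences.

```python
# Hypothetical sketch of gradient descent over metagrammar rule weights.
# Assumptions (not from the paper): a metagrammar is a list of real-valued
# rule weights, and `run_benchmarks(weights)` returns (num_solved, avg_time)
# for the grammar those weights induce.

def objective(weights, run_benchmarks):
    """Score a metagrammar: reward benchmarks solved, penalize average time."""
    solved, avg_time = run_benchmarks(weights)
    return solved - 0.01 * avg_time

def gradient_descent(weights, run_benchmarks, lr=0.1, eps=0.5, steps=20):
    """Ascend the score (descent on its negation) via finite differences."""
    for _ in range(steps):
        base = objective(weights, run_benchmarks)
        grad = []
        for i in range(len(weights)):
            bumped = list(weights)
            bumped[i] += eps
            # Finite-difference estimate of the black-box objective's slope.
            grad.append((objective(bumped, run_benchmarks) - base) / eps)
        weights = [w + lr * g for w, g in zip(weights, grad)]
    return weights
```

In practice, each objective evaluation would invoke the solver on a benchmark suite under a time-out, so the loop above would be far more expensive than this sketch suggests.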
