Importance funsor #578

Draft · wants to merge 8 commits into master

Conversation

@ordabayevy ordabayevy (Member) commented Nov 3, 2021

Importance sampling is represented by an Importance funsor.

  1. Signature: Importance(model, guide, sampled_vars).
  2. When guide is a Delta, it eagerly evaluates to guide + model - guide (see the sketch after this list).
  3. Importance.reduce is delegated to Importance.model.reduce.
  4. (Not implemented) Consider adding a MonteCarlo interpretation for the case when guide is not a Delta.
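
A hypothetical sketch of how item 2 might be registered, following the same pattern-registration idiom used for lazy_importance below (the import paths and the rule body are assumptions, not code from this PR):

# Sketch only: Importance is the funsor proposed in this PR; import paths
# assume funsor's interpretations module layout.
from funsor.delta import Delta
from funsor.interpretations import eager
from funsor.terms import Funsor

@eager.register(Importance, Funsor, Delta, frozenset)
def eager_importance_delta(model, guide, sampled_vars):
    # With a Delta guide, the result is the guide sample carrying the
    # importance weight model - guide (item 2 above).
    return guide + model - guide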

Dice factor as an importance weight

model = Delta(name, point, log_prob)
guide = Delta(name, point, ops.detach(log_prob))
Importance(model, guide, name)
    == guide + model - guide
    == guide + log_prob - ops.detach(log_prob)
    == guide + dice_factor
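
The dice factor log_prob - ops.detach(log_prob) is zero-valued but carries the gradient of log_prob. A minimal plain-PyTorch illustration of that property (not funsor code, just the underlying trick; cost is a stand-in for a downstream cost term):

import torch

log_prob = torch.tensor(-1.3, requires_grad=True)
cost = torch.tensor(2.0)

dice_factor = log_prob - log_prob.detach()  # value is exactly 0
surrogate = dice_factor.exp() * cost        # value equals cost
surrogate.backward()
print(float(dice_factor))   # 0.0
print(float(log_prob.grad))  # 2.0 == cost * d(log_prob)/d(log_prob)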

Lazy interpretation

# Imports below assume funsor's interpretations module layout; Importance is the funsor added in this PR.
from funsor.delta import Delta
from funsor.interpretations import DispatchedInterpretation, reflect
from funsor.terms import Funsor

lazy_importance = DispatchedInterpretation("lazy_importance")

@lazy_importance.register(Importance, Funsor, Delta, frozenset)
def _lazy_importance(model, guide, sampled_vars):
    return reflect.interpret(Importance, model, guide, sampled_vars)

It is used for lazy importance sampling:

with lazy_importance:
    sampled_dist = dist.sample(msg["name"], sample_inputs)

and for the adjoint algorithm:

with lazy_importance:
    marginals = adjoint(funsor.ops.logaddexp, funsor.ops.add, logzq)
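
A hypothetical direct usage of the interpretation (model, guide, and the variable name "x" are placeholders; Importance is the funsor added in this PR):

with lazy_importance:
    q = Importance(model, guide, frozenset({"x"}))
# q stays a lazy Importance term; it is only evaluated when reinterpreted,
# e.g. eagerly or by the adjoint pass shown above.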

Separability

model_a = Delta("a", point_a["i"], log_prob_a)
guide_a = Delta("a", point_a["i"], ops.detach(log_prob_a))
q_a = Importance(model_a, guide_a, {"a"})

model_b = Delta("b", point_b["i"], log_prob_b)
guide_b = Delta("b", point_b["i"], ops.detach(log_prob_b))
q_b = Importance(model_b, guide_b, {"b"})

with lazy_importance:
    (q_a.exp() * q_b.exp() * cost_b).reduce(add, {"a", "b", "i"})
    == [q_a.exp().reduce(add, "a") * (q_b.exp() * cost_b).reduce(add, {"b"})].reduce(add, "i")
    == [1("i") * (q_b.exp().reduce(add, {"b"}) * cost_b(b=point_b))].reduce(add, "i")
    == [1("i") * 1("i") * cost_b(b=point_b)].reduce(add, "i")
    == cost_b(b=point_b).reduce(add, "i")

@ordabayevy ordabayevy added the WIP label Nov 3, 2021
@ordabayevy ordabayevy changed the title from Importance funsor to Use Approximate(ops.sample, ...) for importance sampling on Nov 3, 2021
@ordabayevy ordabayevy marked this pull request as ready for review November 4, 2021 05:32
@ordabayevy ordabayevy changed the title from Use Approximate(ops.sample, ...) for importance sampling back to Importance funsor on Nov 4, 2021
@ordabayevy ordabayevy marked this pull request as draft April 12, 2022 02:02