
Implement opt-in stabilization by computing common operations on log scale #158

Closed
@ricardoV94

Description


A naive way to obtain more stable computational graphs is to compute as much as possible in log space and defer exponentiation until the last moment. Here is an example from investigating #90, with a naive implementation of the 2F1 function:

import numpy as np
import scipy.special as sp

def naive_2F1(a, b, c, z, terms):
    return np.sum(np.fromiter(
        (
            (sp.poch(a, n) * sp.poch(b, n)) / (sp.factorial(n) * sp.poch(c, n)) * z ** n
            for n in range(terms)
        ),
        dtype=np.float64
    ))

def log_poch(a, n):
    # log of the Pochhammer symbol: log(gamma(a + n) / gamma(a))
    return sp.gammaln(a + n) - sp.gammaln(a)

def naive_log_2F1(a, b, c, z, terms):
    return np.exp(sp.logsumexp(np.fromiter(
        (
            (log_poch(a, n) + log_poch(b, n)) - (sp.gammaln(n + 1) + log_poch(c, n)) + n * np.log(z)
            for n in range(terms)
        ),
        dtype=np.float64
    )))

For the values (a, b, c, z) = (1, 1, 1, 0.5), the first function returns nan at 100 terms, whereas the second stays stable up to at least 1 million terms.
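The same comparison in vectorized form (a self-contained check; the names `direct` and `stable` are mine, not part of the proposal):

```python
import numpy as np
from scipy.special import poch, factorial, gammaln, logsumexp

a = b = c = 1.0
z = 0.5
n = np.arange(100)

# Direct scale: poch(a, n) * poch(b, n) ~ (n!)**2 overflows float64
# near n = 99, so the ratio becomes inf/inf = nan and poisons the sum.
with np.errstate(over="ignore", invalid="ignore"):
    direct = np.sum(poch(a, n) * poch(b, n) / (factorial(n) * poch(c, n)) * z**n)

# Log scale: the same ratio becomes sums and differences of gammaln
# terms, and logsumexp keeps the reduction finite.
def log_poch(x, n):
    return gammaln(x + n) - gammaln(x)

log_terms = log_poch(a, n) + log_poch(b, n) - (gammaln(n + 1) + log_poch(c, n)) + n * np.log(z)
stable = np.exp(logsumexp(log_terms))

print(direct)  # nan
print(stable)  # ~2.0, since 2F1(1, 1, 1, z) = 1 / (1 - z)
```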

We probably don't want to rewrite such graphs by default (?), but it would be nice to offer an opt-in rewrite for users.

Note: I don't mean we want to implement 2F1 like this (we don't). It was just used as an example of the type of stabilization rewrite that we could offer.
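For a flavour of what such a rewrite could do on a much simpler graph, consider replacing `log(sum(exp(x)))` with `logsumexp(x)`, i.e. deferring the exponentiation past the reduction. This is a hand-written NumPy sketch of the before/after, not a proposed API:

```python
import numpy as np
from scipy.special import logsumexp

# 2000 log-likelihoods near -800: exp() underflows float64 to 0,
# so the naive reduction collapses to log(0) = -inf.
rng = np.random.default_rng(0)
logp = rng.normal(loc=-800.0, scale=1.0, size=2000)

with np.errstate(divide="ignore"):
    unstable = np.log(np.sum(np.exp(logp)))  # -inf

# The rewritten graph factors out the max before exponentiating,
# which is exactly what logsumexp does internally.
stable = logsumexp(logp)  # finite, roughly -800 + log(2000)
```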
