Bayes-Laplace Rule

Name

UxHwDoubleBayesLaplace, UxHwFloatBayesLaplace — Perform Bayesian inference with distributional arithmetic.

Synopsis

#include <uxhw.h>

double UxHwDoubleBayesLaplace(
double (*statisticalModel)(void *, double),
void * modelParams,
double prior,
double observedData,
size_t numberOfObservations
);

float UxHwFloatBayesLaplace(
float (*statisticalModel)(void *, float),
void * modelParams,
float prior,
float observedData,
size_t numberOfObservations
);

Description

Performs Bayesian inference to calculate and return the posterior distribution of the parameter θ using Bayes' rule (also known as the Bayes-Laplace rule):

P(θ | data) = P(data | θ) · P(θ) / ∫ P(data | θ′) · P(θ′) dθ′.

This function takes a probability distribution P(data | θ) describing the statistical model of the observed data, a probability distribution P(θ) describing the prior distribution, and observed data, where each observation is assumed to be independent and distributed as described by P(data | θ).

The returned posterior includes the normalization constant, which the function calculates from the marginal likelihood (the integral in the denominator above).

Parameters

  • statisticalModel — A pointer to a function that evaluates the distribution P(data | θ). It takes modelParams and a particle-valued theta as its arguments; for a given theta, it returns a distribution for the given observations.

  • modelParams — A pointer to additional parameters required by the distribution P(data | θ) (such as known variances or calibration constants). Pass NULL if none.

  • prior — The prior distribution P(θ) of the parameter θ. This can be a parametric distribution (e.g., UxHwDoubleUniformDist(), UxHwDoubleGaussDist()) or an empirical distribution created from samples (e.g., UxHwDoubleDistFromSamples()).

  • observedData — The empirical distribution of observed data:

    • For a single observation, pass the measured value directly (treated as a point-mass distribution).
    • For multiple observations, create a distribution using UxHwDoubleDistFromSamples(samples, count).
  • numberOfObservations — The number of independent observations in observedData. This parameter scales the joint likelihood across multiple observations, such that P(data | θ) is correctly computed as the product ∏ᵢ P(dataᵢ | θ) over the n independent observations.

Return Values

Returns the posterior distribution P(θ | data).

✏️   Example: Inferring a Sensor's True Value

#include <uxhw.h>
#include <math.h>
#include <stdio.h>

typedef struct
{
    double variance;
    double scale;
} ModelParameters;

/*
 * Statistical model: P(observation | theta) (also known as the likelihood
 * or sampling distribution). Models sensor measurements with Gaussian noise.
 */
double
sensorMeasurementModel(void * args, double theta)
{
    ModelParameters * p = (ModelParameters *)args;

    return UxHwDoubleGaussDist(theta, sqrt(p->variance) * p->scale);
}

enum
{
    kNumObservations = 5,
};

int
main(void)
{
    /*
     * Prior: true value (theta) uniformly distributed between 0 and 2.
     */
    double prior = UxHwDoubleUniformDist(0.0, 2.0);

    /*
     * Observed measurements from the sensor.
     */
    double measurements[kNumObservations] = {0.9, 0.95, 0.99, 1.1, 1.2};
    double observedData = UxHwDoubleDistFromSamples(measurements, kNumObservations);

    /*
     * Set the model-specific parameters.
     */
    ModelParameters params = {.variance = 0.04, .scale = 2.0};

    /*
     * Compute the posterior distribution over the true value.
     */
    double posterior = UxHwDoubleBayesLaplace(
            &sensorMeasurementModel,
            &params,
            prior,
            observedData,
            kNumObservations);

    printf("Prior: %lf\n", prior);
    printf("Posterior: %lf\n", posterior);

    return 0;
}

✏️   Advanced Example: Importance Sampling via Bayesian Inference

#include <uxhw.h>
#include <stdio.h>
#include <math.h>

/*
 * Importance sampling: target p(x) = Gaussian(mu = 5.0, sigma = 5.0),
 * proposal q(x) = Uniform[0, 10], weight w(x) = p(x) / q(x).
 */
typedef struct
{
    double mu;
    double sigma;
    double proposedDensity;
} ImportanceSamplingParameters;

/*
 * kImportanceScale sets the overall width of the surrogate likelihood;
 * kImportanceShift guards against division by zero when the importance
 * weight is vanishingly small.
 */
double const kImportanceScale = 0.1;
double const kImportanceShift = 1e-10;

double
importanceWeightModel(void * params, double x)
{
    ImportanceSamplingParameters * p = (ImportanceSamplingParameters *)params;

    /*
     * Target distribution: p(x) = Gaussian(5.0, 5.0).
     */
    double targetDensity = exp(-0.5 * pow((x - p->mu) / p->sigma, 2.0))
            / (p->sigma * sqrt(2.0 * M_PI));

    /*
     * Importance weight: w(x) = p(x) / q(x).
     */
    double weight = targetDensity / p->proposedDensity;

    /*
     * Return a likelihood that evaluates to the importance weight:
     * a Gaussian centered at the observed data point whose width is
     * inversely proportional to the weight, so that its density at
     * the observation is proportional to the weight.
     */
    double sigma = kImportanceScale / (weight + kImportanceShift);

    return UxHwDoubleGaussDist(0.0, sigma);
}

int
main(void)
{
    /*
     * Proposal distribution (our "prior"): Uniform[0, 10].
     * Easy to sample from, but the wrong shape.
     */
    ImportanceSamplingParameters params = {.mu = 5.0, .sigma = 5.0, .proposedDensity = 0.1};

    double proposal = UxHwDoubleUniformDist(0.0, 10.0);
    printf("Proposal mean: %lf (uniform 0-10)\n", proposal);

    /*
     * Apply importance sampling via Bayes' rule: the likelihood
     * re-weights uniform samples to match Gaussian(5.0, 5.0).
     */
    double target = UxHwDoubleBayesLaplace(
            &importanceWeightModel,
            &params,
            proposal,
            0.0, /* reference point for likelihood evaluation */
            1);

    printf("Target mean: %lf (should be ~5.0)\n", target);

    return 0;
}

Notes

This function performs a distributional update according to Bayes' rule. It correctly computes the marginal likelihood (the normalization constant in the denominator), which is often analytically intractable, allowing you to focus on defining the prior and likelihood model.

UxHwDoubleBayesLaplace() uses the internal distribution representation of its prior parameter to evaluate the statisticalModel() function efficiently across the entire prior distribution, without sampling. It combines the distribution returned by statisticalModel() with the provided observedData distribution to compute the likelihood of observing the observedData given the prior, multiplies this likelihood by the prior, and returns the resulting posterior distribution.

Unlike MCMC (which requires sampling and convergence diagnostics), variational inference (which typically imposes mean-field or other structural approximations), or the Laplace approximation (which relies on unimodality and smoothness assumptions on the probability density function), this distributional approach computes posteriors directly without imposing structural assumptions on the posterior distribution.