Sampling from prior without running a separate model


Problem Description


I want to graph the histograms of parameter estimates from a stan model against the priors for those parameters. I have tried doing this by running a model in stan, graphing it with ggplot2, then overlaying an approximation of the prior distribution using R's random generator function (e.g. rnorm(), rbinom()) but I have run into many scaling issues that make the graphs impossible to get looking right.


I was thinking a better way to do it would be simply to sample directly from the prior distribution and then graph those samples against the parameter estimates, but running a whole separate model just to sample from the priors seems very time-consuming. I was wondering if there was a way to do this within, or rather parallel-to, an existing model.

Here is an example script.

# load the packages the script uses
library(rstan)
library(ggplot2)

# simulate linear model
a <- 3 # intercept
b <- 2 # slope

# data
x <- rnorm(28, 0, 1)
eps <- rnorm(28, 0, 2)
y <- a + b*x + eps

# put data into list
data_reg <- list(N = 28, x = x, y = y)

# create the model string

ms <- "
  data {
    int<lower=0> N;
    vector[N] x;
    vector[N] y;
  }
  parameters {
    real alpha;
    real beta;
    real<lower=0> sigma;
  }
  model {
    vector[N] mu;
    sigma ~ cauchy(0, 2);
    beta ~ normal(0, 10);
    alpha ~ normal(0, 100);
    for (i in 1:N) {
      mu[i] = alpha + beta * x[i];
    }
    y ~ normal(mu, sigma);
  }
"

# now fit the model in stan
fit1 <- stan(model_code = ms,     # model string
             data = data_reg,        # named list of data
             chains = 1,             # number of Markov chains
             warmup = 1e3,          # number of warmup iterations per chain
             iter = 2e3)          # total number of iterations per chain

# extract the sample estimates
post <- extract(fit1, pars = c("alpha", "beta", "sigma"))
postDF <- as.data.frame(post)

# now for the density plots. Write a plotting function
densFunct <- function (parName) {
  g <- ggplot(postDF, aes_string(x = parName)) + 
              geom_histogram(aes(y=..density..), fill = "white", colour = "black", bins = 50) +
              geom_density(fill = "skyblue", alpha = 0.3)
  return(g)
}

# plot 
gridExtra::grid.arrange(grobs = lapply(names(postDF), function (i) densFunct(i)), ncol = 1)


Now I understand that I can sample from the prior by simply omitting the likelihood from the model string, like so

ms <- "
  data {
    int<lower=0> N;
    vector[N] x;
    vector[N] y;
  }
  parameters {
    real alpha;
    real beta;
    real<lower=0> sigma;
  }
  model {
    sigma ~ cauchy(0, 2);
    beta ~ normal(0,10);
    alpha ~ normal(0,100);
  }
"


But is there any way to get the samples from the prior within the first model? Maybe via the generated quantities block?

Answer


There are two ways you can do this.


First, if the program is general enough, just pass in zero-size data so that the posterior is the prior. For instance, N = 0 in the regression example you gave will work (along with the right zero-sized x and y).
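For the regression above, a minimal sketch of this first approach (it reuses the `ms` model string from the question; the object names `prior_data` and `fit_prior` are my own):

```r
# refit the same model with zero-size data: with no observations,
# the likelihood contributes nothing and the posterior equals the prior
prior_data <- list(N = 0, x = numeric(0), y = numeric(0))

fit_prior <- stan(model_code = ms,
                  data = prior_data,
                  chains = 1,
                  warmup = 1e3,
                  iter = 2e3)

prior_draws <- extract(fit_prior, pars = c("alpha", "beta", "sigma"))
```

Note this still runs MCMC, so the prior draws are autocorrelated, and the heavy-tailed Cauchy prior on sigma can make mixing slow.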


Second, you can write a pure Monte Carlo generator (doesn't use MCMC) in the generated quantities block. Something like:

generated quantities {
  real<lower = 0> sigma_sim = fabs(cauchy_rng(0, 2));  // half-Cauchy; wide tail warning!
  real beta_sim = normal_rng(0, 10);
  real alpha_sim = normal_rng(0, 100);  // match the model's normal(0, 100) prior
}


The second approach is much more efficient as it conveniently draws an independent sample and doesn't have to do any MCMC.
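To get the prior draws within the first model, as the question asks, the generated quantities block can simply be appended to the original program, so every posterior iteration also carries an independent prior draw. A sketch (the `_prior` suffix is my own naming; `fabs()` keeps the half-Cauchy draw on sigma's positive support):

```stan
model {
  vector[N] mu;
  sigma ~ cauchy(0, 2);
  beta ~ normal(0, 10);
  alpha ~ normal(0, 100);
  for (i in 1:N) {
    mu[i] = alpha + beta * x[i];
  }
  y ~ normal(mu, sigma);
}
generated quantities {
  // independent Monte Carlo draws from the priors, one per iteration
  real sigma_prior = fabs(cauchy_rng(0, 2));  // half-Cauchy, matches <lower=0>
  real beta_prior = normal_rng(0, 10);
  real alpha_prior = normal_rng(0, 100);
}
```

After fitting, `extract(fit1, pars = c("alpha_prior", "beta_prior", "sigma_prior"))` should return the prior samples alongside the posterior ones, ready for the same plotting function.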

