Pseudo-likelihood combining iterated filtering and probe matching #206
-
This is more a question than a request. My data are noisy, with weak dynamical signals, despite the fitted filtering median from …
-
@Fuhan-Yang: It's an interesting question. Theoretically, the likelihood is a kind of "master summary statistic" in the sense that no other summary statistic can give information that adds to that given by the likelihood itself. If you find that the model you are proposing is not capturing the signals you think you see in the data, you might propose a better model. If it turns out to be very difficult to imagine a better model, then synthetic likelihood becomes an attractive option, since it allows you to pick out those features of the data that you want the model to match and to ignore the rest.
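To make this concrete, here is a sketch of the Gaussian synthetic likelihood in the sense of Wood (2010), which is the quantity probe matching works with, the probes playing the role of the summary statistics. Simulate $R$ datasets at parameter $\theta$, compute the probe vector $s_r$ for each, and compare the data's probe vector $s^*$ with the simulated mean and covariance. The notation is generic, not pomp's internal code:

$$
\hat{\mu}_\theta = \frac{1}{R}\sum_{r=1}^{R} s_r,
\qquad
\hat{\Sigma}_\theta = \frac{1}{R-1}\sum_{r=1}^{R}\bigl(s_r-\hat{\mu}_\theta\bigr)\bigl(s_r-\hat{\mu}_\theta\bigr)^{\top},
$$

$$
\ell_S(\theta) = -\tfrac{1}{2}\,\bigl(s^{*}-\hat{\mu}_\theta\bigr)^{\top}\hat{\Sigma}_\theta^{-1}\bigl(s^{*}-\hat{\mu}_\theta\bigr)
-\tfrac{1}{2}\log\det\hat{\Sigma}_\theta + \text{const}.
$$

Maximizing $\ell_S(\theta)$ matches only the features captured by the chosen probes and ignores everything else, which is the sense in which you get to pick out the features you want the model to match.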
-
Thanks for the explanation! In my understanding, probe matching calculates the normal synthetic likelihood of the …
-
@Fuhan-Yang: These are also good questions. One can approach the problem of maximizing a noisy function like the synthetic likelihood in two ways.

First, as you suggest, one can fix the seed, turning the noisy function into a deterministic but at least somewhat rugged function. Deterministic optimizers can then be applied, but the ruggedness means that one has to beware of local maxima that are globally sub-optimal. In addition, as you say, one has to give some thought to the dependence of the surface on the random seed. If the summary statistics combine information from many observations, as they often do, then the differences due to the random seed will very often be quite small. In any case, the statistical uncertainty in parameter estimates is distinct from the Monte Carlo error.

The second approach is to use a stochastic optimizer on the noisy function. One might even be so bold as to throw a nominally deterministic but quite robust optimizer such as Nelder-Mead at the noisy function. This has been known to give good results.
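In pomp, the first (fixed-seed) approach might look roughly like the sketch below. This assumes the `probe_objfun` constructor and the bundled `gompertz()` toy model of pomp >= 2; the probes, `nsim`, `seed`, and starting values are purely illustrative, and the argument names should be checked against the current manual. As I understand it, the constructed objective is arranged so that minimizing it maximizes the synthetic likelihood, which is why it can be handed straight to `optim()`.

```r
## Minimal sketch of the fixed-seed approach (pomp >= 2 assumed).
## gompertz() is a toy model shipped with pomp; the probes and settings
## below are illustrative, not a recommendation.
library(pomp)

po <- gompertz()

## Fixing 'seed' makes the probe-matching objective a deterministic
## (if somewhat rugged) function of the parameters.
ofun <- probe_objfun(
  po,
  probes = list(
    mean = probe_mean("Y"),
    sd   = probe_sd("Y"),
    acf  = probe_acf("Y", lags = 1:3)
  ),
  nsim = 500,
  seed = 1066L,
  est  = c("r", "sigma", "tau")
)

## A deterministic optimizer can now be applied; beware of local maxima.
fit <- optim(
  par     = coef(po, c("r", "sigma", "tau")),
  fn      = ofun,
  method  = "Nelder-Mead",
  control = list(maxit = 2000)
)
fit$par
```

Nelder-Mead is used here because, as noted above, it is fairly robust; on a rugged surface you will still want to try multiple starting points.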
-
I see. Since the function becomes deterministic once the seed is fixed, the uncertainty no longer comes from stochasticity but is instead estimation error. In that case we can use standard methods (such as profile likelihood) to quantify the estimation uncertainty, while also recognizing the extra uncertainty that comes from the choice of seed for the simulations. I guess this would be fine for model comparison, but it may still cause issues in forecasting. I'm trying the first approach, but with a stochastic optimizer, simulated annealing: the deterministic optimizers (Nelder-Mead, and also MCMC) could barely move away from the initial parameters, which may indicate that the surface is rugged. So now I run simulated annealing over the entire parameter ranges, then narrow the ranges (by selecting the parameter sets with the top 5% of likelihoods) and repeat the analysis. This seems to be working, as the likelihood reaches a maximum and stays there. Simulated annealing behaves better than Nelder-Mead. I haven't tried other stochastic optimizers similar to mif2 (the kind that simulates and estimates at the same time) in probe matching; if you have any experience with that, I would like to hear it!
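For reference, here is a generic sketch of the workflow described above: random starts over a box, simulated annealing via `optim()`'s `"SANN"` method, keep the top 5% of fits, shrink the box, repeat. The objective `negobj` is a hypothetical stand-in for a noisy negative synthetic log-likelihood; in practice you would substitute your own probe-matching objective.

```r
## Generic sketch of the "anneal, keep the top 5%, shrink, repeat" workflow.
## 'negobj' is a hypothetical stand-in for a noisy negative synthetic
## log-likelihood; substitute your own probe-matching objective.
set.seed(2024)

negobj <- function(theta) {
  ## rugged, noisy toy surface with a minimum near theta = c(1, 2)
  sum((theta - c(1, 2))^2) + 0.5 * sin(10 * sum(theta)) + rnorm(1, sd = 0.1)
}

lower <- c(-5, -5)
upper <- c( 5,  5)

for (round in 1:3) {
  ## random starting points over the current box
  starts <- replicate(200, runif(length(lower), lower, upper))
  fits <- lapply(seq_len(ncol(starts)), function(j)
    optim(starts[, j], negobj, method = "SANN",
          control = list(maxit = 2000, temp = 1))
  )
  vals <- sapply(fits, function(f) f$value)
  pars <- sapply(fits, function(f) f$par)   # one column per fit
  ## keep the parameter sets with the best 5% of objective values
  keep <- pars[, vals <= quantile(vals, 0.05), drop = FALSE]
  ## shrink the search box to the spread of the kept fits
  lower <- apply(keep, 1, min)
  upper <- apply(keep, 1, max)
}

rowMeans(keep)   # rough point estimate from the final round
```

Cooling the annealing temperature across rounds, or re-running from the final box with more iterations, would be natural refinements; the above is just the plain version of the procedure described.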