When computing profile likelihood confidence intervals for parameters determined by nonlinear regression, why does Prism sometimes report "infinity" or "???" instead of a value?
When does Prism report "infinity"?
Prism reports that the upper confidence limit is infinity when the model and data simply don't define an upper confidence limit. If you are aiming for 95% confidence limits, Prism reports that the limit is infinite when the confidence level stays below 95% no matter how large a value the algorithm tries for that limit.
Similarly, Prism reports -infinity when the model and data simply don't define the lower confidence limit.
When you see that a confidence limit is infinity (or -infinity), you can conclude that your data simply don't define that parameter in the model very well. One example would be if you are fitting an asymmetrical ("five parameter") log(dose)-response curve, and the confidence limit for the asymmetry parameter is infinity. This can happen when your data form a symmetrical log(dose)-response curve, so there is no information in the data to define the asymmetry parameter.
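To make this concrete, here is a minimal sketch in Python (using NumPy and SciPy) of the general profile likelihood idea. This is not Prism's code: the Michaelis-Menten model, the parameter values, the F-test on sums of squares, and the scan range are all assumptions chosen for illustration. Because the simulated data only cover concentrations far below Km, re-fitting Km compensates for any huge trial value of Vmax, the fit never gets significantly worse, and the upper limit comes out infinite.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import f as f_dist

def model(x, vmax, km):
    # Michaelis-Menten curve, used here only as an illustration
    return vmax * x / (km + x)

rng = np.random.default_rng(1)
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])   # concentrations far below the true Km
y = model(x, 100.0, 50.0) + rng.normal(0.0, 0.5, x.size)

def ss(params):
    # sum of squared residuals for a given (vmax, km) pair
    return float(np.sum((y - model(x, params[0], params[1])) ** 2))

# Full (unconstrained) fit of both parameters.
full = minimize(ss, x0=[10.0, 10.0], method="Nelder-Mead")
ss_full = full.fun
n, k = x.size, 2
f_crit = f_dist.ppf(0.95, 1, n - k)            # F threshold for a 95% profile limit

# Profile Vmax upward: hold it fixed at ever larger trial values,
# re-fit Km, and ask whether the fit gets significantly worse.
upper_limit = np.inf
for vmax_trial in np.geomspace(max(full.x[0], 1.0), 1e6, 60):
    constrained = minimize(lambda km: ss([vmax_trial, km[0]]),
                           # start Km at the value that keeps Vmax/Km at its best-fit ratio
                           x0=[full.x[1] * vmax_trial / full.x[0]],
                           method="Nelder-Mead")
    f_ratio = (constrained.fun - ss_full) / (ss_full / (n - k))
    if f_ratio > f_crit:
        upper_limit = vmax_trial
        break

print("Upper 95% limit for Vmax:",
      upper_limit if np.isfinite(upper_limit)
      else "infinity (the threshold is never crossed)")
```

In this sketch the scan stops at an arbitrary large value (1e6); the point is that no matter how far you push the parameter, the data never rule that value out, which is exactly the situation where a limit of infinity is the honest answer.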
When does Prism report "???" ?
Prism reports ??? for a confidence limit when its calculations got interrupted, so it was unable to compute the limit.
Why would the calculations get interrupted? There are several reasons.
Simple case you can easily work around in Prism 7 (but not 8)
Computing the profile likelihood intervals is an iterative process. You set the maximum number of iterations in the Diagnostics tab of the nonlinear regression dialog. In Prism 7, the same setting limits the number of iterations both for the main nonlinear regression (which finds the best-fit values of the parameters) and for computing the profile likelihood confidence intervals.
To find out if this is the reason for ???, enter a higher limit in the Diagnostics tab. If this was the problem, Prism 7 will now report the confidence limit rather than ???. Prism 8 works differently, so this workaround won't help there.
Another possible case you can work around
The problem may be that your data contain extremely large or extremely small values. If your values look something like 1.23e-45 or 1.23e45, try transforming to different units so the numbers are neither huge nor tiny. This may let Prism find the confidence limits.
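For example (a hypothetical rescaling, not a Prism feature): if your concentrations are stored in molar units and look like 1.23e-9, re-expressing them in nanomolar carries the same information with numbers of ordinary size, which any curve-fitting routine handles more reliably.

```python
import numpy as np

# Hypothetical example of changing units before fitting.
conc_molar = np.array([1.23e-9, 4.56e-9, 9.87e-9])   # awkwardly tiny numbers
conc_nanomolar = conc_molar * 1e9                     # same data in friendlier units
print(conc_nanomolar)                                 # [1.23 4.56 9.87]
```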
Other reasons that you cannot work around
There are other reasons for the calculations to get interrupted so that Prism reports ???. They happen when the model equation is fundamentally difficult to fit, or when the data simply don't define the model very well. This could be because there are too few values, no values over important ranges of X, too much scatter, and so on. Even though Prism can't explain why it reports ???, you can conclude that there is no way to compute a reasonably narrow, useful confidence interval from those data with that model.
When does Prism report "very wide"?
Prism reports that a confidence interval is "very wide" when the fit is ambiguous.
Keywords: ??, ???, ????, ?????, infinite, CI, PL