In dsge_likelihood, there are several instances where the penalty is set to 100 (or 1) in cases where there is no way to compute an amount proportional to the violation.
Such values seem large to me and likely to create cliffs in the objective function. A value such as 0.1 should be sufficient, since the penalty is always added to the last accepted value of the objective function.
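The mechanism under discussion can be sketched as follows. This is a minimal illustration in Python, not Dynare's actual MATLAB code; the wrapper name, the use of an exception to signal an unevaluable point, and the fallback logic are all assumptions made for the example:

```python
def make_penalized_objective(objective, penalty=0.1):
    """Wrap an objective so that evaluation failures return the last
    accepted value plus a small fixed penalty (hypothetical helper,
    illustrating the idea only)."""
    state = {"last_accepted": 0.0}

    def wrapped(x):
        try:
            value = objective(x)
        except ValueError:
            # No way to measure how badly the point violates the
            # constraints, so return the last accepted value plus a
            # flat penalty. A large penalty (e.g. 100) creates a cliff
            # relative to the local scale of the objective; a small one
            # (e.g. 0.1) still makes the point unattractive.
            return state["last_accepted"] + penalty
        state["last_accepted"] = value
        return value

    return wrapped


if __name__ == "__main__":
    def objective(x):
        if x < 0:
            raise ValueError("likelihood cannot be evaluated here")
        return x * x

    f = make_penalized_objective(objective, penalty=0.1)
    print(f(2.0))   # accepted point
    print(f(-1.0))  # failure: last accepted value + penalty
    print(f(3.0))   # accepted again
```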
What do you think?
I agree. There are too many cases where the optimizers get stuck. This can only help.
On 20.09.2015 at 15:59, Michel Juillard wrote:
> In dsge_likelihood, there are several instances where the penalty is set to 100 (or 1) in cases where there is no way to compute an amount proportional to the violation.
> Such values seem large to me and likely to create cliffs in the objective function. A value such as 0.1 should be sufficient, since the penalty is always added to the last accepted value of the objective function.
> What do you think?