On Friday 04 December 2009 at 09:05 +0100, Michel Juillard wrote:
Hi Houtan,
Dear Michel and Sébastien,
Everything is coded and tested for the svar_identification block. However, I want to check my implementation of the maximum lag length (r) with you. In order to obtain r, I have introduced a temporary variable named max_endo_lag in ParsingDriver, which is updated every time add_model_variable() is called. Is this the best way to do it, or can you think of a better one? I should point out that, as I understand it, the variables named max_endo_lag in DynamicModel do not actually contain the original maximum endogenous lag of the model (which is what I understand the value of r should be). Otherwise, I would simply have included dynamic_model in the SvarIdentificationStatement class.
This is true: (S)VARs will be the only type of model left with lags of more than one period.
Actually, for deterministic models we don't remove leads and lags of more than one period. Currently, the transformation is only applied to stochastic models.
What I don't understand is why you rely on add_model_variable() for computing a maximum endogenous lag, since, as Michel points out, there is no model block when doing a (S)VAR... If this means that you store VariableNodes inside SvarIdentificationStatement, it is probably a bad idea: you'd better store symbol IDs and leads/lags directly as integers.
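To make the suggestion concrete, here is a minimal sketch of storing restrictions as plain (symbol id, lag) integer pairs instead of VariableNodes. The type name svar_restriction_t and the helper function are hypothetical illustrations, not taken from the actual Dynare preprocessor sources:

```cpp
#include <utility>
#include <vector>

// One restriction entry: which endogenous symbol, at which lag.
// (Hypothetical type name, for illustration only.)
typedef std::pair<int, int> svar_restriction_t; // (symbol_id, lag)

// Build a restriction list: e.g. symbol 3 at lag 1, symbol 5 at lag 4.
std::vector<svar_restriction_t>
example_restrictions()
{
  std::vector<svar_restriction_t> restrictions;
  restrictions.push_back(std::make_pair(3, 1));
  restrictions.push_back(std::make_pair(5, 4));
  return restrictions;
}
```

Storing plain integers keeps the statement class independent of the expression-tree machinery, which is only needed when there is a model block.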
Otherwise, to be more precise, what are you computing exactly? Since there is no model block, there is no concept of model maximum endogenous lag.
If you mean the maximum endogenous lag inside the svar_identification block, then this should be computed by a method inside the SvarIdentificationStatement class, depending on when you need that information... Doing it during the parsing pass breaks the idea of clearly separating the various tasks.
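Such a method could look like the following sketch. The class layout and member names here are illustrative assumptions for the sake of the example, not the actual Dynare preprocessor code:

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Hypothetical simplified view of the statement class: restrictions are
// stored as (symbol_id, lag) pairs, and the maximum lag is derived on
// demand rather than tracked during parsing.
class SvarIdentificationStatement
{
public:
  std::vector<std::pair<int, int>> restrictions; // (symbol_id, lag)

  // Compute the maximum lag appearing in the svar_identification block.
  int getMaxLag() const
  {
    int max_lag = 0;
    for (const std::pair<int, int> &r : restrictions)
      max_lag = std::max(max_lag, r.second);
    return max_lag;
  }
};
```

Computing the value lazily this way keeps ParsingDriver free of VAR-specific bookkeeping, and the result is always consistent with whatever restrictions were actually stored.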
Sorry if I am missing something.
Best