pyblp.ProblemResults.run_hansen_test

ProblemResults.run_hansen_test()

Test the validity of overidentifying restrictions with the Hansen \(J\) test.

Following Hansen (1982), the \(J\) statistic is

(1)\[J = N\bar{g}(\hat{\theta})'W\bar{g}(\hat{\theta})\]

where \(\bar{g}(\hat{\theta})\) is defined in (11) and \(W\) is the optimal weighting matrix in (24).

Note

The statistic can equivalently be written as \(J = Nq(\hat{\theta})\), where the GMM objective value \(q(\hat{\theta})\) is defined in (10), or as \(J = q(\hat{\theta})\) if the GMM objective value was scaled by \(N\), which is the default behavior.

When the overidentifying restrictions in this model are valid, the \(J\) statistic is asymptotically \(\chi^2\) with degrees of freedom equal to the number of overidentifying restrictions. This requires that there are more moments than parameters.
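As a rough illustration of the statistic itself (not pyblp's internal implementation), the quadratic form in (1) and its degrees of freedom can be sketched with NumPy. All quantities below are hypothetical placeholders: the moments are random draws standing in for \(g_i(\hat{\theta})\), and the weighting matrix is the usual inverse of the moments' second-moment matrix.

```python
import numpy as np

# Hypothetical setup: N observations, 5 moments, 3 parameters,
# so there are 5 - 3 = 2 overidentifying restrictions.
rng = np.random.default_rng(0)
N = 1000
g = rng.normal(size=(N, 5))         # stand-in for per-observation moments g_i(theta_hat)
g_bar = g.mean(axis=0)              # sample mean of the moments
W = np.linalg.inv(g.T @ g / N)      # estimate of the optimal weighting matrix
J = N * g_bar @ W @ g_bar           # the Hansen J statistic from (1)
degrees_of_freedom = 5 - 3          # moments minus parameters
```

Under valid restrictions, `J` would be compared against a \(\chi^2\) critical value with `degrees_of_freedom` degrees of freedom.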

Warning

This test requires ProblemResults.W to be an optimal weighting matrix, so it should typically be run only after two-step GMM or after one-step GMM with a pre-specified optimal weighting matrix.

Returns

The \(J\) statistic.

Return type

float

Examples