
Monday, March 06, 2006

Comments

Lord

Or fitting the first half of the data and predicting the second half to see how well it holds up.
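
A minimal sketch of that split-sample check, fitting a simple OLS model on the first half of some simulated data and comparing in-sample against out-of-sample error (the data, variable names and use of statsmodels here are purely illustrative):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    x = rng.normal(size=n)
    y = 1.0 + 0.5 * x + rng.normal(scale=0.8, size=n)

    X = sm.add_constant(x)
    half = n // 2

    # Fit on the first half of the sample only
    fit = sm.OLS(y[:half], X[:half]).fit()

    # Predict the held-out second half
    pred = fit.predict(X[half:])

    in_rmse = np.sqrt(np.mean(fit.resid ** 2))
    out_rmse = np.sqrt(np.mean((y[half:] - pred) ** 2))
    print(f"in-sample RMSE:     {in_rmse:.3f}")
    print(f"out-of-sample RMSE: {out_rmse:.3f}")

If the out-of-sample error is much larger than the in-sample error, the first-half fit was probably mining noise rather than capturing a real relationship.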

Marc Shivers

I'm guessing I'm in the minority on this, but when it comes to trying to understand the underlying structure of a set of data, by far the most productive approach for me has been visualization first, then econometrics (and then repeat several times).
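
One way to make that loop concrete, sketched here with simulated data (matplotlib and statsmodels are just one possible toolkit): plot the raw data, fit a deliberately simple model, then plot the residuals to see what the first pass missed before respecifying.

    import numpy as np
    import matplotlib.pyplot as plt
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, 10, 150))
    y = 2.0 + 0.3 * x + 0.05 * x ** 2 + rng.normal(scale=0.5, size=x.size)

    # Step 1: look at the raw data first
    plt.scatter(x, y, s=10)
    plt.title("Raw data")
    plt.show()

    # Step 2: estimate a deliberately simple linear model
    fit = sm.OLS(y, sm.add_constant(x)).fit()

    # Step 3: look again -- residuals plotted against x reveal the
    # curvature the linear fit missed, suggesting the next specification
    plt.scatter(x, fit.resid, s=10)
    plt.axhline(0, color="k", lw=1)
    plt.title("Residuals from the linear fit")
    plt.show()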

New Economist

I'm with you, Marc - it's an iterative process.

Naunihal Singh

It's tricky to do what Marc suggests without curve fitting, especially when you iterate through repeated attempts to predict the data as well as you can. There's a good argument for enumerating your theoretical beliefs about the data generating process before you dive in.
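
The danger of unguided specification search can be made concrete with a small, entirely hypothetical simulation: regress pure noise on many unrelated candidate regressors and keep whichever fits best. Something will usually look "significant" even though nothing is there, which is the case for writing down your theory before you estimate.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n, k = 100, 40
    y = rng.normal(size=n)                # outcome is pure noise
    candidates = rng.normal(size=(n, k))  # 40 regressors unrelated to y

    # Try each candidate one at a time and keep the smallest slope p-value
    best_p = min(
        sm.OLS(y, sm.add_constant(candidates[:, j])).fit().pvalues[1]
        for j in range(k)
    )
    print(f"smallest p-value across {k} specifications: {best_p:.3f}")
    # With 40 tries at the 5% level, a "significant" regressor is likely
    # to turn up even though none has any real relationship to y.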

Arthur Eckart

Some econometricians (fortunately few) choose variables to fit their theories rather than testing an unbiased model and letting new theories emerge from the data. Also, I was taught that a scattergram (or other graph) is part of the methodology. I agree with Naunihal that a sound model, i.e. one that includes the most relevant factors, is most important.

Arthur Eckart

Sound model, i.e. one with the most relevant factors, tested for serial correlation, heteroskedasticity, omitted variable bias, etc.
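
A hedged sketch of what those checks might look like in practice, using standard statsmodels diagnostics on simulated data: Durbin-Watson for first-order serial correlation (only meaningful if the observations are time-ordered), Breusch-Pagan for heteroskedasticity, and Ramsey's RESET as a rough proxy for omitted nonlinear terms. The data and model are illustrative only.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.stattools import durbin_watson
    from statsmodels.stats.diagnostic import het_breuschpagan, linear_reset

    rng = np.random.default_rng(2)
    n = 120
    x = rng.normal(size=n)
    y = 1.0 + 0.7 * x + rng.normal(size=n)

    X = sm.add_constant(x)
    res = sm.OLS(y, X).fit()

    # Serial correlation: values near 2 suggest no first-order autocorrelation
    print("Durbin-Watson:", round(durbin_watson(res.resid), 3))

    # Heteroskedasticity: small p-values suggest non-constant error variance
    lm_stat, lm_pval, _, _ = het_breuschpagan(res.resid, X)
    print("Breusch-Pagan p-value:", round(lm_pval, 3))

    # Functional form / omitted nonlinear terms: small p-values suggest
    # the specification is missing something
    reset = linear_reset(res, power=2, use_f=True)
    print("RESET p-value:", round(float(reset.pvalue), 3))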

The comments to this entry are closed.

