Normalizing times


Njalle
03-11-2010, 09:10 AM
Hi all - I'm new to this forum and I'm not sure if this is the right place to post. I've looked around the forum but haven't been able to find an answer to this question; if one exists, please point me in the right direction :)

I'm currently working on my Bachelor's thesis in Economics, where I investigate the market for racetrack betting in Denmark. I've collected data for all races in Denmark from 2003-2009 and will use 2009 as the holdout sample. I'm working on a multinomial (possibly binomial) logit model like those of Bolton and Chapman, and of Benter, as described in Efficiency of Racetrack Betting Markets.

Bolton and Chapman normalize the race times by deducting points for each second a horse finishes behind the track record, which is a good way to normalize across distances and tracks. However, no track records exist in Denmark, since all races are run on grass. I've been thinking about how I could normalize the race times instead, but I haven't found a good solution, which is why I come to you. :bang: My only idea would be some sort of interaction between variables, but it just seems... unsatisfactory, and I'm not even sure how to model it properly.
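
Just to make the setup concrete, here is a minimal Python sketch of the kind of normalization Bolton and Chapman describe, assuming a track record were available. The numbers and the column names (finish_time, track_record) are made up purely for illustration:

import pandas as pd

# Hypothetical example: finish times and the track record, all in seconds.
races = pd.DataFrame({
    "horse":        ["A", "B", "C"],
    "finish_time":  [75.2, 76.8, 78.1],  # each horse's time over the distance
    "track_record": [74.0, 74.0, 74.0],  # record for the same track and distance
})

# Seconds behind the record; points would be deducted in proportion to this.
races["seconds_behind_record"] = races["finish_time"] - races["track_record"]
print(races)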

Basically, the issue is that times differ between tracks (in fact, one of the tracks, Odense, usually has race times faster than Dubai), and that grass is very sensitive to track conditions (so horses run slower when the track is wet) - and tactics also play a role. As you probably know, a very good field can run very slowly because the horses are waiting for some sort of attack, which in turn can lead to slower race times than a bad field running without tactics.
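
To show roughly what I mean by the interaction idea, here is a purely illustrative Python sketch (tracks, goings, times and column names are all invented; I'm not claiming this solves the problem) that normalizes each time within its own track / distance / condition group, so a horse is only compared to races run under comparable conditions:

import pandas as pd

# Hypothetical race data; all values are invented for the example.
races = pd.DataFrame({
    "track":       ["Odense", "Odense", "Klampenborg", "Klampenborg"],
    "going":       ["firm", "firm", "wet", "wet"],
    "distance_m":  [1600, 1600, 1600, 1600],
    "finish_time": [96.0, 97.4, 100.2, 101.7],  # seconds
})

# Standardize each finish time within its track / distance / going group.
grp = races.groupby(["track", "distance_m", "going"])["finish_time"]
races["time_z"] = (races["finish_time"] - grp.transform("mean")) / grp.transform("std")
print(races)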

I'd be extremely happy if you guys can help me out! :jump: