The heartbeat of my personal handicapping tools (a work in progress) is knowing the average times (final times and 2nd call times) for each distance at the track(s) in play. I also need the ability to regenerate that data from time to time so my system uses figures that are as up-to-date as possible, especially as the seasons shift and the figures begin to move up or down accordingly. The standard 'Equalization charts', freely available, simply aren't good enough IMHO.
With that in mind, I designed a small program that will batch-parse single-file-format data files in the DRF format (and probably the EG and MCP formats, although those are untested) and create reports of the average final time (and, as a bonus, the average 2nd call time) for each distance at each track found in the files.
My test case was ~80 DRF files yielding ~44,000 races, of which ~10,000 were unique (and thus eligible for averaging), run at dozens of tracks.
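For anyone curious how the "unique races" filtering might work, here is a minimal sketch in Python. It is not the program's actual code (TASK.exe is a Win32 executable and its internals aren't shown here); the field names and the choice of (track, date, race number) as the identifying key are assumptions.

```python
# Sketch: keep only the first occurrence of each race so a race that
# appears in several data files is averaged once, not once per file.
# The key fields (track, date, race_number) are illustrative assumptions.

def unique_races(races):
    seen = set()
    result = []
    for race in races:
        key = (race["track"], race["date"], race["race_number"])
        if key not in seen:
            seen.add(key)
            result.append(race)
    return result
```

With ~44,000 parsed races in, a pass like this would leave the ~10,000 distinct ones eligible for averaging.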
I have the command-line version finished and plan on stuffing it into a GUI as part of my toolkit (another work in progress), where the user will be able to select tracks to consider, tracks to skip, and so forth. For now it's an all-or-nothing program, which is probably overkill (yet fun nonetheless). The user can simply cut and paste the results from the tracks of interest and toss the rest.
I don't have a clue whether this program would be useful to anyone, but for those who would like to try it (or use it), I offer it free at this link ...
http://members.cox.net/thewinnerscode/task.html
In a nutshell
=============
1) The program name is TASK.exe (Track Average Speed Kalculator)
2) It is a win32 program
3) After downloading you can place task.exe in any folder and execute the program by typing at the prompt 'task <Enter>' (without the quotes -- which you already knew)
4) In the same folder as task.exe, create a new sub-folder named 'datafiles' (again no quotes)
5) In the datafiles folder, place your single-format files (DRF, EG, MCP) ... there is no limit to how many, but keep in mind you should start with a reasonable amount to get a feel for how long processing will take. My test case mentioned above (80 files) processed in ~10 seconds on my 3-GHz Core 2 Duo with 4 GB of RAM running XP ... but the processing time is not linear: all the heavy lifting is done with in-memory data structures, and they grow quickly as the number of files increases.
6) Hunt races, steeplechase races, and 'about' distances are not considered.
7) Only Fast tracks and Firm tracks are considered ... no off-tracks are allowed to pass the filters.
8) Once everything is in place, execute the program by typing at the prompt 'task <Enter>' (again no quotes).
9) You will see some screen splashes to let you know something is happening.
10) The program creates 3 files: a task_rpt (by date) text file, which is just that ... a formatted text report; a CSV file for opening in Excel; and a RAW file, a text file showing the raw data of the unique races in the format ... track, race number, date, surface, distance, 2nd call, final call. The RAW file is intended for those who wish to check the accuracy of any of the averaging. Probably not too many takers on this one.
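Since the RAW file's layout is spelled out (track, race number, date, surface, distance, 2nd call, final call), the averages can be recomputed independently from it. Here's a rough Python sketch of that check; it assumes comma-separated fields and times expressed as plain seconds, which may not match the actual file's delimiter or time notation.

```python
import csv
from collections import defaultdict

def average_times(raw_path):
    """Recompute average 2nd-call and final times per (track, surface, distance)
    from a RAW file laid out as: track, race number, date, surface, distance,
    2nd call, final call. Comma delimiter and seconds-as-floats are assumptions."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])  # key -> [sum_2nd, sum_final, count]
    with open(raw_path, newline="") as f:
        for track, race_no, date, surface, distance, call2, final in csv.reader(f):
            entry = sums[(track, surface, distance)]
            entry[0] += float(call2)
            entry[1] += float(final)
            entry[2] += 1
    # Return {(track, surface, distance): (avg_2nd_call, avg_final)}
    return {key: (s2 / n, fin / n) for key, (s2, fin, n) in sums.items()}
```

Comparing this script's output against the task_rpt figures would be one quick way to act as that "2nd set of eyes".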
I have taken care to clean up the known anomalies of the single-file formats, but being human, I offer a caveat emptor ... there is no way for me to check the accuracy of every entry, so verify the figures (as best you can) before using them in your own program. I believe they are accurate, but a 2nd set of eyes never hurts.
Your comments are welcome.
Thanks