06-29-2016, 10:50 PM
#1
Grinding at a Poker Table
Join Date: Nov 2007
Posts: 4,902
Potential Application for Heavy-Duty Formulator User
I am a very heavy user of the DRF Formulator Past Performance racing data, which I download via the Export Center. For handicapping analysis and wagering opportunities I download, daily, virtually every track in the US and Canada running thoroughbred races, and my data source of choice for roughly the past 8-10 years has been the DRF Formulator files.
A topic I would like to discuss with fellow hard-core Formulator users is that of automation. Specifically, have any of you personally developed, or had developed for you, a program script which accesses the DRF website and downloads the desired tracks at a pre-set time each and every day, then places those files in a pre-defined directory?
My current process involves a significant number of manual clicks, and it obviously has to be repeated every day. Having an automated process would be a very nice improvement.
Hard-core Formulator users and/or program developers please feel free to contact me via private message.
Thanks for any help and/or recommendations!
Chris
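For anyone tackling this, here is a minimal sketch of what a scheduled daily pull might look like in Python. Be aware that the export URL, its query parameters, and the file-naming scheme are all assumptions for illustration; the real DRF site almost certainly requires an authenticated session, so treat this as a skeleton only.

```python
# Hypothetical sketch of an automated daily Formulator download.
# The EXPORT_URL endpoint and its parameters are assumptions -- the
# real DRF site likely needs login cookies and a different URL shape.
import datetime
import os
import urllib.request

EXPORT_URL = "https://formulator.drf.com/export"   # hypothetical endpoint
DOWNLOAD_DIR = os.path.expanduser("~/drf_downloads")
TRACKS = ["SAR", "BEL", "CD"]                      # edit to taste

def target_path(track: str, day: datetime.date) -> str:
    """Build the local file name for one track/date, e.g. SAR_2016-07-07.csv."""
    return os.path.join(DOWNLOAD_DIR, f"{track}_{day.isoformat()}.csv")

def download_all(day=None):
    """Fetch each selected track's export file into DOWNLOAD_DIR."""
    day = day or datetime.date.today()
    os.makedirs(DOWNLOAD_DIR, exist_ok=True)
    for track in TRACKS:
        url = f"{EXPORT_URL}?track={track}&date={day.isoformat()}"
        urllib.request.urlretrieve(url, target_path(track, day))

# download_all()  # uncomment once the real endpoint/session details are filled in
```

A script like this could then be run at a fixed time each day via Windows Task Scheduler or cron, which covers the "pre-set time each day" part of the request.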
06-30-2016, 11:16 AM
#2
Registered User
Join Date: Mar 2005
Location: Queens, NY
Posts: 20,610
1. I download the tracks I am interested in manually (usually daily) after I've verified that the final versions are available. They go into a designated folder. There are several files per track per day. (Estimated time equals 1-2 minutes)
2. Once per week I run a script that consolidates all that info for the week. (estimated time equals 1-2 minutes)
So for example:
All the "running line data" for each track for each day becomes 1 file.
All the "horse info" for each track for that week becomes 1 file.
All the "race info" for each track for that week becomes 1 file.
3. Right after I finish Step 2, I run an Access Macro that imports all those consolidated files into the appropriate tables in my database. (estimated time equals 1 minute)
4. I manually look at 1 or 2 of the tables just to make sure all the data is there (this is me being neurotic) (estimated time equals 1 minute)
That's not a lot of work per week to keep my database up to date.
The result charts take more time because my process is slightly less automated at this stage. I'd estimate I spend 10-15 minutes per week updating the results tables.
Automating much of this was the smartest thing I did. That leaves me plenty of time to write queries against that data that shorten the handicapping process, calculate ratings that can be back and forward tested, and answer handicapping questions I might have that are not publicly available.
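Step 2's weekly consolidation could be sketched like this in Python. The directory layout, category names, and file-name pattern are assumptions, not the actual Formulator file names, but the concatenate-per-category idea is the same:

```python
# Rough sketch of the weekly consolidation step: concatenate each day's
# files for the week into one file per category. The category names and
# the "<track>_<category>_<date>.csv" naming pattern are hypothetical.
import glob
import os

SOURCE_DIR = "downloads"   # where the daily files land
WEEKLY_DIR = "weekly"      # where the consolidated files go
CATEGORIES = ["running_line", "horse_info", "race_info"]  # assumed names

def consolidate(week_label: str) -> list:
    """Merge all daily files for each category into one weekly file.

    Returns the list of weekly file paths, one per category.
    """
    os.makedirs(WEEKLY_DIR, exist_ok=True)
    written = []
    for cat in CATEGORIES:
        out_path = os.path.join(WEEKLY_DIR, f"{cat}_{week_label}.csv")
        with open(out_path, "w") as out:
            # sorted() keeps the days in chronological order
            for path in sorted(glob.glob(os.path.join(SOURCE_DIR, f"*_{cat}_*.csv"))):
                with open(path) as f:
                    out.write(f.read())
        written.append(out_path)
    return written
```

From there, the weekly files are exactly what step 3's Access macro would import into the database tables.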
__________________
"Unlearning is the highest form of learning"
Last edited by classhandicapper; 06-30-2016 at 11:20 AM.
06-30-2016, 01:39 PM
#3
Grinding at a Poker Table
Join Date: Nov 2007
Posts: 4,902
Quote:
Originally Posted by classhandicapper
1. I download the tracks I am interested in manually (usually daily) after I've verified that the final versions are available. They go into a designated folder. There are several files per track per day. (Estimated time equals 1-2 minutes)
2. Once per week I run a script that consolidates all that info for the week. (estimated time equals 1-2 minutes)
So for example:
All the "running line data" for each track for each day becomes 1 file.
All the "horse info" for each track for that week becomes 1 file.
All the "race info" for each track for that week becomes 1 file.
3. Right after I finish Step 2, I run an Access Macro that imports all those consolidated files into the appropriate tables in my database. (estimated time equals 1 minute)
4. I manually look at 1 or 2 of the tables just to make sure all the data is there (this is me being neurotic) (estimated time equals 1 minute)
That's not a lot of work per week to keep my database up to date.
The result charts take more time because my process is slightly less automated at this stage. I'd estimate I spend 10-15 minutes per week updating the results tables.
Automating much of this was the smartest thing I did. That leaves me plenty of time to write queries against that data that shorten the handicapping process, calculate ratings that can be back and forward tested, and answer handicapping questions I might have that are not publicly available.
It is step 1 above that I am currently seeking automation for. It does not take a lot of time to do manually, but as a repetitive task it seems a natural candidate for automation. I already do a number of other "more-manual" tasks outside the downloading process, so any time I can automate something it simplifies my racing activities.
When downloading (perhaps also known as Exporting), I first check the top box so that all the tracks for the specified date are selected, then manually uncheck the smallest tracks and those with split meets that happen to be running only Quarter Horse races. I am sure part of the automated application would/should contain a desired-track table which could be updated on the fly to include/exclude a track when running TB or QH races. Another complication in the application programming could be how to handle "early" vs. "final" versions of the PP file.
After downloading I of course already have "other" processes to read the data, do the calculations and analysis, then store the info in a database.
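The "desired-track table" idea above could be as simple as a small lookup that the download script consults before selecting anything. A sketch in Python, with made-up track codes and breed flags purely for illustration:

```python
# Illustrative "desired track table": map each track code to whether it
# should be downloaded and what breed its current meet is running.
# Codes and flags here are examples, not real DRF data.
TRACK_TABLE = {
    "SAR": {"download": True,  "breed": "TB"},   # thoroughbred meet
    "BEL": {"download": True,  "breed": "TB"},
    "LAD": {"download": False, "breed": "QH"},   # exclude QH-only split meet
}

def tracks_to_download(table=TRACK_TABLE):
    """Return the track codes to select: flagged for download and running TB."""
    return [code for code, info in table.items()
            if info["download"] and info["breed"] == "TB"]
```

Flipping a track's `breed` or `download` entry when a meet switches over handles the include/exclude-on-the-fly requirement without touching the rest of the script.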
Last edited by Track Collector; 06-30-2016 at 01:41 PM.
06-30-2016, 03:10 PM
#4
Registered User
Join Date: Mar 2005
Location: Queens, NY
Posts: 20,610
Quote:
Originally Posted by Track Collector
It is step 1 above that I am currently seeking automation for. It does not take a lot of time to do it manually, but being a somewhat repetitive task, it would seem logical that it would lend itself to some type of automation. I already do a number of other "more-manual" tasks outside the downloading process, so any time I can automate something it simplifies my racing activities.
I hear you. I haven't put any energy into automating the download process because it's not very time consuming. I wouldn't know where to begin. Hopefully someone else can help. If you find an approach, let us know. Good luck.
Last edited by classhandicapper; 06-30-2016 at 03:12 PM.
07-07-2016, 04:19 PM
#5
Registered User
Join Date: Jun 2001
Location: Little Rock AR
Posts: 52
Quote:
Originally Posted by classhandicapper
1. I download the tracks I am interested in manually (usually daily) after I've verified that the final versions are available. They go into a designated folder. There are several files per track per day. (Estimated time equals 1-2 minutes)
2. Once per week I run a script that consolidates all that info for the week. (estimated time equals 1-2 minutes)
So for example:
All the "running line data" for each track for each day becomes 1 file.
All the "horse info" for each track for that week becomes 1 file.
All the "race info" for each track for that week becomes 1 file.
3. Right after I finish Step 2, I run an Access Macro that imports all those consolidated files into the appropriate tables in my database. (estimated time equals 1 minute)
4. I manually look at 1 or 2 of the tables just to make sure all the data is there (this is me being neurotic) (estimated time equals 1 minute)
That's not a lot of work per week to keep my database up to date.
The result charts take more time because my process is slightly less automated at this stage. I'd estimate I spend 10-15 minutes per week updating the results tables.
Automating much of this was the smartest thing I did. That leaves me plenty of time to write queries against that data that shorten the handicapping process, calculate ratings that can be back and forward tested, and answer handicapping questions I might have that are not publicly available.
I'm just lately considering using Formulator for the 2016 Saratoga meet. Interesting that you guys take all this DRF data and manipulate it further.
If you don't mind sharing -- are you analyzing comments, hot (or slow) paced races, etc.?
07-07-2016, 04:36 PM
#6
Registered User
Join Date: Mar 2005
Location: Queens, NY
Posts: 20,610
Once you get the hang of building a database and writing queries, it's easy to produce reports, statistics, and personal ratings that aren't available publicly and that would take a ton of time to do manually. If you take it further, you can also test various approaches against the results to see what works best in different types of races. What I have at my disposal now is mind boggling to me compared to just a couple of years ago. Questions that made my head spin for decades have been asked and answered. I don't have to learn by trial and error anymore and I can create the tools I always wished for by myself.
Yes, I have a very elaborate class and pace system that I am still enhancing.
I got a lot of help from people in this forum when I got started (sjk, MJC922, CJ, Raybo and others). I owe them all a drink.
This is the thread.
http://www.paceadvantage.com/forum/s...3&page=1&pp=15
Last edited by classhandicapper; 07-07-2016 at 04:42 PM.
07-07-2016, 06:01 PM
#7
Registered User
Join Date: Jun 2001
Location: Little Rock AR
Posts: 52
Quote:
Originally Posted by classhandicapper
Once you get the hang of building a database and writing queries, it's easy to produce reports, statistics, and personal ratings that aren't available publicly and that would take a ton of time to do manually. If you take it further, you can also test various approaches against the results to see what works best in different types of races. What I have at my disposal now is mind boggling to me compared to just a couple of years ago. Questions that made my head spin for decades have been asked and answered. I don't have to learn by trial and error anymore and I can create the tools I always wished for by myself.
Yes, I have a very elaborate class and pace system that I am still enhancing.
I got a lot of help from people in this forum when I got started (sjk, MJC922, CJ, Raybo and others). I owe them all a drink.
This is the thread.
http://www.paceadvantage.com/forum/s...3&page=1&pp=15
Thanks for the feedback. I'm an old man whose time has passed, but I still love horse racing and admire those who are talented enough to take advantage of the technology that is available and continues to evolve.
Off to check out the link you provided :-)
07-07-2016, 07:03 PM
#8
Registered User
Join Date: Mar 2005
Location: Queens, NY
Posts: 20,610
Quote:
Originally Posted by Equifan
Thanks for the feedback. I'm an old man who's time has passed but I still love horse racing and admire those that are talented enough to take advantage of the technology that is available and continues to evolve.
Off to check out the link you provided :-)
Regardless of whether you use the export feature, you should still give Formulator a try. You can learn the basics of the PPs and tools quickly.
07-07-2016, 07:38 PM
#9
Registered User
Join Date: Oct 2012
Posts: 441
Quote:
Originally Posted by Track Collector
A topic I would like to discuss with fellow hard-core Formulator users is that of automation. Specifically, have any of you personally developed, or had developed for you, a program script which accesses the DRF website and downloads the desired tracks at a pre-set time each and every day, then places those files in a pre-defined directory?
My current process involves a significant number of manual clicks, and it obviously has to be repeated every day. Having an automated process would be a very nice improvement.
Here is a scripting utility which may help you:
https://autohotkey.com/
It's free, relatively easy to use, and immensely useful.