Yeah, I see in that thread there were some questions about making the query dynamic. The best way to go down that path is to start recording a macro, run through all of the steps I gave, then click 'stop recording' and open the macro in 'edit' mode to see the VBA.
In the generated code you will see the URL with the date hard-coded, something like this:
With ActiveSheet.QueryTables.Add(Connection:= _
    "URL;http://www.racingpost.com/horses2/results/home.sd?r_date=2016-01-14"
Wherever the date is located, you will want to inject a new date string in the proper format instead, such as:
With ActiveSheet.QueryTables.Add(Connection:= _
    "URL;http://www.racingpost.com/horses2/results/home.sd?r_date=" & MyNewDate
MyNewDate could be a string returned from a custom function you write yourself. Inside that function you would use the Date function (in VBA this returns today's date) minus a parameter you feed it to subtract x number of days; the 'x' would best be supplied by a counter inside the loop, such as d = d + 1. Within the function you would then build the date string in the proper yyyy-mm-dd format.
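A minimal sketch of such a helper, assuming you pass the loop counter in as a daysBack parameter (the name is just a placeholder):

```vba
' Returns today's date minus daysBack, formatted the way the URL expects
Function MyNewDate(daysBack As Long) As String
    ' Date returns today's date; subtracting daysBack steps backwards in time
    ' Format builds the yyyy-mm-dd string used in the query string
    MyNewDate = Format(Date - daysBack, "yyyy-mm-dd")
End Function
```

Calling MyNewDate(0) gives today, MyNewDate(1) gives yesterday, and so on.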
So for best results this macro would be configured to loop: have it pull in the data starting from today's date, save it to another sheet or a CSV, then increment the date back to yesterday using the custom function described, and so on, going back one day at a time until you have a year's worth of data or whatever you need. Put a timer in there so you don't kill the server with a ton of requests all at once; waiting 10 seconds in between asking for each day is reasonable.
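Putting it together, a rough sketch of that loop (the sub name, destination range, and 364-day count are my assumptions, and the save-to-CSV step is left as a comment):

```vba
Sub PullYearOfResults()
    Dim d As Long
    Dim dateStr As String
    For d = 0 To 364                              ' today back through roughly one year
        dateStr = Format(Date - d, "yyyy-mm-dd")  ' build the date string for this pass
        With ActiveSheet.QueryTables.Add(Connection:= _
            "URL;http://www.racingpost.com/horses2/results/home.sd?r_date=" & dateStr, _
            Destination:=Range("A1"))
            .Refresh BackgroundQuery:=False       ' pull the page in before continuing
        End With
        ' ... copy the results to another sheet or save them as a CSV here ...
        Application.Wait Now + TimeSerial(0, 0, 10)  ' pause 10 seconds between requests
    Next d
End Sub
```

The Application.Wait line is what keeps you from hammering the server with all the requests at once.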
In my experience, scraping a third-party site like this is something I would only look at as a way to get a quick archive of historical data for research purposes, and not much more. That is, I wouldn't recommend putting a lot of time into coding around scraping a third-party site on a daily basis as a long-term solution. Third-party sites eventually change the format of their web pages, and suddenly everything stops working for you. Even if the layout shifts just one cell over, the data you pulled into cell C20 today could be in G28 tomorrow, which means modifying all of the code you use to handle the data in various ways. So it's not a sustainable solution.
Following the research, if you want something sustainable, then I would look at coding against Trackmaster's (or another major provider's) CSV result charts, which aren't going to be free but whose format will remain stable.
I think it's really unfortunate people have to resort to scraping in the first place. Racing should hand out a database with a year's worth of data for research purposes. We have all of these people looking at fantasy sports; many would love to take a look at this sport in a serious way. I don't expect a 'current' db to be handed out every day unless people are paying for it, but all of the data from, let's say, 2013 to 2014 should be freely available IMO.