Quote:
Originally Posted by ReplayRandall
Here's an example of using big data sets at a slightly losing ROI of 93-95%. If this specific data set has consistently shown these numbers for the last 3 years, I look to see how they are doing after 100 plays. If they are severely under-performing, say at a 60% rate of return, I will have the confidence based on the data to bet these specific sets HARD, until they return close to their mean performance, like an under-valued stock that has a great balance sheet, good fundamentals, product line, but for some unknown reason has fallen out of favor with the market crowd......A contrarian concept using data which most operators throw away for lack of a +ROI, but is consistent at 93-95% as they come...$$$
Great! This is the way to go. Still, this approach has nothing to do with big data, which is the theme of this thread. "Big data" is not (necessarily) about the absolute size of the data but about the processing methodology.
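For what it's worth, the contrarian trigger described in the quoted post can be sketched in a few lines. This is only an illustration of the stated rule, not ReplayRandall's actual system; the function name and the exact thresholds (94% baseline ROI, 60% trigger, 100-play minimum) are assumptions pulled from the numbers in the quote.

```python
def should_bet_hard(total_returned, total_staked, plays,
                    baseline_roi=0.94, trigger_roi=0.60, min_plays=100):
    """Contrarian trigger sketched from the quoted post (hypothetical):
    a data set that historically returns ~93-95% ROI but is severely
    under-performing (~60%) after 100+ plays is treated as 'undervalued'
    and bet hard until it reverts toward its historical mean."""
    if plays < min_plays:
        return False  # too few plays to call it under-performance
    roi = total_returned / total_staked  # realized return per unit staked
    # fire only while realized ROI sits at or below the trigger,
    # which itself must sit below the long-run baseline
    return roi <= trigger_roi < baseline_roi

# e.g. 100 plays, 60 units back on 100 staked -> bet hard
print(should_bet_hard(60, 100, 100))   # True
# performing at its historical mean -> no edge claimed
print(should_bet_hard(94, 100, 100))   # False
```

Whether such a set actually reverts to its mean is, of course, the whole gamble; the code only encodes the decision rule, not its validity.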