Big data has the potential to alter the calculations by which data management groups buy, manage and structure information storage, but the search for the perfect metric could lead many organizations astray, according to analytics experts.
Many organizations, Russom said, may be cautious about moving forward on this because dollar-per-TB metrics are still “fuzzy.”
However, searching for the perfect metric will only bog down projects, according to Russom.
In reality, he said, most metrics are fuzzy anyway. “I think that in this case, it’s better to have a metric even if it’s fuzzy – or less than optimal – than not to have a metric of any kind at all.”
At a recent TDWI World Conference, Ken Rubin, head of analytics for Facebook, noted that many organizations become so obsessed with developing the perfect metric that metrics don’t get used as effectively as they should.
While analysts cannot always do a “perfectly statistically controlled A/B test…there’s always some way to figure out how we’ve improved versus historical trends,” he said. “Let’s use our metrics to narrow it down” to make a decision.
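To make Rubin’s point concrete, here is a minimal sketch, in Python with made-up figures, of comparing a metric’s current value against its historical baseline when a controlled A/B test is not an option; the function name and numbers are illustrative assumptions, not anything Rubin prescribed.

```python
# A minimal sketch of the fallback Rubin describes: when a controlled
# A/B test isn't possible, compare a current metric against its
# historical trend to narrow down a decision. Figures are hypothetical.
from statistics import mean, stdev

def improvement_vs_history(history: list[float], current: float) -> dict:
    """Compare the current value of a metric against its historical baseline."""
    baseline = mean(history)
    spread = stdev(history) if len(history) > 1 else 0.0
    return {
        "baseline": baseline,
        "current": current,
        "lift_pct": 100.0 * (current - baseline) / baseline,
        # Rough signal strength: how many historical standard deviations
        # the change represents (not a substitute for a controlled test).
        "z_like_score": (current - baseline) / spread if spread else float("nan"),
    }

# Example: weekly conversion rate before and after a change (made-up numbers).
print(improvement_vs_history([0.041, 0.043, 0.040, 0.042], 0.047))
```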
There likely isn’t a golden algorithm, according to Russom. “If we learn nothing from business intelligence and data warehousing, it’s that every organization is very different in the collection of sources they have, in-house skills, deployed platforms…and so on.”
He said organizations should take dollars-per-TB as a “bare-bones metric” and create their own algorithms out of it.
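As an illustration of that idea, the following Python sketch treats dollars-per-TB as the bare-bones starting point and layers one hypothetical organization-specific factor (administrative labor) on top; the function names, weights and figures are assumptions for demonstration, not a formula from Russom.

```python
# A hedged sketch of starting from dollars-per-TB as the "bare-bones metric"
# and extending it with organization-specific factors. The extension shown
# here (operational labor) and all numbers are illustrative assumptions.

def dollars_per_tb(total_cost_usd: float, usable_capacity_tb: float) -> float:
    """Bare-bones metric: total storage cost divided by usable capacity."""
    return total_cost_usd / usable_capacity_tb

def adjusted_cost_per_tb(total_cost_usd: float,
                         usable_capacity_tb: float,
                         admin_hours_per_tb: float,
                         hourly_admin_rate_usd: float) -> float:
    """One possible org-specific extension: fold in administrative labor,
    reflecting in-house skills and deployed platforms."""
    base = dollars_per_tb(total_cost_usd, usable_capacity_tb)
    return base + admin_hours_per_tb * hourly_admin_rate_usd

# Example with made-up numbers: $500,000 of storage, 800 usable TB,
# 1.5 admin hours per TB per year at $60/hour.
print(dollars_per_tb(500_000, 800))                  # 625.0
print(adjusted_cost_per_tb(500_000, 800, 1.5, 60))   # 715.0
```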