7 sure-fire ways to fail at data analytics

Bob Violino

“The right people are passionate about using data to answer questions and then are willing to constantly question their findings to make sure the data is not just fitting a narrative but can explain what we are seeing and help predict where we are going,” Eaton says. “It is important that everyone knows what we are trying to find with the data and our overall goals, and to collect consistent measurements and data.”

A sure recipe for failure is lacking focus when launching an analytics effort. “Data teams will be most successful when they are focused on a prioritized set of outcomes,” says Christina Clark, chief data officer at multinational conglomerate GE. “Often teams will fail because they are expected to address too many business demands at once, ultimately being stretched too thin and not making meaningful impact to maintain interest or funding.”


2. Build (and maintain) your own infrastructure

There might be a strong temptation to build and maintain your own big data infrastructure. But that could jeopardize the mission of your analytics efforts.

“This generally wastes a lot of data scientist time on tasks other than actually developing better analytics,” says Oliver Tavakoli, CTO at cyber security company Vectra.

“We knew we wanted a lot of data to base our analytics on,” Tavakoli says. “We started by doing what everyone tells you to do: We bought a bunch of servers with lots of disk capacity, we put them in our co-location facility, we created our own Hadoop cluster on top with Apache Spark and had our data scientists write Scala code to interact with the cluster.”

The cluster would break, sometimes because of hardware failures but more often because of software failures. Software packages would fall out of date, and at times the cluster was unavailable for hours.

“We finally had enough and decided to outsource this part of the problem,” Tavakoli says. Vectra went with an outside provider and has since spent little time “on the nuts-and-bolts issues, and almost all of our time has been dedicated to feeding data into the system and analyzing the data in it,” he says.


3. Be a data divider, not a data unifier

Enterprises have long struggled with the problem of “data silos” that prevent different departments from sharing information in ways that could benefit the organization overall. The same challenge applies to analytics.

A best practice is to unify disparate data, says Jeffry Nimeroff, CIO at Zeta Global, a customer lifecycle management marketing company.

“Every data silo creates a barrier between interconnections that can yield value,” Nimeroff says. “For example, think about a rich user profile either connected or disconnected from website activity data. The more data that can be interconnected the better, as those interconnections are where predictive power can be found.”
