For over a decade, data has been at or near the top of the enterprise agenda. A robust ecosystem has emerged around all aspects of data (collection, management, storage, exploitation and disposition). And yet in my discussions with Global 2000 executives, I find many are dissatisfied with their data investments and capabilities. This is not a technology problem. This is not a technique problem. This is a people problem.
Those enamored of data often want to eliminate the human from the equation, but it can’t be done. And so, as climate science considers the impact of man on the environment, data science must wrestle with the inverse: the impact of data on man.
Tableau is making a big change in the way it sells its business intelligence products. The company announced Thursday that all of its software will be available as a subscription, rather than a single license plus a service fee.
Businesses will need to pay $70 per user per month for a license of Tableau Desktop Professional, and $35 per user per month for Tableau Server. That compares to the company’s boxed software prices of $2000 for Desktop, plus a $400 annual renewal fee for software updates, and $800 for Server, plus a $200 annual fee.
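Using the article's figures, a quick cumulative-cost comparison shows roughly where the subscription overtakes the old perpetual license for Desktop. This sketch assumes the $400 renewal is charged each year after the first, which the article does not spell out; real pricing terms may differ.

```python
# Illustrative cost comparison of Tableau Desktop licensing models,
# using the article's figures ($70/user/month subscription vs. $2000
# up front plus a $400 annual renewal for updates).

def subscription_cost(months):
    """Desktop Professional at $70 per user per month."""
    return 70 * months

def perpetual_cost(months):
    """$2000 up front; assume the $400 renewal applies after year one."""
    years = months // 12
    return 2000 + 400 * max(0, years - 1)

for years in (1, 2, 3, 5):
    m = years * 12
    print(f"year {years}: subscription ${subscription_cost(m)}, "
          f"perpetual ${perpetual_cost(m)}")
```

Under these assumptions the subscription stays cheaper for the first few years and crosses over around year four, which is consistent with the "flexibility and risk mitigation" framing rather than a pure price cut.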
It’s a move that will provide additional flexibility, scalability and risk mitigation for Tableau customers, according to Francois Ajenstat, the company’s chief product officer.
Increasingly, we live in a sensor-laden world. Almost every electronic device in our lives has some kind of sensor in it, possibly many. We don’t tend to think of sensors when looking at our phones, refrigerators, cars, airplanes and buildings, but believe me, they’re in there.
And increasingly, we’re going to be aware of them. Because they’re going to take on forms — and be embedded in places — that we haven’t experienced before. And it’s going to be remarkable.
Your home as a sensor suite
Even if it’s not leaping out at you upon walking through the front door, your domicile is already armed with sensors. Appliances of all sizes and types, televisions, video game consoles, home computers and more operate using sensors of many kinds.
In a previous post I discussed the promising applications for deep learning in the enterprise. The greatest potential for deep learning is in adding business-relevant structure to less-structured, sense-like data — such as images, audio and other sensor data.
How quickly does the tone and affect of a support call from a frustrated customer change, broken down by support rep? It’s that time-to-mollification that matters to your business, not the raw sound data.
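A "time to mollification" metric like this can be computed once a model has turned the raw audio into per-interval sentiment scores. The sketch below is a toy illustration with invented scores, not a real audio pipeline: it finds the first moment after which sentiment stays at or above a calm threshold.

```python
# Toy "time to mollification": given per-second sentiment scores for a
# support call (-1 = angry, +1 = calm), find the first second after which
# sentiment stays at or above a calm threshold. The scores are invented;
# in practice they would come from an audio-sentiment model.

def time_to_mollification(scores, threshold=0.2):
    for t in range(len(scores)):
        if all(s >= threshold for s in scores[t:]):
            return t  # seconds into the call
    return None  # the caller was never mollified

call = [-0.8, -0.6, -0.5, -0.1, 0.1, 0.3, 0.4, 0.5, 0.6, 0.6]
print(time_to_mollification(call))  # → 5
```

The point is the one the article makes: the business-relevant number is the 5, not the raw waveform the deep net consumed to produce the per-second scores.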
Generally when training machine learning algorithms (and deep nets are an extreme example of this), the more data the better. There’s a persistent danger of “overfitting” your data — performing very well on the training set, but poorly on new data. If the algorithm has overfit, it has failed to generalize and is thus not that useful.
The fact that Microsoft was engaged in a bidding war against Salesforce for the network helped in that regard, but most people took the perspective that the price tag was an indication of the value that Microsoft and others saw in LinkedIn’s treasure trove of data. Much like IBM’s acquisition of The Weather Company, this deal was about tuning algorithms by using the vast amount of data available.
Ted Friedman, vice president and analyst at Gartner, predicts the following three trends will drive fundamental changes in the use of data and analytics:
Instead of just reflecting business performance, data analytics will become the driver of operations.
Data and analytics will become infused in an organization’s architecture from end to end, creating a holistic approach — and this will include strategic project management in EPMOs (enterprise program management offices).
Executives will use data and analytics for business strategy and growth, creating new additional roles for professionals.
Experts share insights on how data improves project performance
Companies of all sizes have been using data analytics to seek out opportunities, reduce costs, create efficiencies, make better and faster decisions, and ultimately increase customer satisfaction. The same benefits carry over to the project, program and portfolio levels, since these are what enable company-wide strategy.
At the Chicago Bulls, Matthew Kobe, director of analytics, says the organization’s Business Strategy and Analytics team uses consumer insights to drive its strategic direction. They use data analytics to focus on three key areas of insight — fan levels, business transactions and digital engagement — to inform the organization’s strategic choices. He shares more about their focus on the three areas below:
Fan Level Insights — The Bulls are building a robust CRM and data warehouse solution that delivers a more holistic view of their fans. “We seek to understand psychographic elements that help us to understand why a person is engaging and transacting with the Bulls,” says Matthew. They also want to “understand satisfaction and areas for improvement by capturing fan-specific feedback on all elements of the fan experience.”
Transactional Insights — The team analyzes all business transactions including ticketing, concessions and merchandise, and wherever possible, Matthew says “We tie these transactional elements back to the fan to build out a more complete customer view.”
Digital Engagement Insights — “The Bulls have a significant digital presence illustrated by the second largest social media following for any sports team in North America,” says Kobe. Because of this, they work to understand the types of content fans are engaging with and how those engagements drive their fans’ downstream behaviors. They again make every effort to link engagements back to the fan to help their continued effort to further expand their customer view.
“With these three areas under our purview, we are able to more effectively influence change across the organization. Specifically, we have impacted nearly every area that influences a fan’s experience with the Bulls: Ticketing, Sponsorship, Digital Content, Marketing, and Concessions,” he says.
Jason Levin, vice president of Institutional Research at Western Governors University (WGU), also shared how they use data analytics to create project wins. “Conceptually, the most important data for project success is having a measurement plan that includes implementation fidelity and efficacy,” he says.
He suggests answering these questions: “How do we know we are doing what we intended to do?” and “How do we know if what we did worked?” Jason elaborates further on their methods for measuring implementation fidelity and efficacy.
For implementation fidelity, WGU has used many methods, ranging from analyzing log data of student sessions with electronic learning materials to having faculty use hashtag notations in the student notes.
For efficacy, “our bias is to use randomized control trials, but we also use quasi-experimental methods. The most important data is to have a clearly defined outcome variable that can be reliably measured. Western Governors University (WGU) has a competitive advantage with outcome variables compared to traditional higher education institutions. At WGU, all our assessments are centrally developed to rigorous standards. This system of assessment produces much more reliable data than having faculty individually assigning letter grades.”
He also describes another unique aspect of data at WGU – its “domain taxonomy or hierarchy of learning outcomes mapped to learning materials and assessments. Student learning behavior can be mapped between the electronic course materials and assessment. Formative assessment data is more predictive of success on the high stakes assessment than simple pageviews.”
To make the best decisions, companies need to be able to extract precise and relevant information from the data available. Absent this, raw data, no matter the quantity, serves no purpose. Ultimately, companies are seeking the type of information that tells them what their customers want most and is critical for guidance on project initiatives, direction, execution, and metrics.
How are companies using data analytics to improve project outcomes?
No matter what the industry, from technology to sports or education, data analytics has become an essential tool for enabling successful project outcomes and ultimately company-wide strategy.
“We use data analytics to examine almost everything about our platform, including how many times our users request customer support,” says Jonathan Rodriguez, founder and CEO at BitMar Networks. “The first thing that we realized was the more solutions we offered before our users even requested them, the less our users requested customer service.”
He credits data analytics with pointing BitMar to a completely new approach to recruiting. The data told BitMar that “your users do not need your tech support; they prefer to talk to one another, instead. So, provide that functionality and let them be.” This highlighted the need for the company to hire community enthusiasts instead of customer service staff.
BitMar embarked on a project to develop a self-help platform for customers. “Who would have thought that we would have been able to provide a platform in which the users get to help themselves, at virtually zero cost on our end?” says Rodriguez. Data analytics not only helped BitMar zoom in on the types of projects they should be taking on, it also identified opportunities within projects to improve customer satisfaction while still reducing internal costs.
Jason Levin (WGU) says that “probably the most successful project to date has been the Leadership and Communication course designed to educate students along the affective domain. Using quasi-experimental methods, we demonstrated significant improvement in retention and credit accumulation. Based on that research the course was implemented in the undergraduate Health Professions programs, which now serves about 1,000 students per month.”
When it comes to the Chicago Bulls, Matthew says, using fan level and transactional insights to do an initial customer segmentation of their ticket buyers was a top priority over the last year. “We wanted to understand whether we had any vulnerabilities across segments and any gaps in our product portfolio.” Specifically, he says they identified opportunities to further develop fans who fall into the young professionals and families segments, and used fan level insights to build out personas for these segments to help each function understand how to engage them.
Further, the Bulls used these consumer insights to accomplish the following:
Identify opportunities to further develop each of the segments
As the functions built out strategic plans, the Strategy and Analytics team was able to partner with them to establish metrics to evaluate success
Develop a new charity event targeted at young professionals, and
Make modifications to ticket products with a greater emphasis on creating Bulls Snapchat content
At a tactical level: “We used a subset of fan level insights to evaluate the likelihood to buy for potential ticket buyers. We use available demographic information combined with prior purchase history and digital engagements to evaluate a customer’s purchase intent and the product that would best fit their needs,” says Matthew. Using this information, their group was able to significantly increase efficiency with sales reps and deliver the products customers desired.
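A likelihood-to-buy score of the kind described above is commonly built as a weighted combination of demographic, purchase-history and engagement features squashed to a 0–1 probability. The features, weights and fan profile below are entirely invented for illustration; this is a generic propensity-model sketch, not the Bulls' actual model.

```python
import math

# Hypothetical propensity-to-buy score: combine demographic,
# purchase-history and engagement signals into one likelihood via a
# logistic function. All feature names and weights are made up.

WEIGHTS = {
    "games_attended_last_season": 0.30,
    "merch_purchases": 0.25,
    "social_engagements_per_month": 0.05,
    "lives_within_25_miles": 1.20,
}
BIAS = -3.0

def purchase_propensity(fan):
    z = BIAS + sum(WEIGHTS[k] * fan.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # logistic squash to a 0-1 score

fan = {
    "games_attended_last_season": 4,
    "merch_purchases": 2,
    "social_engagements_per_month": 10,
    "lives_within_25_miles": 1,
}
print(round(purchase_propensity(fan), 3))
```

In practice the weights would be fit from labeled purchase data rather than hand-set, but the output is the same kind of ranking a sales team can use to prioritize outreach.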
What are the limitations of working with data analytics?
The Chicago Bulls Strategy and Analytics team learned two important lessons.
They had to begin with “why”: Why do we want to capture certain data points, and what are the resulting use cases? “We have very limited opportunities with our most important fans to capture data. We need to ensure that we are capturing data that will advance our consumer insights and provide opportunities to more personally engage our fans in the future,” says Matthew.
Finding the right time to use technology to sustain and accelerate a process. “We have found that leading with technology results in lower adoption and force fitting the technology into a less efficient process. By outlining the process and bootstrapping an analytical solution, we are better equipped to evaluate technology options and select one that really pushes the organization forward.”
“There is a quote that has been attributed to Albert Einstein that says ‘not everything that can be counted counts, and not everything that counts can be counted,’” says Jason Levin of Western Governors University. “In education this is very true. Understanding what is going on with student and faculty psychology is critically important but difficult to measure. This is especially true if you are trying to measure these concepts in real time and not relying on survey instruments. The research generated by scales like grit or mindset makes clear how important these data are to educational outcomes.”
With the data analytics and project management industries growing at an explosive rate, it only makes sense to use both powerful tools in combination and interwoven into a company’s fabric to create a more sustainable competitive advantage.
Chief data officers (CDOs) are among the most highly sought-after executives among corporations for whom data analytics has become a cornerstone of digital strategies. But the rush to promote data-crunching experts to the CDO role has created a new challenge: Finding a leader who can use data to help drive a business transformation.
Companies eager to establish data analytics have promoted managers to the CDO role based on their technical wizardry rather than their leadership capabilities, says Joshua Clarke, partner for executive recruiter Heidrick & Struggles, who highlighted the problem in “Choosing the right chief data officer,” a new report detailing the rapid evolution of the CDO role.
Businesses have more data than ever about their operations, supply chains and customers. The problem is often they can’t see it, don’t know where it is, and don’t have an easy way to pull it all together and analyze it. So, they are unable to make smart decisions and can lose thousands of dollars a year.
ArrowStream’s OnDemand platform does the “dirty work” of collecting data from food distributors, cleaning the data, analyzing it and putting the information front and center for restaurant supply chain managers, said Jeff Dorr, chief customer officer of ArrowStream.
SAP has added some new capabilities to SAP Vora, its in-memory distributed computing system based on Apache Spark and Hadoop.
Version 1.3 of Vora includes a number of new distributed, in-memory data-processing engines, which accelerate complex processing, including ones for time-series data, graph data and schema-less JSON data.
Common uses for the graph engine might be analyzing social graphs or supply chain graphs, said Ken Tsai, SAP’s head of product marketing for database and data management.
One application that would benefit from the new time-series engine is looking for patterns of electricity consumption in smart metering data.
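To make the smart-metering use case concrete, the sketch below shows the kind of time-series question such an engine answers: flagging hours whose consumption jumps well above the trailing average. This is a generic stdlib illustration with invented readings, not SAP Vora's API or query language.

```python
# Generic illustration of a time-series pattern check on smart-meter
# data: flag hours whose reading exceeds a multiple of the trailing
# average. The hourly kWh readings are invented.

def flag_spikes(readings, window=3, factor=1.5):
    spikes = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if readings[i] > factor * baseline:
            spikes.append(i)
    return spikes

kwh = [1.0, 1.1, 0.9, 1.0, 3.2, 1.1, 1.0, 4.0, 1.2]
print(flag_spikes(kwh))  # → [4, 7]: the two hours with unusual usage
```

A distributed engine applies the same logic across millions of meters at once; the per-series computation is no more exotic than this.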