The GDPR deadline is coming up fast, and most businesses in the U.S. aren’t ready yet. Join Ken Mingis and his panel of experts as they discuss the impact of the new rules and what U.S. organizations must do now to protect customer data. Find the show here on May 17.
It’s time to hit the reset button on the gas engine. As you may already know, the electric car is far more viable than it was ten years ago — charging stations are now scattered throughout every major city, particularly at hotels and along major highways. One glance at the Tesla Supercharger network of 900 stations alone proves that point.
Yet, to reach the point where more than half of all new cars are fully electric by 2027 — as Elon Musk predicted recently — there needs to be a massive undertaking that only the enterprise can undertake. It is not a consumer endeavor but one that must be backed by IT, similar to an ERP roll-out or a massive Windows deployment.
Whether your concerns are privacy, security, competitive advantage, intellectual property or risk avoidance, your enterprise needs to be sharing — literally — as little data as possible with employees, contractors and third parties. As obvious as that statement is, it’s stunning how much data is unnecessarily shared with cloud providers and others.
There are two reasons for this. First, the time and effort needed to separate the data a third party truly needs from the data it doesn’t can make the ROI seem unattractive. This is especially true when executives play down the risk of anything bad happening.
As in “I’m probably safe trusting Google/Microsoft/Amazon/Rackspace, etc.” Really? Even if you choose to assume that their security is stellar — it isn’t — what about competitive issues? Are you really willing to trust that they will handle your data with your best interests at heart?
As the blockchain continues to mature and find adoption in areas other than cryptocurrency, ERP vendors are working to integrate the distributed ledger technology as a trackable, immutable record for everything from shipping manifests and supply chains to equipment maintenance and dispute-resolution systems.
“This is very real and something we’re aggressively excited about,” said Brigid McDermott, vice president of Blockchain Business Development at IBM. “What blockchain does is provide a trust system of record between disparate companies.”
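The “trust system of record” idea can be sketched in a few lines — this is an illustrative toy, not IBM’s actual implementation: each block commits to a hash of the previous block, so silently altering any earlier record (a shipping manifest, say) breaks the chain and is immediately detectable.

```python
import hashlib
import json

def block_hash(block):
    # Deterministic hash of a block's contents
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, record):
    # Each new block stores the hash of the block before it
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"record": record, "prev": prev})

def verify(chain):
    # Recompute every link; any tampering upstream breaks a link
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, {"manifest": "container-1138", "status": "shipped"})
add_block(chain, {"manifest": "container-1138", "status": "received"})
print(verify(chain))                    # True: chain is intact

chain[0]["record"]["status"] = "lost"   # rewrite history
print(verify(chain))                    # False: tampering detected
```

A real distributed ledger adds consensus across the “disparate companies” McDermott describes, so no single party can rewrite and re-hash the whole chain on its own.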
For over a decade, data has been at or near the top of the enterprise agenda. A robust ecosystem has emerged around all aspects of data (collection, management, storage, exploitation and disposition). And yet in my discussions with Global 2000 executives, I find many are dissatisfied with their data investments and capabilities. This is not a technology problem. This is not a technique problem. This is a people problem.
Those enamored of data often want to eliminate the human from the equation, but it can’t be done. And so, as climate science considers the impact of man on the environment, data science must wrestle with the inverse: the impact of data on man.
When I reviewed self-service exploratory business intelligence (BI) products in 2015, I covered the strengths and weaknesses of Tableau 9.0, Qlik Sense 2.0, and Microsoft Power BI. As I pointed out at the time, these three products offer a range of data access, discovery, and visualization capabilities at a range of prices, with Tableau the most capable and expensive, Qlik Sense in the middle, and Power BI the least capable but a very good value.
Tableau is making a big change in the way it sells its business intelligence products. The company announced Thursday that all of its software will be available as a subscription, rather than a single license plus a service fee.
Businesses will need to pay $70 per user per month for a license of Tableau Desktop Professional, and $35 per user per month for Tableau Server. That compares to the company’s boxed software prices of $2000 for Desktop, plus a $400 annual renewal fee for software updates, and $800 for Server, plus a $200 annual fee.
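The break-even arithmetic for Desktop is easy to sketch from those figures (assuming — the article doesn’t say — that the $400 renewal fee is charged from the second year onward):

```python
# Illustrative cost comparison using the Tableau Desktop figures above:
# $70/user/month subscription vs. $2000 license + $400/year renewal.
SUB_MONTHLY = 70
LICENSE = 2000
RENEWAL = 400

def subscription_cost(years):
    # Flat monthly fee for the whole period
    return SUB_MONTHLY * 12 * years

def perpetual_cost(years):
    # License paid up front, renewal fee each subsequent year
    return LICENSE + RENEWAL * (years - 1)

for y in range(1, 6):
    print(f"Year {y}: subscription ${subscription_cost(y)}, "
          f"perpetual ${perpetual_cost(y)}")
```

On these assumptions the subscription ($840/year) stays cheaper for the first three years, with the perpetual license pulling ahead from year four — which is the “lower upfront cost, higher lifetime cost” trade-off typical of subscription pricing.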
It’s a move that will provide additional flexibility, scalability and risk mitigation for Tableau customers, according to Francois Ajenstat, the company’s chief product officer.
Increasingly, we live in a sensor-laden world. Almost every electronic device in our lives has some kind of sensor in it, possibly many. We don’t tend to think of sensors when looking at our phones, refrigerators, cars, airplanes and buildings, but believe me, they’re in there.
And increasingly, we’re going to be aware of them. Because they’re going to take on forms — and be embedded in places — that we haven’t experienced before. And it’s going to be remarkable.
Your home as a sensor suite
Even if it’s not leaping out at you upon walking through the front door, your domicile is already armed with sensors. Appliances of all sizes and types, televisions, video game consoles, home computers and more operate using sensors of many kinds.
In a previous post I discussed the promising applications for deep learning in the enterprise. The greatest potential for deep learning is in adding business-relevant structure to less-structured, sense-like data — such as images, audio and other sensor data.
How quickly do the tone and affect of a frustrated customer change over the course of a support call, broken down by support rep? It’s that time-to-mollification that matters to your business, not the raw sound data.
Generally when training machine learning algorithms (and deep nets are an extreme example of this), the more data the better. There’s a persistent danger of “overfitting” your data — performing very well on the training set, but poorly on new data. If the algorithm has overfit, it has failed to generalize and is thus not that useful.
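Overfitting is easy to see with a toy example — a hypothetical sketch, not tied to any particular framework. A 1-nearest-neighbor model memorizes its training set outright, so it scores perfectly on data it has seen while faring worse on fresh data drawn from the same noisy source:

```python
import random

random.seed(42)

def make_data(n, noise=0.2):
    # 1-D points with the true rule "label 1 if x > 0.5",
    # but 20% of labels are randomly flipped (noise)
    data = []
    for _ in range(n):
        x = random.random()
        y = 1 if x > 0.5 else 0
        if random.random() < noise:
            y = 1 - y
        data.append((x, y))
    return data

train = make_data(100)
test = make_data(100)

def predict_1nn(x):
    # Memorizes the training set: returns the label of the closest point
    return min(train, key=lambda p: abs(p[0] - x))[1]

def predict_threshold(x):
    # Simple rule that ignores the label noise entirely
    return 1 if x > 0.5 else 0

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

print(accuracy(predict_1nn, train))       # 1.0 -- perfectly memorized
print(accuracy(predict_1nn, test))        # noticeably lower on new data
print(accuracy(predict_threshold, test))  # the simpler rule generalizes better
```

The memorizing model has learned the noise, not the signal — which is exactly why deep nets, with their enormous capacity, need large datasets and regularization to generalize.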
When Microsoft announced that it was acquiring professional social network LinkedIn, many people gasped at the huge price tag.
The fact that Microsoft was engaged in a bidding war against Salesforce for the network helped drive the price up, but most people took the perspective that the price tag was an indication of the value that Microsoft and others saw in LinkedIn’s treasure trove of data. Much like IBM’s acquisition of The Weather Channel, this deal was about tuning algorithms by using the vast amount of data available.