The next BriefingsDirect big data innovation discussion highlights how Tableau Software and big data analytics platforms come together to provide visualization benefits for those seeking more than just crunched numbers. They’re looking for ways to improve their businesses effectively and productively, and to share the analysis quickly and broadly.
To learn more, BriefingsDirect sat down with Paul Lilford, Global Director of Technology Partners for Tableau Software, based in Seattle, and Steve Murfitt, Director of Technical Alliances at HP Vertica. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.
Here are some excerpts:
Gardner: Why is the tag-team between Tableau and big data so popular? Every time I speak with someone using Vertica, they inevitably mention that they’re delivering their visualizations through Tableau. This seems to be a strong match.
Lilford: We’re a great match primarily because Tableau’s mission is to help people see and understand data. We’re made more powerful by getting to large data, and Vertica is one of the best at storing that. Their columnar format is a natural format for end users, because they don’t think about writing SQL and things like that. So Tableau, as a face to Vertica, empowers business users to self-serve and deliver on a depth of analytics that is unmatched in the market.
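Lilford’s point about columnar storage can be sketched in a few lines of Python. This is an illustration of the layout idea only, not Vertica’s actual engine: an aggregate over one column in a column store touches just that column’s values, while a row store visits every record.

```python
# Row-oriented layout: each record is stored together.
rows = [
    {"region": "west", "sales": 120.0},
    {"region": "east", "sales": 80.0},
    {"region": "west", "sales": 45.5},
]

# Column-oriented layout: each column is stored contiguously.
columns = {
    "region": ["west", "east", "west"],
    "sales": [120.0, 80.0, 45.5],
}

# Row store: every record is visited, including fields the query doesn't need.
total_from_rows = sum(r["sales"] for r in rows)

# Column store: the scan reads only the one column the query asks about.
total_from_columns = sum(columns["sales"])

print(total_from_rows, total_from_columns)  # both 245.5
```

The answers match, but the column layout reads a fraction of the data, which is why columnar databases suit the scan-heavy aggregates that visualization tools generate.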
Gardner: Now, we can add visualization to a batch report just as well as to a real-time, streamed report. What is it about visualization that seems to be more popular in the higher-density data and real-time analysis environment?
Lilford: The big thing there, Dana, is that batch visualization will always be common. What’s a bigger deal is data discovery, the new reality for companies. It leads to your organization becoming data driven and making better-informed decisions, rather than taking a packaged report and trying to make a decision that maybe tells you how bad you were in the past or how good you might think you could be in the future. Now, you can actually have a conversation with your data and cycle back and forth between insights and decisions.
The combination of our two technologies allows users to do that in a seamless drag-and-drop environment. From a technical perspective, the more data you have, the deeper you can go. We’re not limiting a user to any kind of threshold. We’re not saying, this is the way I wrote the report, therefore you can go consume it.
We’re saying, “Here is a whole bunch of data that may be a subject area or grouping of subject areas, and you’re the finance professional or the HR professional. Go consume it and ask the questions you need answered.” You’re not going to an IT professional to say, “Write me this report and come back three months from now and give it to me.” You’re having that conversation in real time in person, and that interactive nature of it is really the game changer.
Gardner: And the ability for the big data analysis to be extended across as many consumer types in the organization as possible makes the underlying platform more valuable. So this, from HP’s perspective, must be a win-win. Steve?
Murfitt: It definitely is a win-win. When you have a fantastic database that performs really well, it’s kind of uninteresting to show people just tables and columns. If you can have a product like Tableau and you can show how people can interact with that data, deliver on the promise of the tools, and try to do discovery, then you’re going to see the value of the platform.
Gardner: Let’s look to the future. We’ve recently heard about some new and interesting trends for increased volume of data with the Internet of Things, mobile, apps being more iterative and smaller, therefore, more data points.
As the complexity kicks in and the scale ramps up, what do you expect, Paul, for visualization technology and the interactivity that you mentioned? What do you think we’re approaching? What are some of the newer aspects of visualization that makes this powerful, even as we seek to find more complexity?
Lilford: There are a couple of things. Hadoop, if you go back a year-and-a-half or so, has been moving from a cold-storage technology to more of a discovery layer. One of the trends in visualization is predictive content becoming part of everyday life.
Tableau democratizes business intelligence (BI) for the business user. We made it an everyday thing for the business user to do that. Predictive is in a place similar to where BI was a couple of years ago: you had to go to the data scientist to do it. Not that the data scientist’s value wasn’t there, but it was becoming a bottleneck, because you had to run everything through a predictive model before you could give it to someone. I think that’s changing.
So I think that predictive element is more and more part of the continuum here. You’re going to see more forward-looking, more forecast-based, more regression-based, more statistical things brought into it. We’ll continue to innovate with some new visuals, but the other big frontier is unstructured data.
This is the other big key, because 80 percent of the world’s data is unstructured. How do you consume that content? Do you still structure it, or can you consume it where it sits, as it sits, where it came in and how it is? Are there discovery tools that can go do that?
You’re going to continue to see those grow. The biggest greenfields in big data are predictive and unstructured. Having the right stores, like Vertica, to scale that is important, but allowing anyone to do it is the other important part, because if you restrict it to a few technical professionals, you really restrict your ability to make decisions quickly.
Gardner: Another interesting aspect, when I speak to companies, is the way that they’re looking at their company more as an analytics and data provider, internally and externally. The United States Postal Service views itself in that fashion, as an analytics entity, but it’s also looking at business models: how to take data, and analysis of data it might be privy to, and make that available as a new source of revenue.
I would think that visualization is something that you want to provide to a consumer of that data, whether they are internal or external. So we’re all seeing the advent of data as a business for companies that may not have even considered that, but could.
Most important asset
Lilford: From our perspective, it’s a given that it is a service. Data is the most important asset that most companies have. It’s where the value is. Becoming data driven isn’t just a tagline that we or other people talk about. If you want to make decisions, decisions that actually move your business, being a data provider follows naturally.
The best example I can maybe give you, Dana, is healthcare. I came from healthcare and when I started, there was a rule — no social. You can’t touch it. Now, you look at healthcare and nurses are tweeting with patients, “Don’t eat that sandwich. Don’t do this.”
Data has become a way to lower medical costs in healthcare, which is the biggest expense. How do you do that? They use social and digital data to do that now, whereas five or seven years ago, we couldn’t do it. It was a privacy thing. Now, it’s a given part of government, of healthcare, of banking, of almost every vertical. How do I take this valuable asset I’ve got and turn it into some sort of product or market advantage, whatever that is?
Gardner: Steve, anything more to offer on the advent or acceleration of the data-as-a-business phenomenon?
Murfitt: If you look at what companies have been doing for such a long time, they have been using the tools to look at historical data to measure how they’re doing against budget. As people start to make more data available, what they really want to do is compare themselves to their peers.
If you’re doing well against your budget, it doesn’t necessarily tell you whether you’re gaining or losing market share, or how well you’re really doing. So as more data is shared and more data becomes available, being able to compare yourself to peers and to averages, and to measure yourself not only internally but externally, is going to help people make their decisions.
Gardner: Now, for those organizations out there that have been doing reports in a more traditional way, that recognize the value of their data and the subsequent analysis, but have yet to dabble deeply into visualization, what are some good rules of the road for beginning a journey toward visualization?
What might you consider in terms of how you set up your warehouse or you set up your analysis engine, and then make tools available to your constituencies? What are some good beginning concepts to consider?
Murfitt: One of the most important things is to start small, prove it, and scale it from there. The days of boiling the ocean to try to come up with analytics, only to find out it didn’t work, are over.
Organizations want to prove it, and one of the cool things about doing that visually is now the person who knows the data the best can show you what they’re trying to do, rather than trying to push a requirement out to someone and ask “What is it you want?” Inevitably, something’s lost in translation when that happens or the requirement changes by the time it’s delivered.
You now have a real-time, interactive, iterative conversation with both the data and business users. If you’re a technical professional, you can now focus on the infrastructure that supports the user, the governance, and security around it. You’re not focused on the report object anymore. And that report object is expensive.
It doesn’t mean that compliance items like financial reports go away; it means you’ve right-sized that work effort. Now, the people who know the data best deliver the data, and the people who support the infrastructure best support that infrastructure and that delivery.
It’s a shift, and technologies today do scale: Vertica is a great scalable database, and Tableau is a great self-service tool. The combination of the two allows you to do this now. If you go back even seven years, it was a difficult thing. I built my career being a data warehouse BI guy. I was the guy writing reports and building databases for people, and it doesn’t scale. At some point, you’re a bottleneck for the people who need to do their jobs. I think that’s the single biggest thing in it.
Gardner: Another big trend these days is people becoming more used to doing things from a mobile device. Maybe it’s a “phablet,” a tablet, or a smartphone. It’s hard to look at a spreadsheet on those things more than one or two cells at a time. So visualizations and exercising your analytics through a mobile tier seem to go hand in hand. What should we expect there? Isn’t there a very natural affinity between mobile and analysis visualization?
Lilford: We have mobile apps today, but I think you’re going to see a fast evolution in this. Most visuals work better on a tablet. Right-sizing that for the phone is going to continue to happen, scaling that with the right architecture behind it, because devices are limited in what they can hold themselves.
I think you’ll see a portability element come to it, but at the same time, these are early days. Machines are generating data at a rate that’s almost impossible to consume. Those devices themselves are going to be the game changer.
My kids use iPads; they know how to do it. There’s a whole new workforce in the making that has grown up with devices like these. Devices are just going to get better at supporting it. We’re in the very early phases. I think we have a strong offering today, and it’s only going to get stronger.
Gardner: Steve, any thoughts about the intersection between Vertica, big data, and the mobile visualization aspect of that?
Murfitt: The important thing is having the platform that can provide the performance. When you’re on a mobile device, you still want the instant access, and you want it to be real-time access. This is the way the market is going. If you go with the old, more traditional platforms that can’t perform when you’re in the office, they’re not going to perform when you are remote.
It’s really about building the infrastructure, having the right technology to be able to deliver that performance and that response and interactivity to the device wherever they are.
Gardner: Before we close, I just wanted to delve a little bit more into the details of how HP Vertica and Tableau software work. Is this an OEM, a partnership, co-selling, co-marketing? How do you define it for those folks out there who either use one or the other or neither of you? How should they progress to making the best of a Vertica and Tableau together?
Lilford: We’re a technology partnership. It’s a co-selling relationship, and we do that by design. We’re a best-in-breed technology; we do what we do better than anyone else. Vertica is one of the best databases, and they do what they do better than anyone else. So the combination of the two gives customers options to solve problems; the whole reason we partner is to solve customer issues.
We want to do it as best-in-breed. That’s a lot of what the new stack technologies are about: it’s no longer a single vendor building a huge solution stack. It’s the best database, with the best Hadoop storage, with the best visualization, with the best BI tools on top of it. That’s where you’re getting a better total cost of ownership (TCO) overall, because you’re not invested in one player to deliver all of this. You’re invested in the best of what each does, and you’re delivering in real time for people.
Gardner: Last question, Steve, about the degree of integration here. Is this something that end-user organizations can do themselves, or are professional services organizations involved? What degree of integration between Vertica and Tableau visualization is customary?
Murfitt: Tableau connects very easily to Vertica. There’s a dropdown on the database connector saying, “Connect to Vertica.” As long as they have the driver installed, it works. And the way the interface works, they can start querying and getting value from the data straight away.
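The same connection Murfitt describes can also be made outside Tableau. Below is a hypothetical sketch using the open-source vertica-python client; the host, credentials, and sales_fact table are placeholder assumptions for illustration, not details from the interview.

```python
# Placeholder connection settings; Vertica's default client port is 5433.
conn_info = {
    "host": "vertica.example.com",  # assumed hostname
    "port": 5433,
    "user": "analyst",
    "password": "secret",
    "database": "analytics",
}

def top_regions(conn_info, limit=10):
    """Run the kind of aggregate query a Tableau worksheet would generate."""
    # Imported lazily so this sketch loads even without the driver installed.
    import vertica_python

    with vertica_python.connect(**conn_info) as conn:
        cur = conn.cursor()
        cur.execute(
            "SELECT region, SUM(sales) AS total "
            "FROM sales_fact GROUP BY region "
            f"ORDER BY total DESC LIMIT {int(limit)}"
        )
        return cur.fetchall()
```

As with Tableau’s dropdown connector, the only real prerequisite is a reachable Vertica instance and the client driver; everything after the connect call is ordinary SQL.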