Tableau’s annual Customer Conference 2016 (#tc16) is currently happening in Austin, Texas. Tuesday’s opening keynote was about Tableau’s three-year roadmap, dubbed “Future Vision”. During the keynote, Tableau founder Christian Chabot and senior company leaders shared the company’s vision and upcoming innovations.
Let’s go through some of the announcements in the roadmap. You can check out the full keynote on demand at the Tableau Conference website.
Back in March 2016, Tableau announced (rather quietly, I might add) that they were acquiring HyPer, a high-performance database system originally developed at the Technical University of Munich, in Germany. What makes Hyper different from other new database systems is that it’s designed for simultaneous OLTP and OLAP processing without compromising performance. This hybrid approach is designed to blend transactions and analysis, taking Tableau closer to the transactional data. If you’re like me, and have been living in a world where those two systems are SAP and Business Warehouse, that might sound a tad crazy.
The idea behind the acquisition made more sense during the keynote, when the Hyper Data Engine was demoed by loading over 3 billion rows of data into the database and running analyses while the data was still being loaded. Hyper will be available for all the data sources Tableau supports, so in the future any extracts loaded to the Data Engine are going to be super fast. I envision deploying a combination of live connections to services like Amazon Redshift or Google BigQuery, and extracts from the Hyper DB. The beta for Hyper will start at the beginning of next year. Can’t wait.
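To make the hybrid OLTP/OLAP idea concrete, here is a toy sketch (this is plain SQLite, not Hyper, and the table and values are invented for illustration): transactional inserts keep arriving in batches, and analytical aggregates are answered against everything loaded so far, without waiting for the load to finish.

```python
import sqlite3

# Toy illustration of "analyse while you load" (not Hyper itself):
# one store accepts transactional writes and analytical reads in turn.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

batches = [
    [("North", 120.0), ("South", 80.0)],
    [("North", 45.0), ("East", 200.0)],
]

for batch in batches:
    # OLTP-style writes: small transactional inserts
    conn.executemany("INSERT INTO sales VALUES (?, ?)", batch)
    conn.commit()
    # OLAP-style read: aggregate over everything loaded so far
    total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
    print(f"loaded so far: {total}")
```

In a traditional setup you would batch-extract from the transactional system into a warehouse before analysing; the appeal of Hyper is collapsing those two steps into one engine.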
This one wasn’t really a surprise to be honest, since I’ve been kind of expecting Tableau to announce a data-prep tool. Project Maestro is Tableau’s new data prep tool that brings some much needed ‘oomph’ to getting your data ready for analyses. The idea with Maestro is that you can do more robust data prep and joining, as well as clean the data before you take it into Tableau.
Naturally the new tool integrates into the current Tableau platform, so you can deliver the data to the cloud, to your server, or to your desktop. With Maestro there’s still a bit of a wait before it’s even in beta, but since this is a three-year roadmap, that’s cool.
When deploying Tableau Server, it’s natural that IT starts getting skittish about the self-service aspect of Tableau and which data sources get published. Data connections to the source systems should be easy to govern and properly defined. The new governance features of Tableau Server will let end users see which data sources have been defined properly, what fields are part of a data source, and, my favourite, the ‘Certified by…’ flag.
This means that users will be able to rely on the results of their analyses, since the data in the source is certifiably accurate. More information will also be added at the data source level in Tableau Server, giving usage information about each data source. Changing data source fields by dragging and dropping will also be available, and it looks very intuitive. There will be some changes to the Data Server as well, bringing more insight into how data sources are used.
These features are something I’m really happy about. I love using scatter plots and clusters, but when you need additional information about selected data points, you currently have to define the ‘additional’ visualisations separately and put them somewhere on your dashboard. The new Selection Summaries feature will generate useful information from your selection into the tooltip, using the dimensions you’re using in that viz.
The same thing goes for Clusters, which was a new feature introduced in Tableau 10 (if you haven’t read the relevant blog post from my series 10 things about Tableau 10, go do it now — I’ll wait). New features coming to Tableau during the next three years include cluster summaries, which use the tooltip to display summarised information about the clusters in your viz.
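The kind of rollup a summary tooltip does can be sketched in a few lines. The labels and measures below are invented for illustration; the idea is simply to aggregate the selected (or clustered) marks per group instead of building a separate summary viz.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical selected marks: (cluster label, sales, profit)
selection = [
    ("Cluster 1", 100.0, 20.0),
    ("Cluster 1", 140.0, 35.0),
    ("Cluster 2", 60.0, -5.0),
]

# Roll the selection up per cluster, the way a summary tooltip might
summaries = defaultdict(list)
for label, sales, profit in selection:
    summaries[label].append((sales, profit))

for label, points in summaries.items():
    avg_sales = mean(p[0] for p in points)
    avg_profit = mean(p[1] for p in points)
    print(f"{label}: {len(points)} marks, "
          f"avg sales {avg_sales:.1f}, avg profit {avg_profit:.1f}")
```

The point of the feature is that Tableau will do this aggregation for you, in context, rather than making you maintain a second sheet on the dashboard.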
I also love maps, who doesn’t? Maps will be getting some excellent upgrades, such as Custom Geographic Roles, which let you bring borough and neighbourhood aggregation to your maps on the fly. This allows for an additional layer of analysis, since individual latitude/longitude points on a map are oftentimes way too specific.
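Conceptually, a custom geographic role maps raw coordinates up to a coarser named region. Here is a minimal sketch using hypothetical bounding boxes (real neighbourhood boundaries would be polygons, and the names and coordinates below are made up):

```python
# Hypothetical neighbourhoods as bounding boxes:
# (min_lat, max_lat, min_lon, max_lon)
NEIGHBOURHOODS = {
    "Downtown": (30.25, 30.28, -97.76, -97.73),
    "East Side": (30.25, 30.28, -97.73, -97.70),
}

def neighbourhood_of(lat, lon):
    """Map a raw lat/long point to a coarser, named region."""
    for name, (lat_min, lat_max, lon_min, lon_max) in NEIGHBOURHOODS.items():
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            return name
    return "Other"

points = [(30.26, -97.74), (30.27, -97.71), (30.30, -97.80)]
counts = {}
for lat, lon in points:
    region = neighbourhood_of(lat, lon)
    counts[region] = counts.get(region, 0) + 1
print(counts)
```

Aggregating marks by region like this is what turns a cloud of overly specific points into something you can actually compare.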
I’m also very excited about map layers. Currently, you can only have two ‘layers’ on a map, and only if you duplicate the latitude/longitude and create a dual-axis map. With the new map layers, you will be able to add all kinds of useful information to your maps in Tableau. Stellar!
Natural Language Processing
I’m adding this as a separate point, even though during the keynote it was part of the Instant Analytics demo. Natural Language Processing as a concept is very useful, allowing users to ‘ask’ questions of their data in ‘normal English’. Unfortunately, in practice it almost never works well.
Ambiguity, colloquial expressions, and holding a natural conversation with your data have always been challenges: a question like ‘How much did we sell last summer?’ is difficult to infer a SQL statement from. If Tableau has managed to crack this tough nut, it will make for a powerful addition to all the analytics tools in Tableau.
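To see why this is hard, consider a naive keyword-to-SQL mapper. Everything here is invented for illustration (this is certainly not Tableau’s approach): it only works for phrases it has hand-written rules for, the date range for ‘last summer’ has to be hardcoded relative to some assumed current date, and anything outside its tiny vocabulary makes it give up.

```python
# Toy NL-to-SQL sketch: hand-written keyword rules over an assumed
# "orders" table. Real natural language breaks this immediately.
METRICS = {"sell": "SUM(sales)", "sold": "SUM(sales)", "profit": "SUM(profit)"}
PERIODS = {
    # "last summer" is ambiguous; here it is pinned to an assumed year
    "last summer": "WHERE order_date BETWEEN '2016-06-01' AND '2016-08-31'",
}

def to_sql(question):
    q = question.lower()
    metric = next((sql for word, sql in METRICS.items() if word in q), None)
    period = next((sql for phrase, sql in PERIODS.items() if phrase in q), "")
    if metric is None:
        return None  # ambiguity: the mapper simply gives up
    return f"SELECT {metric} FROM orders {period}".strip()

print(to_sql("How much did we sell last summer?"))
```

Scaling past toy rules like these, to synonyms, relative dates, and follow-up questions, is exactly the tough nut the keynote was claiming progress on.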
Tableau Server ♡ Linux
Tableau Server is coming to Linux. And there was much rejoicing. According to the keynote announcement, all major distributions will be supported, as will package management tools like yum and apt. Easy migration from Windows Server to Linux will be possible by backing up your server and restoring it on Linux.
Last but not least
Other new innovations in the pipeline include Recommendations, which are essentially report templates generated from your data using machine learning algorithms. It remains to be seen how relevant they will be in real production use, but it’s neat that machine learning is becoming more commonplace.
Data-driven alerts have for the most part been obtainable in Tableau only via the VizAlerts community project, which requires a fair bit of tinkering and so isn’t fully production-ready. Now, with the Future Vision announcement, come integrated data-driven alerts. These alerts will be bound to measures and will send out a notification when a metric exceeds a predefined threshold.
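The core of a data-driven alert is simple: compare a measure against its threshold and notify when it’s crossed. This sketch is purely illustrative (the function names and values are mine, and a real system would add scheduling, deduplication, and a delivery channel such as email):

```python
def check_alert(metric_value, threshold, notify):
    """Fire a notification when the measure exceeds its threshold."""
    if metric_value > threshold:
        notify(f"Alert: value {metric_value} exceeded threshold {threshold}")
        return True
    return False

# Collect notifications in a list to stand in for email/SMS delivery
fired = []
check_alert(metric_value=1250.0, threshold=1000.0, notify=fired.append)
check_alert(metric_value=800.0, threshold=1000.0, notify=fired.append)
print(fired)
```

Having this built into Tableau Server means the threshold lives next to the measure itself, instead of in a separately maintained VizAlerts setup.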
Those are pretty much the key points I found interesting. Don’t forget to watch the full keynote on the Tableau Conference website for all the demos.
Are you a data driven organisation? What does it mean to be Data Driven? Come and find out at the Bilot Breakfast Club on the 24th of November 2016, “HOW CAN YOUR ORGANISATION BECOME DATA-DRIVEN?”. Click on the link and sign up today!
To download a free 14 day trial of Tableau 10, go to our trial download page and discover Tableau 10 for yourself!