Tableau's bold moves during Tableau Conference 2016

Written by Adrien Charles – 27-11-2016




Tableau’s Keynote is one of my favourite sessions during a Tableau Conference. This is when you find out about the long-term vision and new features planned for the next 18-24 months. I closely followed the conference online and I have to say, this year brought some game-changing features and products. A new product was announced alongside many awesome functionalities and other cool things you should know about. Here are the main takeaways from this year’s conference in Austin:


ETL is coming…

Yes, you read that right. Tableau has shifted its approach and will offer another product which is neither Tableau Desktop nor Tableau Server! The project was code-named Loom internally and has been named Maestro for the public. Today, the amount of ETL you can perform in Tableau Desktop is not good enough, and Tableau knows it is a weak spot. The need for an ETL tool has left room in the market for companies such as Alteryx to emerge. During TC '16, this situation came to an end and Tableau unveiled its firm intention to become a full-stack BI platform (database, ETL and visualisation). The Seattle-based company no longer wants to be seen as only a visualisation company.


Maestro's demo was 10 minutes long. The new product will reuse what has made the Tableau suite a success for the last couple of years: it is visually appealing and seems easy to use. We can only say "seems" since not much has been shown yet. In my opinion, the demo was rather simplistic for an ETL tool. Some functionalities were showcased, like "seeing" the dataset more quickly, aggregating to a certain level and then blending between different data sources. I would have liked to see more: what about filtering, more complex schemas, calculations, etc.? Some important questions remain unanswered: will Tableau Server be able to host the ETL process? Is the output a Tableau extract only? We are not really sure what will be shipped in terms of functionalities, but Tableau is going in the right direction and it is a big move.


The release date has been set for early 2017. We do not yet know whether this product will be sold separately or shipped with the Desktop licences.


Hyper, the new Tableau engine…

Tableau extracts are based on a technology that is showing its age, and Tableau realises it's time for a change. The answer has been given the name Hyper and will probably become the new Tableau extract format: much faster and more scalable. The demo started with a quick intro on a 300-million-row dataset with Hyper as a data source. Things were fast and smooth through a couple of drag-and-drop operations. However, millions of rows don't impress anyone anymore… Then the big numbers appeared. Tableau was plugged into billions of rows to showcase the power of the recently acquired database technology. Querying 3 billion rows took a handful of seconds at most. Pretty neat. New data started coming in and the results were updated live. It all looked pretty good and fast, as you would expect from a nicely prepared demo.


However, Hyper is still a prototype and far from ready to be used by anyone. No user interface or detailed description of how it will integrate with the rest of the Tableau suite has been revealed. I am not sure whether Hyper will be a standalone database like MySQL, or simply replace the Tableau extract technology. On stage it seemed that we could do some data entry. Will that functionality be available to users? We all certainly hope so. We will have to be patient, since nothing will be ready until mid-2017.


Natural language processing

Power BI already has this functionality in place and Tableau is playing catch-up (about time too). It will soon be possible to write sentences, get answers in real time and have them displayed visually. This type of functionality is all about execution: done well, it can become a super useful tool and allow a user with no BI knowledge to get answers; done badly, it will be useless. Tableau prefers to take its time to do better than its strongest opponent, Microsoft. Let's see what the team of developers can do to outsmart the tech giant. Game on!


What is true, what is not?

Certifications were announced during the keynote. The idea behind a certification is to help people navigate through a lot of data and reports. You may not have encountered this if your Tableau deployment is not large enough, but trust me, it can become hard to find a trusted report or data source on a messy Tableau Server. This is where certifications come to the rescue. Typically available only to the IT/BI team, experts in a given dataset will be able to leave their mark: they will be able to tag a field (like a calculation), a data source or even a dashboard to certify the veracity of that component. This should solve a lot of issues when deployments grow too large, and it gives some control back to IT, who will be the first to jump on this feature.


Collaboration will be pushed to a whole new level


Tableau night out during Tableau Conference 2015

The data company wants all your conversations, questions, data, reports and alerts to go through its products. Tableau's mission just got larger: if it is data related, it should be in Tableau. Today, when you have a question about a report, you send an email or a chat message to the report owner and leave the Tableau interface. That will soon disappear: you will find a real collaboration space right next to where the data sits. You will be able to mention someone in the chat space and start a discussion right there, where it should be. I am really excited about this and cannot wait for it to arrive. There is plenty of room for improvement here, and Tableau is looking to transform how people collaborate around data.


Give me some privacy

A new concept called Metrics has been introduced. To put it simply, it will allow you to grab pieces of dashboards and put them in one place, so you no longer have to check five different dashboards. Pretty cool. It saves you from having to redesign a dashboard when you need to combine several things. A space called the sandbox will also be added: a personal space where users can save their own content, visible only to them.


I want more…

Tableau has revealed even more! All of the below have been announced, for our ultimate pleasure:

  • When changing something on a published data source, you will observe in real time the impact on the other reports that use it.
  • The possibility to connect live to your on-premise data source via Tableau Online, perfect for hybrid cloud architectures.
  • Pre-modelled data sources when connecting to Salesforce, Marketo, QuickBooks or Eloqua.
  • The possibility to modify a data source on the server, with no need to republish it!
  • A proper alert system and notifications on mobile.
  • Suggested reports or data source schemas when working in Desktop. Awesome stuff: no need to redo someone else's work.
  • The online version of Tableau Desktop will completely catch up with the on-premise version within the next 6 months.
  • Tableau Server on Linux is coming soon! Huge news for many customers…


In brief, Tableau is enlarging its mission

Tableau is at a crucial point in its lifecycle and needed to send a strong message. In my opinion, the company has successfully detailed its master plan and sent the right message to the market: Tableau will be much more than a visualisation tool and is moving on from being just a desktop application. Let's hope the new functionalities arrive soon enough and meet the high expectations. In the days following the conference, the stock price recovered from the loss it had suffered after the Q3 earnings call, a positive sign for the upcoming functionalities.

What are your thoughts about the Tableau Conference 2016?

