All or nothing

Last year I participated in Makeover Monday around 20 times. That’s not even 50% of the vizzes. This year I committed to not only submitting a viz every week, but to also write a blog post for each viz, find articles and data for 26 of the weekly challenges, write 26 weekly recap posts, and engage in the social media fun that comes with it: tweets, retweets, responses, messages, likes, sharing elsewhere and spreading the love.

Absolutely no regrets! Ever since week 1 it’s been great fun and I’m really enjoying it, because I get to be so much more involved in the community.

When I do something, I find it easier to commit 100% or not at all. So doing every makeover is an easier commitment for me than saying ‘I’ll do the makeovers that look fun’ or just focusing on the topics that are easiest to visualise.

Joining Andy in running the project meant that I have made it part of my daily and weekly routine. Yes, there is a lot of ad hoc work, especially around Twitter, but certain tasks have a reasonably set schedule: Sunday is time to post the data. Doing it every 2 weeks means I can spend the ‘off’ weeks searching for good MM articles instead of preparing data. Sunday afternoon I usually spend doing my Makeovers because I don’t have time to do so at work on Mondays. Unless there is an opportunity to do a ‘Live Makeover Monday event’ like we did in London last week.
While I build my viz, I also draft the blog article that goes with it, including my critique of the original article and chart. Then it all gets published.

Once the data is out there, the real fun starts. Don’t get me wrong, I love building the vizzes and I also enjoy writing about the process, but the really fun aspect of this whole project is talking to you guys, the community, via Twitter about the submissions. It’s hard not to miss a submission as they come in very quickly on Sunday evening and then on Monday it’s a real deluge :-). But I love that. I wake up on Monday morning and my Twitter feed is full of vizzes, tweets and retweets from the Makeover Monday crew.

After getting through the first wave of that and eating some breakfast before I start work, everything starts humming at a more consistent pace and there are usually a number of discussions along the way. And of course as the week progresses we get to choose our favourites from all the submissions for our Friday recap blog post. In between Andy and I chat about Makeover Monday related stuff pretty much every day: new ideas for MM challenges, changes to the website, data we have found, interesting initiatives that would align with MM, potential live MM events, etc.

And then just like that it’s Friday, the second ‘hump’ of the week when we publish the recap post before settling into the weekend and gearing up for the next challenge to be released.

It’s a great project and I love being part of it. Compared to last year when I submitted vizzes every now and then (with a big hiatus over summer), this year my commitment has been much easier to maintain. Doing it every week is easier. It’s all or nothing. And that’s how I like it 🙂


Alteryx – let’s start right at the beginning…

As promised, I’ve decided to kick off a series of blog posts focusing on Alteryx as I go about learning how to use the tool most effectively for my data adventures in 2017.

Before I get into the technical details, I should back up a little and talk about ‘me and tech’ to help you understand the tone and content of these articles better. (If you don’t like detours to the ’90s, feel free to skip to the instructions further down.)

Tentative first steps

Growing up, we had a computer for as long as I can remember, and I was lucky enough to start using the internet in 1995, at the tender age of 10, to research information on gerbils (by necessity: I ended up with a pair rather than two brothers and needed to figure out a way to control their reproduction efforts before the house was overrun by cute little brown mice).

I’ve always enjoyed the world that was opened up to me through the internet and using technology from an early age.

But no one ever taught me the really techie bits. I had many friends who worked as DBAs, network or system engineers and some even founded their own tech companies that are still going strong today. But I was more the ‘mascot’ of the club than a contributing member.

I tried to learn how to code but never went beyond ‘Hello World’. It just didn’t really appeal to me, and neither did reading heavy books with no clear idea of what I could do with the knowledge and skills afterwards.

This was around the year 2000 and the internet certainly wasn’t the same as the one we know and use today.

What this meant for me was that I never considered a career in IT or a degree in a tech subject. To be honest, it’s a bit of a regret not to have at least given it a try, but from where I stood it didn’t look like my kind of thing.

The downside is that, for me, using a new tool always causes a bit of trepidation.


For one, I don’t want to break anything. But more importantly, I’m worried that I’ll look stupid when I ask someone a question and don’t understand the answer. Sounds mad? Maybe, but to me a lot of tech talk is like a foreign language, and people assume I have certain knowledge which I don’t. So they use terms which make me just smile and nod while scratching my head, then head straight into a Google search after our conversation.

Or someone tells me “just do A, then B, then E and you’ll get to the solution”. Hold on there! How do I do A? Why do I have to do B? And can you please tell me about steps C and D so I can actually get to E?

I have found a way to solve this problem: find a friendly person who explains a process to me once, from start to finish, with all the detail, before I go off and do it on my own. You see, I’m not dumb or slow. I just need a decent explanation once and then I can be left to my own devices. Usually. In high school I had really shoddy grades in maths, but not because I’m stupid: once I got a private tutor who explained the material in a way that I understood, I managed to finish my secondary education with a very decent 12 out of 15 possible points. Not bad after many years of suffering.

How this applies to Alteryx

The way I approached Alteryx was a bit similar. Minus the suffering.

At first when I tried to build a workflow, I looked in the online help and played around in the tool but couldn’t get anything to work. Yes, I hear you, it’s just ‘drag and drop’. But which of the gazillion tools am I meant to use? And what’s the tool configuration all about?

People say it’s very intuitive, but to me it certainly wasn’t. I’ll have to be honest there. I guess I was used to the simplicity and instant visual feedback I enjoyed in Tableau.

Alteryx was an enigma to me, so I closed it and, rather disappointed, gave up for a few days.

Thankfully I had to use it for work, so there was no way around figuring it out. I sat down with my colleague, who’d been using it for a while, and got him to walk me through building my first workflow.

He pointed out all the essential configuration details and the tools I’d need for my specific task, then let me play with it. And voilà, I was able to successfully blend and transform some data and publish it to different types of output files. That first success was all I needed to continue with some more enthusiasm.

In one of my work projects I had to create a lot of Alteryx workflows to pick up data from many different sources and turn it into a consistent output for consumption by Tableau. That was fun and a good opportunity to get more practice.

Re-acquainting myself after a bit of a break

After a lengthy break (moving countries and changing jobs) during which I didn’t use Alteryx at all for several months, I am now back in the game.

In my day job I obviously deal with data a lot as well, so I use Alteryx regularly to blend, enhance and enrich datasets for analysis. But mostly my Alteryx practice comes from Makeover Monday. As I obtain datasets for Makeover Monday challenges, I transform them into .tde files at a minimum, and sometimes I need to do a bit of tidying up along the way, so I get a fair bit of practice doing that.

The second week of Makeover Monday was the first week in which I chose the dataset, and it was a nice and simple one. I obtained the original data as an .xlsx file from Statista and then used Alteryx to turn it into a .tde, as we usually publish both formats.

Yes, I could just open the .xlsx in Tableau and save it as a .tde, but why not get into the habit of using Alteryx for the transformation?

Using Alteryx to speed up my Makeover Monday data prep

In the next couple of paragraphs I will explain what I did in Alteryx and what the results were for each step.

To many this will seem super-basic, but there might be a couple of people just starting out with the tool who have similar reservations to those I held when it was all new to me.

Not to worry, I’ve got you covered ;-). I promise to break everything down as far as possible and give you the ‘why’ as well, to help you understand the reasons behind choosing certain tools and changing configurations.

I will assume no prior knowledge because, well, you know why… 😉

Using Alteryx to convert an Excel spreadsheet into a .tde file

First you’ll need an Excel spreadsheet you’d like to transform. Pick something nice and simple: a few columns and rows with no fancy formatting or anything (remember, we’re just getting started here…). Start with a new workflow in Alteryx. Your screen should look like this:


At the top of the screen you have a menu with all the tools you can use. For this workflow we will only use three of them.

On the left-hand side you can see the configuration menu where we will adjust the settings for each tool as required. The white space in the middle is your canvas where you’ll drag your tools. And finally there is the results window at the bottom of the screen. This is where you can see whether what you did worked out 🙂.

Tools required for this workflow

In the workflow for my Makeover Monday data I needed only three tools:

  • Input Data: this is where I choose the Excel file that contains my data
  • Select: this is where I remove unnecessary fields from the dataset and change the data type of the fields I want as needed
  • Output Data: this is where I tell Alteryx to create a .tde file of the data in a specific location
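Since Alteryx is a drag-and-drop tool, there is no code behind these three steps to show you. But if it helps to see the same logic written out, here is a rough equivalent of the Input → Select → Output sequence sketched in Python with pandas. This is purely illustrative and not part of the Alteryx workflow: the column names and numbers are made up to mimic my Makeover Monday file, and it writes a plain CSV because producing an actual .tde requires Tableau’s own extract tooling.

```python
import pandas as pd

# Input Data: in Alteryx this tool reads the Excel sheet; here we build
# a tiny sample frame instead so the sketch is self-contained.
raw = pd.DataFrame({
    "Year": [2015.0, 2016.0],          # years arrive as decimals
    "Units sold": [231.22, 211.88],    # made-up illustrative values
    "F4": [None, None],                # empty columns picked up from Excel
    "F5": [None, None],
})

# Select: keep only the fields we need and fix the data types,
# just like deselecting F4/F5 and setting Year to Int16.
clean = raw.drop(columns=["F4", "F5"]).astype({"Year": "int16"})

# Output Data: Alteryx writes a .tde; a plain CSV stands in for it here.
clean.to_csv("iphone_sales_clean.csv", index=False)
```

The point is simply that the three tools map onto read, clean and write; in Alteryx all of this happens via configuration rather than code.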

Let’s build it!

Step 1: Drag the Input Tool onto your canvas

Step 2: Configure the Input Data Tool

Step 3: Add the Select Tool

Step 4: Configure the Select Tool

Step 5: Add the Output Data Tool

Step 6: Configure the Output Data Tool

Step 1: Drag the Input Tool onto your canvas



Step 2: Configure the Input Data Tool

We now have to configure the tool, which in this case simply means telling Alteryx where to pick up the Excel file from.


a) Click the dropdown arrow and select ‘File Browse’


b) Choose your Excel file from its location


c) Pick the sheet (if applicable) and click OK


d) Alteryx will give you a preview of the data in the configuration window on the left-hand side


e) Press the ‘Run Workflow’ button to see the data in your results window


f) You’ll be able to see the result below your workflow


Step 3: Drag the Select Tool onto your canvas and connect it to the Input Data tool


Step 4: Configure the Select Tool

We now have to configure the tool, which allows us to pick and choose which fields stay in the dataset as well as what data type those fields should be.

Go from this:


To this:


First I changed the year field to an Int16 so it is treated as a whole number rather than a decimal number. Then I deselected the fields F4 and F5 because they are empty and not required.

That’s it. Run the workflow again and see the results in the results window…


Stay with me, we’re almost done here.

Step 5: Add the Output Data Tool

The data has to go somewhere and that somewhere is going to be a .tde file. First we add the Output Data tool to our workflow:


Step 6: Configure the Output Data Tool

a) Let’s tell Alteryx where to save the .tde and all that stuff. This is similar to the input steps…


b) Select a location for your file and give it a name…


c) Choose your Output option: when you run the workflow, do you want to overwrite the data, append new data or create a new file?


d) Run your workflow and you’re done!



And that’s all there is to it. Not that tricky, right?

I obviously went into a LOT of detail here but if you’ve never used Alteryx before you may find this useful and next time we can skip over half the detail and just focus on the punchy things.

I am excited about my next Alteryx post because I have something really cool coming up that Chris Love helped me with, so I can’t wait to get cracking with those next workflows…


Makeover Monday – Week 3, 2017: The 294 Accounts Donald Trump retweeted during the election

Donald Trump’s inauguration will be on 20 January 2017. Most people have an opinion on him becoming President of the United States, myself included. But for this week’s Makeover Monday I shall put it aside as best as I can and focus on the data.

We’re looking at his social media game, more specifically at the 294 accounts he retweeted during the election. The data for our makeover was posted on GitHub.

The original visualisation was posted in an article on Buzzfeed and looks like this:


What I like about it:

  • At first glance it is simple, with marks being aligned in rows and columns in descending order according to number of retweets
  • Each circle has a label with the Twitter handle and number of retweets

What I don’t like:

  • It is simply a static image, so I cannot get any more information out of it, e.g. what tweets did he retweet? When did the retweets happen? Were they simple retweets or did Trump add his own comments to them? Who do the accounts belong to, is there more information about them? What topics caused the most retweets?
  • Choosing circles as marks makes it difficult to compare the sizes of the individual accounts listed. A square or rectangle could work better or even a bar chart, because different lengths are much easier to compare than circular surface areas.
  • The article doesn’t actually provide any caption with the chart, so there is no explanation of what we’re looking at and the reader is left to guess.

What I did as a Makeover:

  • As our data is more current than that of the original article, I couldn’t get the numbers to align, which is something I usually start with to ensure I am looking at the same data as the original author. But never mind, there was plenty of interesting stuff in the dataset to analyse.
  • So I looked at tweets, retweets and the content of the tweets published to Trump’s Twitter account.
  • I wanted to look at a specific time period, i.e. all tweets since 16 June 2015, when he announced his candidacy, so I filtered all my worksheets to the period from that date.
  • I also found it interesting to compare the sources the tweets came from. A number of articles have discussed the fact that his tweets seem to come from an Android device while other, more mellow tweets are written from an iPhone. This let me further narrow down the focus of my dashboard.
  • What I ended up with is a short overview of some of his tweets with some high-level stats and a line chart that shows a decrease in Trump-authored tweets and an increase in tweets from his handler.
  • For colours I picked the dark red from Trump’s website to indicate tweets written by him and a lighter shade for his handler.
  • I also wanted to use the font from his website, Montserrat, so I created an image of the heading which I used in my dashboard.
  • I also included – for the first time ever – a mobile design, which was quick and easy to create and is something I had been wanting to do for a long time. Better late than never 😉

(Click on the image for the interactive version)



10 days in London

Just after closing out a travel-intensive 2016, I was able to start the new year with a week-long business trip to London. And thanks to a public holiday in Bavaria I managed to turn this trip into a 10 day stay in what has become one of my favourite cities.

This time I took Paul along so he could go exploring and finally visit one of the cities he had learned so much about in school and while growing up in New Zealand.

Overall it was a really good trip and I enjoyed combining business with the opportunity to be a tourist, because I usually don’t have much time to see the places I travel to for work.

A major annoyance was the fact that my suitcase never arrived (still no sign of it) but after a bit of shopping on Oxford St I had prepared myself for a week of meetings with partners and customers as well as events I planned to attend while here.

Meeting with our customers and partners is always stimulating as we discuss how we can work together even better to make the most of the chosen technologies.

I was able to have a number of excellent discussions with people from Tableau Software and Alteryx and am excited about what’s ahead for ‘The EAT Stack’. I also got to attend VizClub and catch up with many of my Tableau friends (something I seem to be doing every time I am here; I feel like I’m repeating myself 😉).


After kicking off Makeover Monday 2017 on January 2, this week saw Andy and me do a live Makeover Monday at the Data School. There were 10 of us in the room and I really enjoyed vizzing together. It’s something I hope to get many more opportunities to do in the coming weeks and months.

My traveller highlights included our visit to Westminster Abbey, seeing Phantom of the Opera, playing ping-pong at Marble Arch and running along the Thames while taking in the numerous sights on a cold winter morning. Catching up with my Deloitte buddies from many years ago was really fun, too!



So long, London. It’s been fun once again and I look forward to returning in a couple of weeks…


People who viz together…

The year is only 11 days old, but what a blast it has been so far! Since joining Andy Kriebel in running #MakeoverMonday, it’s fair to say I’ve been incredibly busy. But I wouldn’t have it any other way, because seeing everyone’s weekly visualisations has been fun and encouraging. It is great to see the increasing level of engagement from the Tableau community and the enthusiasm of people getting involved.

Join us – and bring a friend

Just yesterday I caught up with Andy Cotgreave who stated a little while ago that he loves seeing people’s tweets that start with “My first #MakeoverMonday submission…”. And I totally get it! Especially after – not that long ago – being reluctant to publish my own work for everyone to see.

It takes a bit of courage to put yourself out there, because you may receive feedback you weren’t quite ready for. On the other hand it is a huge opportunity for everyone to practice their analysis and data visualisation skills while growing their Tableau/professional network and developing a portfolio of visualisations which can come in handy further down the track.

You have probably also seen a number of debates this week around last week’s data and visualisations. I will provide a bit of commentary on that topic in Friday’s summary blog on the Makeover Monday website.

Getting social in the Makeover Monday community

What I want to highlight in this post, however, is the gradual emergence of little micro Tableau communities around #MakeoverMonday. Again, more on this topic will come from us on the Makeover Monday website, but I want to point out a few cool initiatives I have seen popping up and would love to hear from you if you are involved in something similar or would like to get it started wherever you are in the world…

The team around Joshua Tapley are very involved in Makeover Monday and a lot of them participated in all 52 challenges in 2016. Pooja Gandhi even scored a job in the team based on her dataviz portfolio.

On Boxing Day 2016 (yes, on Boxing Day…) 11 keen dataviz enthusiasts met in Philadelphia to do Makeover Monday live with Andy…

Of course who could forget the Makeover Monday live session in Austin at Tableau Conference?

And Andy even got up at an ungodly hour to join the Sydney TUG to talk about Makeover Monday…

Benefits of ‘vizzing together’

Just this week as I am spending a few days in London for work, I got to be part of our first ‘Live Makeover Monday’ of 2017. I headed over to The Data School and we all spent a good hour building our visualisations for week 2.

One great benefit of the session was the opportunity for everyone to ask questions along the way. This helped people avoid making mistakes with the dates (fiscal vs calendar year), identify that 2007 only contains 2 quarters and shouldn’t be used as a baseline for comparison with whole years (4 quarters) and figure out pesky formatting challenges. All while having a few good laughs along the way.

At the end of the session everyone presented their visualisation (note: whenever you do anything with Andy K, be prepared to present to the group afterwards…) and it was great to see the unique approaches everyone took and the resulting styles of visualisations. A quick upload and tweet and another 10 Makeover Monday data visualisations were submitted.


My friend, Matt Francis, hosts weekly Makeover Monday sessions at his work and provides his colleagues with an opportunity to have a ‘Tableau Doctor’ session at the same time.

And just this morning I found out about another great initiative called MM Data Camp where people get together for their Makeover Monday work.

I love hearing these stories and want to encourage you to ‘try this out at home’. Grab your favourite colleagues or data viz friends, find a meeting room, dining table or local pub, bring your laptops, order pizza and have fun with data.

As mentioned above, I will pick this idea up on our official Makeover Monday website, so stay tuned for more details.

Until then, happy vizzing and let us know what you get up to 🙂


Makeover Monday – Week 2, 2017: iPhone Sales between 2007 and 2017

Every week for #MakeoverMonday we look at a different data visualisation and provide the Tableau community with the underlying dataset so they can create their own take on it, improve the original viz, try out new techniques and tell a story with data.

Finding the data for this week’s challenge

Last week we looked at the gender gap for high-income jobs in Australia, and this week it was my turn to find a viz and the data for our weekly challenge. I chose iPhone sales as the topic because exactly 10 years ago today, on the 9th of January 2007, Steve Jobs announced the iPhone.

The iPhone has been immensely popular in the years since it was first released and sales figures always provided much discussion material for the media. The original visualisation appeared in an article questioning whether Apple have lost their edge as sales dropped for the first time ever in 2016 compared to previous years.


What I like about it:

  • it’s a simple bar chart that immediately shows the sales trend in the data
  • with the y-axis starting at zero the bars let me compare sales quite easily over time
  • the colour scheme is simple and appropriate for showing the annual increase in units sold in blue and the first reduction in red. This draws the reader straight to the problem
  • the bars are labelled with the total sales figure for each year, providing additional information
  • I also like the font, nice and clear
  • they list the source of their information
  • overall I appreciate that they really kept it simple compared to other charts I have seen (e.g. here, here and here)

What I don’t like:

  • where is the explanation, footnote or comment that readers would expect for the asterisk next to the 2016 header?
  • I find the title a bit odd. Have iPhone sales peaked? Clearly they didn’t peak in 2016, yet that’s the latest data and where I’m drawn to because of the red bar in the chart. My answer is “they may have peaked in 2015”, but is that what the author is implying? I can only assume they want us to focus on the drop in sales in 2016, so a heading to that effect would be useful. Maybe a simple ‘Have iPhone sales peaked in 2015?’ or, a bit more grammatically correct, ‘Did iPhone sales peak in 2015?’
  • I don’t like the bevel effect on the charts. It looks very ‘Excel 2007’ and isn’t adding anything of value to the visualisation. Additionally, the red bar doesn’t have the bevel effect applied (at least not that I can see). I would prefer them to be consistent. And without the bevel, just flat and simple. Details matter.
  • I think the grid lines don’t add much value either, I would remove them as well as the y-axis because the labels on each bar should suffice
  • while we’re at it, let’s also simplify the background to a single colour: white or grey or black, but without the gradient
  • a minor point, but they could have simply suffixed the labels with an ‘m’ for millions and use the subtitle to refer to the fact that sales are in units. This would save space on the chart and leave more room for the data

What I did as a Makeover:

This week, rather than pursuing my original design idea of a viz ‘on’ an iPhone, I decided to just address my criticism of the original viz and improve what dazeinfo published.

For that purpose I simplified their viz and added tooltips and a couple of annotations for some additional information.

(Click on the images to view the interactive version on Tableau Public)

The final viz
The tooltips provide additional information for the reader

New color palettes in Tableau – viz like an artist

You have probably noticed by now that I am a big fan of colors in data visualisations and rather fond of custom color palettes.

This morning I saw a retweet by Megan Hunt of color palettes from famous paintings and I knew I had to get my hands on those hex codes…

I had a bit of a play and created the XML snippet for the Tableau preferences file so I can use these palettes in my visualisations if the mood strikes.

And so that you can save yourself a bit of time, here it is, in case you’d like to use it as well. Just insert into your preferences file, save, restart Tableau and they should all appear…

Note: I didn’t create all the palettes, because some had a very limited range of colors





<color-palette name="Mona Lisa - Da Vinci" type="regular" >

<color-palette name="The girl with the pearl earring - Vermeer" type="regular" >

<color-palette name="The Rokeby Venus - Velazquez" type="regular" >

<color-palette name="The Scream - Munch" type="regular" >

<color-palette name="The Starry Night - Van Gogh" type="regular" >

<color-palette name="The Great Wave off Kanagawa - Hokusai" type="regular" >

<color-palette name="The Third of May, 1808 in Madrid - Goya" type="regular" >
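For reference, each palette entry in the Preferences.tps file follows the structure below. The hex values in this snippet are placeholders for illustration only, not the colors from any of the paintings; a real entry lists as many color values as the palette contains.

```xml
<?xml version='1.0'?>
<workbook>
  <preferences>
    <!-- one color-palette element per palette; the hex codes below
         are placeholders, not the actual painting colors -->
    <color-palette name="The Starry Night - Van Gogh" type="regular">
      <color>#111111</color>
      <color>#222222</color>
      <color>#333333</color>
    </color-palette>
  </preferences>
</workbook>
```

Save the file as Preferences.tps in your My Tableau Repository folder and restart Tableau for the palettes to show up in the color palette list.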


One of my Tableau clients in Australia, like many similar organisations, also used Alteryx. They loved it so much, they called it ‘All the Tricks’ instead.

I like that name a lot and one of my goals for 2017 is to find out many more of Alteryx’s capabilities, because so far I haven’t even really scratched the surface.

Makeover Monday is admittedly the perfect platform for me to improve my Alteryx skills, because I not only have to find the data for 26 weeks of makeovers, but also sometimes clean it up, reshape it, etc.
And some datasets and stories are so intriguing that they lead me to do further analysis.

So far, my Alteryx workflows have been limited to a bit of reshaping, cleaning up data and bringing together different sources. What I really want to get my head around are the spatial tools so I can enrich datasets with geographical data.

I’ve recently had some help from Philip Riggs who helped me get a shapefile sorted out and turned into a Tableau data extract for further analysis. Thanks Philip!

There is a lot to learn for me when it comes to Alteryx, but I can see huge potential for the tool to not only improve my own data analysis and visualisation projects, but also help me share more fun datasets and some how-to guides with the wider dataviz community.

For now, I have to get back to my current Alteryx workflow, because Viz Club is happening next week and I need to get the data ready…

Tableau: picking a single color from your favorite palette

Some days you want to use lots of color, other days you just want one color. But it has to be a specific one: that special shade of blue you have saved in one of your regular color palettes in Tableau, but whose RGB or hex code you don’t know.


Yes, you could take a screenshot of it, shrink the Tableau Desktop window, position the two next to each other and use the color picker to select the color from your screenshot and use it in your Tableau viz.

Or you can do the easy, ‘lazy’ approach I follow, which all happens inside of Tableau.

Six simple steps:

  1. Create a simple viz, e.g. a bar chart


  2. Double click in the marks card area to create a new calculated field


  3. Enter a simple string expression (e.g. ‘swim’) and hit enter


  4. Drag the newly created field (blue pill) onto color


  5. Select the color palette you’d like to use from the list of available palettes


  6. Pick your favorite color and click OK


When I posted this tip on Twitter, Rody Zakovich kindly pointed out that in order for this color to persist when publishing to Tableau Public, you will have to drag your new field into your dimensions pane and save it.

Thanks Rody for the tip!

(I tend to use this approach when doing some quick ad hoc visualisations which I don’t publish, so I hadn’t come across this issue yet)

I hope you find this useful. Happy #vizzing!