Shane writes an AgileBI series called “3 AgileBI Things” published on LinkedIn Pulse. The article below is a copy of “3 AgileBI Things – 2017-03-19”. Shane also writes on AgileBI concepts at AgileBI Guru.

1. Big Model Upfront

When delivering an AgileBI project, one of the areas that always makes some people uncomfortable is the Agile modelling process we use.

The combination of BEAM and Data Vault has meant we can go from a business-process-led workshop to a delivered set of Data Vault ensembles, populated with data, in less than a week*.

And this is no “rapidmart” that will have to be refactored later to fit an enterprise model or to allow conformed entities; it is a Data Vault model that can be iteratively extended.

* This 1-week delivery doesn’t include the delivery of complex business rules, but we are working on that 😉
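To make “ensemble” a little more concrete, here is a minimal, purely illustrative sketch of one (the table and column names are my own invention for this example, not our actual model): a hub holding the business key, a satellite holding its descriptive history, and a link relating it to another hub.

-- Hypothetical Customer ensemble: hub + satellite + link (names are illustrative only)
CREATE TABLE hub_customer (
    customer_hk      CHAR(32)     NOT NULL PRIMARY KEY,  -- hash of the business key
    customer_number  VARCHAR(20)  NOT NULL,              -- the business key itself
    load_date        TIMESTAMP    NOT NULL,
    record_source    VARCHAR(50)  NOT NULL
);

CREATE TABLE sat_customer_details (
    customer_hk      CHAR(32)     NOT NULL REFERENCES hub_customer (customer_hk),
    load_date        TIMESTAMP    NOT NULL,
    customer_name    VARCHAR(100),
    customer_city    VARCHAR(100),
    record_source    VARCHAR(50)  NOT NULL,
    PRIMARY KEY (customer_hk, load_date)                 -- history is kept, never updated in place
);

CREATE TABLE link_customer_order (
    customer_order_hk CHAR(32)    NOT NULL PRIMARY KEY,  -- hash of the two related keys
    customer_hk       CHAR(32)    NOT NULL REFERENCES hub_customer (customer_hk),
    order_hk          CHAR(32)    NOT NULL,              -- references hub_order in the full model
    load_date         TIMESTAMP   NOT NULL,
    record_source     VARCHAR(50) NOT NULL
);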

So this process means the enterprise data architects of old get a little upset that we aren’t spending weeks (if not months) in a room with a whiteboard defining the most beautiful enterprise data model you have ever seen. Of course, in my experience, these models never survive the first engagement with real data.
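A Data Vault, by contrast, absorbs that first engagement with real data additively. Continuing the same hypothetical sketch from above, when a second source turns up with attributes nobody modelled, they simply land in a new satellite beside the existing one; the hub, the existing satellite and the loads that feed them stay untouched.

-- Hypothetical extension: a new source system arrives, so we add a new satellite.
-- hub_customer and sat_customer_details are not refactored.
CREATE TABLE sat_customer_crm_details (
    customer_hk      CHAR(32)    NOT NULL REFERENCES hub_customer (customer_hk),
    load_date        TIMESTAMP   NOT NULL,
    loyalty_tier     VARCHAR(20),
    marketing_opt_in BOOLEAN,
    record_source    VARCHAR(50) NOT NULL,
    PRIMARY KEY (customer_hk, load_date)
);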

Scott Ambler has written an excellent article on Big Modelling Up Front (BMUF).

Big Modeling Up Front (BMUF) Anti-Pattern

2. Data Warehouse Project Failures

The standing joke is that 50% of data warehouse projects fail. It’s not really a joke: if you went to McDonald’s and half of the time the Big Mac you paid for didn’t actually turn up on your tray, you wouldn’t be laughing.

Ask me why they fail and I’ll give you a raft of reasons, all reasons we are trying to fix with our AgileBI approach.

Here is a great article from Tim Mitchell on the reasons he sees them fail.

Why Data Warehouse Projects Fail

3. Guestimates

And that is why I am going down the BMUF and DW failure path: let me get started on “estimates”. The best way to get an accurate estimate is to build the thing and then see how long it took. Or size your team, figure out how long they will work on it, and then you have a perfect estimate of effort.

Now, we all know that isn’t really realistic, as business owners want more surety. But I have lost count of the number of times I have been asked to give an “estimate” with minimal detail to input (“we need to load 3 source systems but have no idea how many tables or how complex they are”, anyone?) and then somehow been held to account when the estimate wasn’t accurate.

These days I just tend to use the word “guestimate” instead and spend the minimal amount of time guessing the number; it’s a value-offset approach, I guess.

Tom Breur has written an interesting article on the subject. The key to this article is not so much the words Tom has written as the excellent links embedded all the way through it. Make sure you go clicking!

Group processes in software estimation

About Shane 

When the team have time, I nag them to add cool new features to ODE, our open source Data Vault automation engine at ode.ninja.

Most of my time is actually spent teaching and coaching customers how to deliver using AgileBI as part of the Optimal Orange team.

We run regular Agile courses with a business intelligence slant in both Wellington and Auckland; you can learn more about these here.