The Big Guess Up Front (BGUF)


As a Stakeholder or Product Owner
I want to understand where the traditional planning documents fit in the AgileBI process
So that I know when I need to have them created

The way we used to deliver Business Intelligence projects was to do what I call the “big guess up front” (BGUF).

Documents for Africa

We would create massive Business Intelligence Strategy documents, which took months to create and get “signed off” at a senior level in the organisation, often by the Board. This strategy document then became set in stone, and it could never change, even if every assumption it was based upon was found to be invalid or had changed.

We would define a project plan early with a standard list of tasks and dependencies based on other projects we had estimated, and then guess the effort required to deliver these tasks and the milestone dates we might be able to achieve. These got signed off as the official plan before we ever gathered the details we needed to see if these estimates were reasonable or achievable.

We used to spend months acquiring and documenting requirements, which became a massive wishlist of every piece of data and every BI tool feature and function the users could think of, as it was their one chance to get this capability delivered by a project.

We would spend months standing at whiteboards, or gathered around screens with Microsoft Visio, defining a single enterprise data warehouse model to rule them all. Or worse, we would leave it to a single data architect to huddle over their screen alone for weeks, weaving their modelling magic.

The testers would build test plans in infinite detail, with spreadsheets of the hundreds if not thousands of detailed tests they would run to prove that the thing we would build matched the things we wrote down.

Enter the Change Management Process

And of course, this was all based on the assumption that nothing would change, or that if it did change, a “change request” would be raised and the “change process” would ensure all the documents and all the people would change in fluid unison. It never worked.

So at some stage, problems were escalated and a “Change Manager” would be added to the team. The Change Manager was not there to help the users transition through the changes that were required to successfully adopt this new capability. This unfortunate person was there to “manage” the changes within the project, and their role was akin to either a plumber or a police officer managing traffic at a roundabout.

If, somehow, the Change Manager’s role was deemed to involve fixing issues when they were raised, I liken their role to that of a plumber. Anybody who had an issue would handball it to the Change Manager and expect them to resolve it. If there was something messy to deal with, that you didn’t want to deal with yourself, then you called in the ‘plumber’. Of course, the Change Manager was normally just a team of one, with minimal authority and budget to make things happen. Eventually, the Change Manager would become swamped with the raft of issues they now had to deal with, causing a massive blockage.

If the Change Manager’s role was defined as being the channel that managed any change issues but wasn’t responsible for actually resolving them, then I liken the role to that of a police officer directing traffic at a roundabout. If you had an issue you couldn’t (or wouldn’t) resolve, you put it into the roundabout, and the Change Manager made sure it got off the roundabout at the exit of the person or team who could resolve it. In this scenario you often also made sure that you blocked your exit lane on the roundabout as effectively as possible, to ensure the Change Manager couldn’t offload any new issues to you or your team. You did everything you could to stop these problems being allocated to your team, because your team was flat out trying to deliver the scope in the estimated timeframe. Eventually, the roundabout would become clogged, and there would be a moratorium on any new issues. As issues still eventuated even while a moratorium existed, there was often a concept of a “parking lot” where they were placed, so they were supposedly visible but with no chance of ever being cleared.

Of course, all this change was bound up with documents, templates and processes that were meant to streamline the management of change, but in fact often made it too onerous to raise any change issues at all. At some stage, a “Change Administrator” would be added to the team to manage the process, and to track and report the current status of each change.

In some organisations, I have seen projects terminated when they ran out of time and money, based on the initial project plan BGUF, regardless of whether the core tasks were completed or the outputs had successfully passed full testing.

And the project was called a success.

The project closure left a massive amount of work for the Business as Usual (BAU) team to complete as soon as the environment went live. These BAU teams were not given additional funding or resources to undertake this work; in fact, there was often an expectation that the team headcount would be reduced, based on the efficiencies this new system promised to deliver.

These BAU teams then had to struggle to manage the raft of calls they got from the introduction of a new system, as well as find the time to finish the tasks left over from the now closed project, all the while maintaining their previous day jobs.

It’s not a Team Sport

One of the major outcomes of these big guesses up front is a lack of accountability for the team that receives them. They are handed documents and designs that they didn’t create and told to implement them. The context, discussions and insights that went into these artefacts were lost as soon as they were written down and handed over, making the team’s task difficult if not impossible.

Also, the change management process removes their accountability for resolving any issues that they identify. Any issues that are raised are typically not in their sphere of control to fix.

And this is why it fails: it’s a bunch of individuals or teams working in isolation.

I have encountered project plans that were created by Project Managers who had never worked on a business intelligence or data warehousing project before.

They would use a previous plan as the basis for the new development, with no multiplier for the difference in complexity between the two projects. For example, in one organisation the Project Manager scoped out the estimate to add a new source system to the current data warehouse.

Unfortunately, the new source system had some large text columns that needed to be extensively parsed to meet the business requirements; the previous source system, on which the estimates were based, did not. The development team was not involved in the estimates, and the budget was locked in based on the initial plan.

The team was set up to fail.

It’s not the People, it’s the Approach

The problem is not that the Change Manager is incapable, that Project Managers don’t try to scope properly, or that the delivery teams don’t want to be accountable for success. The problem is that we are doing big guesses up front. And this causes some issues:

  • We create strategies, documents, designs, processes and roles that cannot easily be changed when required
  • The documents are based on a large number of assumptions (guesses) as there is a raft of things we don’t know yet
  • We spend far too much time creating these when we could be using that time and resource to deliver value upfront
  • We communicate via documents not ongoing conversations
  • We restrict the teams from delivering small chunks of the entire outcome
  • We start with a culture that change is bad and should be avoided at all cost
  • We somehow believe that creating these big guesses up front reduces risk during the delivery phase, when in fact it increases it.

AgileBI is an approach that is all about providing value continuously, collaborating on the right thing at the right time, and ensuring we can manage change at any time. Big guesses up front make this very, very difficult.
