50% of Your IT is Below Average – Is Digital Transformation Even Possible?
There is a lot of interest in digital transformation – I doubt there is a Fortune 2000 company that isn't at least investigating it. Digital transformation is complex: it requires companies to take a hard look at their business models and business processes and decide how they will address the changes it demands.
Digitalization dictates changes to core business processes, not just the front end – making it hard and risky to implement. It also dictates an unprecedented reliance on IT. Given the traditional failure rate of IT projects (both McKinsey and the Standish Group have documented IT project failure rates in detail), the outlook for digital transformation is glum – the historical statistics certainly don't bode well for companies embarking on a digital transformation journey.
Many IT departments have started looking at various types of agile development and devops delivery methodologies as a way to prepare for digitalization. One issue is that most of these efforts focus on customer-facing apps – and tend to ignore the business process aspects of digital transformation.
Even more worrisome for large organizations is that both agile and devops tend to rely on a basic underpinning of high-performance, highly skilled, small and relatively autonomous teams (aka ninjas) – making them hard to scale.
Automation can help bridge the gap – but we think "highly skilled" is still an issue. No matter how you look at it – especially if you are a large, project-oriented organization – at least 50% of your IT department's development and delivery teams are BELOW AVERAGE (strictly speaking, below the median – but the point stands). It is a mathematical fact, yet one that is very hard for IT to accept.
This bias towards "illusory superiority" isn't unique to IT; it is a general human tendency and a well-documented psychological phenomenon – sometimes called the Lake Wobegon effect – and is closely related to the Dunning-Kruger effect, in which the least skilled overestimate their ability the most. Just have a look at this classic graph of students' self-assessment of their logical reasoning ability:
There are a number of issues with blindly adopting and scaling agile and devops (no matter which scaling methodology you choose) when your IT department is at best average. That doesn't mean agile and devops aren't an option for large organizations – they most certainly are. But you can't assume every team is a group of skilled "ninjas"; most are groups of average performers. That means IT must implement strong governance, solid processes and appropriate metrics to ensure agile and devops are done right from a business perspective and actually deliver business value.
We believe there is a recipe for large organizations to adopt agile and devops. Here is our breakdown, from a technical perspective:
- Create a “lifelike” virtual staging environment as the continuous delivery target system
- Embrace automated testing, but understand its limitations – that will be QA job 1
- Translate requirements into user and business acceptance tests, first as manual scripts, which over time evolve into automated scripts – that is QA job 2, and will require QA closer to the business.
- Formalize acceptance through a domain-specific language (DSL) common to IT and the business, with the DSL itself managed by Dev. The tests are owned and managed by business analysts and quality analysts (BQA?)
- Build user acceptance into the Agile process (in theory, with Scrum it should already be there – in practice it almost never is) by making user testing the primary goal of Sprints and subordinating all other selection criteria to it. We mean actual user testing – within the virtual staging environment from the first step – not demos, not Product Owner testing, not QA testing, and certainly not Dev testing for themselves
- Create and measure user satisfaction KPIs (in staging and production) and business value KPIs (in production)
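To make the DSL idea concrete, here is a minimal sketch of what an acceptance-test vocabulary shared by IT and the business might look like, in plain Python. Everything here – the `Order` domain object, the given/when/then helper names, the scenario – is a hypothetical illustration of the pattern, not part of any real framework or of the recipe above.

```python
# Hedged sketch of a shared acceptance-test DSL in plain Python.
# All names below (Order, given_a_new_order, ...) are invented for
# illustration; a real team would derive them from its own domain.

class Order:
    """Toy domain object standing in for a real business entity."""
    def __init__(self):
        self.items = []
        self.status = "draft"

    def add_item(self, sku, qty):
        self.items.append((sku, qty))

    def submit(self):
        if not self.items:
            raise ValueError("cannot submit an empty order")
        self.status = "submitted"

# Given/When/Then helpers: readable by analysts, executable by IT.
def given_a_new_order():
    return Order()

def when_the_user_adds_an_item(order, sku, qty):
    order.add_item(sku, qty)
    return order

def then_the_order_can_be_submitted(order):
    order.submit()
    assert order.status == "submitted"

# An acceptance scenario written almost entirely in business vocabulary:
order = given_a_new_order()
order = when_the_user_adds_an_item(order, sku="A-100", qty=2)
then_the_order_can_be_submitted(order)
print("scenario passed")
```

Because the vocabulary is owned by Dev while the scenarios are composed from it by business and quality analysts, the same scripts can start life as manual checklists and later run unattended in the pipeline.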
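The last point – measuring user satisfaction in both staging and production – can be as simple as tracking one number consistently in both environments. The sketch below computes an illustrative CSAT-style score; the 1–5 scale, the threshold of 4, and the sample scores are all assumptions made for the example, not prescribed values.

```python
# Hedged sketch: one user-satisfaction KPI measured identically in
# staging and production. Scale, threshold and data are assumptions.

def csat(scores, satisfied_threshold=4):
    """Share of responses at or above the threshold on a 1-5 scale."""
    if not scores:
        return 0.0
    satisfied = sum(1 for s in scores if s >= satisfied_threshold)
    return 100.0 * satisfied / len(scores)

staging_scores = [5, 4, 3, 5, 2, 4]       # user tests in the virtual environment
production_scores = [4, 4, 5, 3, 5, 5, 2]  # post-release survey responses

print(f"staging CSAT:    {csat(staging_scores):.1f}%")
print(f"production CSAT: {csat(production_scores):.1f}%")
```

The point of computing the same KPI in staging first is that a gap between the two numbers becomes visible before release, not after.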