I attended the Analytics Institute’s National Analytics Conference last week in the Mansion House in Dublin. It was a great event, very well attended and with an array of excellent speakers, and I give my congratulations and thanks to the brilliant Lorcan Malone and the other organisers and sponsors. On every seat at the event was a glossy copy of the National Analytics Maturity Study 2018 that the Analytics Institute had conducted with EY and UCD into the state of analytics in Ireland.

As with all surveys, there is room for interpretation but having read through the results I thought I would give my tuppence worth. Analytics is a very broad church, so I will call out my own background and bias in advance – I have been working as a consultant for 14 years, designing and implementing enterprise data warehouse and reporting solutions. Most of my experience has thus been in enabling and empowering data analysts (descriptive, explorative and predictive analysis) and data scientists (prescriptive analysis), rather than actually doing analysis or using outputs myself.

The survey was conducted with the top 100 member organisations of the Analytics Institute, so we might assume that the responses will be skewed somewhat in favour of analytics compared to the general population. The responses were arranged into sections: Framework, Metadata, Governance, Culture, Technology and People.

Section 1 – Framework

The questions in this section were based on a Business Analytics Capability Framework put together by Australian academics Cosic, Shanks & Maynard in 2012. The idea was to give the survey a firm basis by using a standard methodology with international standing, and also so that the results for Ireland could be compared with results from similar surveys elsewhere.

The first set of results was about an organisation’s confidence in its analytic capability, with separate scores for governance, value, people, technology and culture. The scores ranged between 3.2 and 3.7, which on a 5-point scale where 3 means ‘neither agree nor disagree’ is not particularly telling. The responses showed that people don’t really know if they are capable or not – they think they are probably doing ok, but aren’t too sure.

On to pain points, people’s biggest gripes were that they didn’t have enough money to spend (60%) or enough influence in the organisation (58%), and that not all of the SLT agreed with their point of view (28%) or supported their efforts (23%). I’d imagine every person in every department of every organisation feels these things to some degree, so again this wasn’t a big revelation. In fact, the relatively low figures here suggest that analytics is getting good support from SLTs in most cases.

On the recruitment front, it was no surprise that 4 out of 5 organisations struggled to find candidates with the right skills, and that when they did, they struggled to hire or keep them due to high salary demands (73%), the limited accommodation available for staff, and the high level of competition for recruits.

The Role of Analytics

On the role of analytics in organisational strategy, nearly 20% said it was central and nearly 40% said it was a priority. A regular complaint amongst data analysts is that executives rely too much on intuition rather than data to make decisions, and the survey reflected this, with only 60% of respondents agreeing that senior execs use data for decisions, and only 20% of those using it heavily. This means that 40% of Ireland’s Top 100 analytics organisations are not habitually using data in decision making.

40% of senior execs in Ireland’s top 100 analytics organisations don’t make regular use of data analysis for decision making.

On the operational front things are more positive, with 80% of respondents using data to change how they sell or deliver products and services, 75% using it to change how they target customers or think about customers’ needs, and nearly 60% using it to change revenue or cost structures. Overall, 78% believe analytics has an impact on competitiveness, which is great but leaves room for improvement in the 22% who aren’t yet convinced by the hype.

Of course, it may be the case that the 22% operate in business areas where analysis of available data is difficult to turn into a business advantage. In my own firm, which provides specialist professional services, we analyse performance data to ensure projects stay on track, minimise cost overruns, assess employee performance, forecast revenue, plan future assignments and inform future estimates. All of these contribute to improved efficiency, but it is not so easy to use the data to generate new sales opportunities or improve the success rate of tenders etc., which would contribute directly to growth.

Business Outcomes

While a lot of the focus in analytics media coverage is on predictive capabilities, the best and most popular uses of data for organisations are still to answer the ‘what happened?’, ‘what is happening?’ and ‘why did it happen?’ questions that DS, MI and BI systems have been trying to answer for the past 35 years. This is shown in the ranking of ‘targeted business outcomes’ of analytics. ‘Gain insights/react more quickly to changes’ was listed in first place, with ‘accelerate decision making’ in fourth.

Sandwiched in between these in second and third place are the vague responses ‘increase customer satisfaction’ and ‘increase revenue’. It would be more useful to have variants on these questions, e.g. ‘increase customer satisfaction by identifying pain points in customer interactions’ (explorative analysis), or ‘increase satisfaction by profiling customers and offering them individualised interactions’ (prescriptive analysis). My feeling is that most organisations are doing a lot more of the former than the latter.

Success Factors

For what makes a successful analytics project, the number one factor is a clearly defined business problem. Arguably the same response, phrased a little more specifically, was also number two: a clear understanding of where and how analytics can add business value. The third and fourth factors were more interesting and equally similar, both focusing on the quality of the people involved in the project.

If you give very good people very clear direction, you are more likely to get good results. The secret of success is not rocket science, even if it does involve rocket scientists!

Projects Delivering Value

Page 20 of the report listed 19 examples of projects which created significant value for respondents. Interestingly:

  • 13 leveraged traditional enterprise data warehouse, data integration and business intelligence to deliver value in the form of better management reporting, process understanding, visualisations and so on.
  • 1 project involved the migration of data to the cloud.
  • 5 used analytics to profile customers, develop products, and detect fraud.

This reflects much of what I see in the market. Despite a lot of excellent effort, most firms are still in the early or middle stages of the BI maturity lifecycle. Organisations still struggle to get all of their data cleanly into one place where they can reliably use it, and as a consequence, most are making only limited forays into more advanced analytics for specific business cases.

Return on Investment

Only 11% of respondents felt that analytics had improved their organisation’s profit margins by 10% or more, with nearly 60% saying it had little (1-2%) or no effect. This reflects a long-standing industry-wide difficulty in proving ROI and justifying spend on analytics. It sometimes requires mental gymnastics to continue to press the case for analytics – i.e. analytics people believe data-driven decisions are better than intuitive ones, but if we are honest we must admit that often we have very little hard data to support this argument in specific situations, and so this belief itself is often largely intuitive. Hmmm.

Sections 2 and 3 – Metadata and Governance

The general data in this section about employee numbers and budgets were too high-level to be really interesting. Without knowing the sector or size of the companies involved, it means little in aggregate to say that 25% of firms spend between €100k and €500k on analytics. Do they have one SQL Server licence on a quad core and one DBA to run it? Do they have 10 analysts on €50k each? Do they hire consultants for 3 or 4 months at a €1,000 daily rate?

Likewise the fact that 16% of firms employ more than 100 analysts globally – I would like to know what kind of work these people are doing, and what percentage of the overall company workforce they represent. A firm with 500 employees that employs 100 analysts to generate new products and personalise customer experiences is a completely different prospect from a firm with 100,000 staff that employs 100 people to generate Excel reports manually because it has never got around to establishing a self-service BI system.

Trends in Analytics

The next question asked respondents to rate analytics trends on their criticality to the business on a 4-point scale: 1 = not important, 2 = modest operational impact, 3 = significant opportunities, 4 = requiring a fundamental rethink of business strategy.

The really interesting thing here is that of the 12 trends evaluated, none cracked the ceiling to reach level 3. The highest were GDPR at 2.9, the veracity of data at 2.8, and larger data volumes and real-time data at 2.7. Robotics and IoT both scored a ‘meh’ with 2.0, while Blockchain scored a ‘whatever’ with a 1.5.

The hype machine is in full effect on Blockchain and IoT, but in reality most firms are still trying to implement good governance and wrestling with fundamental difficulties like data integration and quality.

This is reflected in the ‘governance’ section of the survey, where only 60% of respondents reported having a good level of trust in their data. A similar percentage claimed they had consistent data definitions and standards. I wonder whether the same 60% answered positively to both questions? After all, what is the likelihood that end users in firms without consistent standards will have high levels of trust in the data? You have to get the basics right or nothing else you do afterward will save you.

Section 4 – Culture

Moving on to culture, most respondents felt that Senior Leadership were encouraging staff to use data in decision making and that people were held accountable for the outcomes of such decisions, which should, of course, drive attention to detail and higher quality.

About a third of respondents said they had access to both good analytic datasets and the resources to use them, while another third said they had the data but not the resources. This means that two-thirds of respondents lack the necessary data, the resources to use it, or both.

Lack of data is a particular problem in the local context. The Irish population and thus market are small, and the size and diversity of data sets and the resources available to indigenous organisations reflect this. The reality is that unless you are working in a very large local firm or for a multi-national, your opportunities as an analyst to do the really interesting stuff will probably be limited.

It has been said that, aside from a few exceptions, Ireland’s whole market is scaled for Departmental rather than Enterprise analytics.

The biggest challenges to consumption/adoption of analytics were the lack of the necessary skills in the people occupying the relevant roles (42%), and the failure to integrate analytics into business processes (34%). This is a common observation – we frequently see really good analytics happening, but happening in isolated silos, or happening in one-off non-repeatable exercises. A problem of similar magnitude, repeating previous responses, is a lack of faith in the data quality (31%).

Section 5 – Technology

I don’t want to be critical of an otherwise excellent report, but some of the questions in this section seemed poorly constructed and thus the answers lacked meaning. For example, in the question ‘do you use a mix of big data and traditional approaches to analytics’, is ‘big data’ supposed to mean a distributed file system like Hadoop, and ‘traditional’ to mean relational or multi-dimensional databases? Or does ‘big data’ mean predictive/learning techniques while ‘traditional’ means descriptive/visualisation techniques? I don’t know and they don’t say, and I imagine the people taking the survey had the same problem. Hence the results mean little.

The next question was ‘does your organisation use parallel and/or distributed and/or cloud-based data services?’ These are three very different sets of technologies, and respondents might answer yes if they use any of Teradata, DataStage Parallel, Hadoop, Azure, AWS, Informatica iPaaS etc. Again, this is too vague to provide insight.

The next question suffered from the same problem: ‘has your organisation explored or adopted open source software?’ Explored is very different from adopted. I would think most firms have looked at MySQL or similar at some point, but they might have backed away for the usual reasons: lack of functionality, infrequency of upgrades, the absence of warranty etc. Exploration and adoption probably shouldn’t be combined in a single question.


The questions on self-service BI were clearer, with 70% of respondents affirming that their organisation had this capability and that they were adept at using data visualisation to illuminate a business issue. That still means that 30% of Ireland’s Top 100 analytics organisations don’t have self-service BI. The concept of self-service BI has been around since the early 90s and simple self-service tools are common and cheap, so this statistic stood out.

30% of Ireland’s Top firms don’t have self-service BI

Given this basic failing, it was interesting to find that 73% of respondents were doing predictive analytics (what will happen?), only slightly fewer than those who are doing descriptive analytics (what happened? – 76%) and slightly more than those who are implementing explanatory analytics (why did it happen? – 72%).

Predictive analytics can be sophisticated, using statistical regressions and other advanced techniques, or it can be little more than forecasting by applying trends from previous years’ figures to year-to-date figures. Work like this can be done in Excel by a reasonably skilled user without dedicated data analytics skills or software, so I would be wary of assuming this 73% are all data miners.
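To illustrate how thin the simpler end of ‘predictive’ can be, here is a minimal sketch of the trend-based forecast described above. The function name and figures are illustrative inventions, not from the survey:

```python
def naive_forecast(prior_full_year, prior_ytd, current_ytd):
    """Project this year's full-year figure by applying the
    year-on-year growth observed to date to last year's total.
    This is the Excel-style 'forecast' many firms call predictive."""
    growth = current_ytd / prior_ytd          # e.g. 550 / 500 = 1.1
    return prior_full_year * growth           # scale last year's total

# Hypothetical example: sales of 1,200 last year, 500 by June last year,
# 550 by June this year -> projected 1,320 for this full year.
print(naive_forecast(1200, 500, 550))
```

Anyone who can write a ratio in a spreadsheet can produce this, which is why headline ‘we do predictive analytics’ percentages deserve a sceptical read.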

27% of organisations say they are doing prescriptive analytics, which almost certainly requires machine learning skills and software. I don’t know if the figures cited in this result are relatively low or high, and would be interested to see the corresponding figures from similar surveys conducted in other countries. Perhaps some international comparisons might be included to enrich the context of next year’s results.

About 40% of respondents believe they have good integration between analytics systems and operational systems, which is pretty impressive I think, given the significant challenges involved in linking systems together.

70% of organisations have a data warehouse, which makes me wonder if this is the same 70% cohort that has enabled self-service BI, is doing descriptive, explanatory and predictive analytics, and has good faith in its data quality. (There is a Masters dissertation in there somewhere…)

Section 6 – People

The final section in the survey was about people, with 52% saying there was a shortage of skilled professionals, but 80% saying they had high confidence in their organisational capabilities anyway. This reflects a view that is commonly heard – that analytics is a busy/stressful job, not because of the nature of the work but because of the volume. The shortage of people means long hours are endemic. Analysts are constantly under pressure to deliver and don’t have enough time to properly reflect, discuss, develop, plan etc.

Senior management skills were less respected, with only 40% of respondents reporting that their management has the necessary skills to use analytics outputs correctly. A similar number of firms were working to provide appropriate training to non-analytic staff. This latter trend was interesting in the context of the wider conference, where several speakers described initiatives within their firms to train non-analytic staff, primarily as a response to not being able to find suitable candidates on the market.

It seems that there will continue to be strong demand for freelance contractors and consultancy staff in this area for the foreseeable future.


The survey was useful and interesting in its own right, but it will really come into its own once it has been running for a few years, when more data points will show how trends are shifting, both in Ireland and elsewhere. Some of the questions could do with being more specific and might be reconsidered in future editions.

What is very evident is that organisations need to continue to tread carefully as they progress with analytics initiatives. While a lot of the talk is about Machine Learning and AI, it seems that only 27% of Ireland’s Top 100 analytics organisations are actually doing it – i.e. 73% of firms are still at the earlier stage of figuring out what has happened and why and what is likely to happen in the future. Blockchain and IoT are likewise well down people’s to-do lists.

30% of organisations don’t yet have a data warehouse or self-service reporting. These are 1980s and 90s concepts, considered by many to be so run-of-the-mill that they are not worth talking about at analytics conferences. Can it be true that so many top firms are still stuck in the 1970s? Or are people just rushing straight past ‘go’ and not bothering with such boring old-fashioned concepts anymore?

If one message is clear in the data, it is that a well-governed and reliable data warehouse and consistent organisational data standards provide clear business value in their own right, in addition to providing a good foundation for more advanced initiatives. Ignoring these fundamentals is difficult to get away with in the long run. Of course, additional freedom needs to be given to the right people to experiment, outside of the governed constraints of a ‘system of record’, but that is what parallel ‘systems of innovation’ are for.

Finally, analytics can deliver and is delivering real business improvements and creating new business opportunities but analysts and their managers need to make more effort to capture the monetary value of these and ensure they get the credit they deserve.

Questions on Data Warehousing, Data Integration, Data Quality, Business Intelligence, Data Management or Data Governance? Click Here to begin a conversation.

John Thompson is a Managing Partner with Client Solutions Business Intelligence and Analytics Division. His primary focus for the past 15 years has been the effective design, management and optimal utilisation of large analytic data systems.
