Solvency 2 – Lessons Learnt (Part 1)

I haven’t posted in a while, partly from a lack of inspiration but mostly because I’ve been eyeball-deep in Solvency 2 data-related activity. However, I thought it would be good to try and get back into the habit of posting, as I find it helps my own thinking as well as being a useful vehicle to garner thoughts and comments from others. We’ll see how I get on.

There has been a lot of European debate around Solvency 2 and the timetables for implementing the directive. I don’t want to go into all the ins and outs, but the simple fact is that it feels as though momentum has been lost somewhat, especially given where we now stand in relation to the original implementation timescales. A lot of firms seem to be taking stock of where they find themselves and are using the time to look back at where they’ve come from as well as reassessing where they are heading. I thought I’d do something similar and take a reflective look at some of the lessons (from a data management perspective) we can already take from the Solvency 2 journey that can be applied to any project. Part one looks at the ever-important business case.

Solvency 2 FSA Data Audit for IMAP

I haven’t written a post about Solvency 2 (or Solvency II if you prefer) for a while and, given it is still my primary focus with my current client, I thought it was about time I gave a quick update.

For those of you who aren’t aware, most UK-based insurance firms (especially those applying to use their own internal models) will currently be concentrating on the FSA Solvency 2 ‘Data Audit’, and whilst they will all be at various stages in the process, they will be focusing on similar pieces of work and experiencing similar challenges. The ‘Data Audit’ is effectively a review requested by the FSA in which each firm is expected to undertake an independent audit (internal, external or a mixture of both) of its data management practices. The findings of this audit will subsequently form part of the FSA’s Internal Model Approval Process (IMAP) and help the FSA in its assessment of whether a firm is compliant with the standards for data as set out in the Solvency 2 directive. The scope of the review has been defined as all data (both internal and external) that could materially impact the Internal Model.

DIY Data Management

I’ve just been reading Phil Simon’s Data Roundtable post, ‘Excel, Office 15 and Big Data’, in which he ponders a few questions regarding Microsoft’s updating of their most popular software offerings. Towards the end of the post Phil asks a question that sent shivers down my spine as I, like him, already knew the answer – “Is there a CIO right now who won’t invest in a sufficiently powerful application because he or she thinks that Excel will be able to handle [the management of ‘Big Data’]?”

Unfortunately, the answer is yes. The bigger problem, though, is that it’s not just CIOs who think like this. Although they may hold the purse strings when it comes to application investment, they can be swayed as they look to their trusted advisors for guidance. But if the general opinion is that Excel or some other EUC solution is ‘probably good enough’, then someone trying to push through a more strategic and robust solution could be facing an impossible task.

People are becoming more technologically savvy and, as a result, a new widespread confidence in data-related tasks can be seen. In some ways this is a good thing, as a better general awareness of data-related issues can aid in the implementation of governance and quality initiatives. However, it can also mean that business areas are much more likely to bypass traditional IT development processes in order to implement their own EUC solutions, especially if they have been previously burned by IT. A new breed of DIY data solutions rises up in silos across an organisation, thanks in part to the ease with which tools like Excel make basic data management accessible to the masses.

What this all means is that a strong and well-implemented Data Governance programme is needed more than ever to ensure that appropriate controls, processes, people and technology are in place. EUC solutions most definitely have their place, but they must be used and controlled appropriately and be covered by the same governance as any other technology within the enterprise, or data management issues can proliferate and escalate rapidly.

Destination Unknown

There was an excellent discussion over at Henrik Liliendahl Sørensen’s (@hlsdk) blog recently, where a debate opened up over whether the concept of data quality is best likened to a journey or a destination. The more commonly recognised metaphor is that data quality is indeed a journey; however, John Owens (@JohnIMM) argued that data quality is actually a destination. His point was that considering it an endless journey would be akin to giving “DQ practitioners, and their grandchildren, a job for life”. He stressed that “Quality data should be created as an integral part of doing business day-to-day”.

Aiming for Minimum

Working within the financial sector for the last eight years has meant that I have been involved in a number of projects dealing with regulatory imperatives. Something that has always infuriated me is the desire of many organisations to deliver only the minimum necessary to achieve compliance. Don’t get me wrong: I understand the dangers associated with aspirational plans and the over-engineering of ‘gold-plated’ IT developments, but aiming for the minimum can be even more dangerous.