I was lucky enough to work with Dr Gerald J Wong earlier this year. He is the Data Strategy and Governance Lead at the UK Hydrographic Office (UKHO), a world-leading centre for hydrography and an executive agency of the Ministry of Defence (MoD). The UKHO specialises in marine geospatial data that helps others to unlock a deeper understanding of the world’s oceans. This data is shared with governments, defence users and academia, as well as being made available through the UKHO’s portfolio of ADMIRALTY Maritime Data Solutions.
Originally specialising in Nuclear Physics and Optical Engineering whilst in academia, Gerald joined BAE Systems Avionics (now Leonardo SpA) to invent and patent sensor technologies. After diversifying with an MBA from the Edinburgh Business School, Gerald moved to the International Defence arm of the UK Meteorological Office, where he supported weather impact predictions for remote sensors and Big Data issues for UKMO partners such as NATO SHAPE (Supreme HQ Allied Powers Europe) and several national Air Forces throughout Western Europe. Following five years at the Met Office, Gerald transferred to the UKHO to support its transformation from paper charting towards modern on-demand digital services, including Marine Spatial Data Infrastructure (MSDI).
How long have you been working in Data Governance?
I have formally been evolving Data Governance (DG) at the UKHO for over two years. Prior to that, at the Met Office, I handled ‘Big Data’ and associated Governance issues for five years. As a result, I have been operating within the DG space for far longer than my present role title suggests, and I suspect that many readers would also be able to credit themselves with much more DG experience than traditional or conventional “job histories” usually imply.
Some people view Data Governance as an unusual career choice. Would you mind sharing how you got into this area of work?
My journey into Data Governance was a gradual evolution. I started as an end-user of “simple data” during my early Physics and Engineering roles in a closed-loop environment (experiments with clear start, end and/or reset points). This evolved into working with “richer data” as a crucial input to decision-making analyses around weather impacts for sophisticated but well-defined static scenarios, which began to include the need for Data Governance. The final step was moving up to formal Data Governance within a dynamic ecosystem of complex real-world dependencies and feedback loops: the oceans, and the human activity above and below the waves that depends upon the physical environment yet can also affect it, leading to future changes in human decision-making, and so forth.
This natural evolution tracked my career development from roles with constrained remits – laboratory experiments – to roles that included increasing needs to consider human (mis)behaviour around data and technology, which also includes how to practically integrate data and information to support real-world, socio-economic decision-making.
This evolution closely mirrors the typical hierarchies of corporations and institutions: from the end-user tactical level of ‘how’ to do something with data, through the middle-management operational level of ‘what’ to do with data, to the thought-leadership level of ‘why’ to adopt a certain business strategy for data in the first place. Hence, in today’s information economy with its increasing adoption of Artificial Intelligence, there is a rapidly growing need for competency and experience in Data Governance – whether that be within marine geospatial data, cyber technologies, green manufacturing, logistical supply chains or retail customer sales patterns.
What characteristics do you have that make you successful at Data Governance and why?
One crucial characteristic is a healthy scepticism and a drive to improve ineffective practices, especially where they’ve become entrenched as tradition, convention or the “way it’s always been done here”. I like to counter such perceptions within organisations, particularly those that genuinely want to evolve, with the view that “if you always do what you’ve always done, you’ll always get what you’ve always gotten”. Long-standing practices evolved to meet some requirement at a particular time in a particular environment, and may once have satisfied a need very effectively; the problem is stagnation while the market and competitors move forward.
Another important trait is avoiding unwarranted change for its own sake: the mirror opposite of static tradition or convention, manifesting today as the trend of “continuous disruptive change without strategy”. This “burn it all down” or wrecking-ball approach to Data Governance overlooks the fact that many long-term practices can still be effective and that change needs to be incremental, integrated and monitored – not only with regard to corporate structures but also to human behaviour: means, motivation and opportunity (the last often being the true critical factor). Adapting, modifying and repurposing established policies or existing processes can help preserve “change capital” for those changes that are genuinely novel or necessarily disruptive. It can also mitigate friction with those invested in existing practices, such as their users, instigators, designers and owners, instead bringing them onboard and engaging them with the repurposing and updating.
The third characteristic, forming a triangle of ideal traits with the other two, is a keen applied interest in human behaviour around the use (and misuse) of data or information. Traditional or conventional “Hard Governance” centres on the assumption that people only make the wrong decisions because they have the wrong information, or not enough of it. Hence the traditional view of Data Governance coalesces around hard compliance measures and management surveillance: formal audits, regular in-depth reporting and restrictive checklists, with a focus on top-down, non-negotiable command and control. This approach was suited to traditional mass manufacturing of standardised products but is insufficient by itself for modern data services that are digital-first by design and characterised by near real-time change.
Soft Governance works with the grain of human behaviour to achieve better results through enablement and empowerment, rather than by command and control alone – principles take precedence over prescription, allowing an organisation to leverage the deep insights and frontline experience of its entire workforce. Shortcut thinking, lack of active engagement and wrong assumptions are some of the key targets for a Soft Governance approach, which still requires the ultimate backstop of Hard Governance – but meaningfully targeted and monitored using a risk-based approach. Combining the two approaches can yield outsized and transformative results.
Finally, some supporting characteristics that boost the Big Three above include being able to transcend organisational hierarchies, stovepipes and functional siloes. It is crucial not to bury Data Governance within your Data, Digital or Technology domain but to reach out, persuade, influence and engage far wider afield – especially with customer-facing or revenue-generating areas. The mission is to demonstrate that Data Governance is not merely a cost centre meeting a required need at a minimum level (the traditional, outdated viewpoint), but a key investment in an externally marketable strength that can grow business opportunities. Governmental, private and industry users of digital information services are increasingly keen to partner only with trusted providers in whose Governance they have evidenced confidence for the assured data they consume.
Are there any particular books or resources that you would recommend as useful support for those starting out in Data Governance?
When starting a journey within Data Governance, the main problem with resources is the sheer proliferation of information! The key step for any aspiring learner is to self-govern their own reading by always keeping in mind that “bigger picture” Data Governance is commonly conflated with the technical details of Data Management. Though the two fields are interdependent, this conflation can happen even within respectable publications, so critical thinking is needed by those starting out in DG.
The following three books are my recommendations for building a firm foundation in Data Governance, supplemented by insights and experience from whichever business sector the reader operates within. The second and third recommendations may surprise those expecting technical tomes or lengthy academic textbooks. They are both inspiring reads and essential prompts for thinking differently about DG, to unlock progress that is not shackled by outdated assumptions, chiefly that people are automatons of a sort who behave in entirely predictable, logical ways around information.
“Non-Invasive Data Governance: The Path of Least Resistance and Greatest Success” by Robert S Seiner is my first recommendation and is a compact, accessible book when compared to more formal textbooks, which can be intimidating and hard to apply for some. Using clear language, memorable quotes and supportive graphics, the book gives an excellent grounding in modern Data Governance, emphasising the value in a low-resistance approach by repurposing existing corporate structures and artefacts.
“Thinking, Fast and Slow” by Professor Daniel Kahneman is renowned within its field: the author’s underlying research into Behavioural Psychology, evidencing the existence of cognitive biases in people’s behaviour, earned him the 2002 Nobel Prize in Economics. Cognitive biases are systematic deviations from rational behaviour, rooted in fast, intuitive thinking (“Thinking Fast”) that may have served humanity well in the past but can now interfere with the slower, deliberate reasoning (“Thinking Slow”) needed for decision-making in the modern world. Confirmation bias is one of the best-known examples, but there are many more that can subtly exert their influence, even over professionals and experts. These can all cause real-world effects, including injury and loss of life, especially in safety-critical ‘outlier’ situations under time pressure and uncertainty. It is a relatively long but engaging read, and each chapter is largely self-contained, with excellent opening quotes and memorable takeaways to encourage recall.
“Inside the Nudge Unit” by (now) Professor David Halpern is an excellent follow-on from the previous suggestion, this time showing the application of Behavioural Governance within a real-world Governmental setting. Halpern is the CEO of the Behavioural Insights Team, instituted in 2010 by the UK Cabinet Office to directly support Government efforts to create outsized effects with relatively small changes of the right type. By giving case studies and real-world examples along with their outcomes, this book can inspire readers to begin considering what nudges they could instigate to encourage their existing Data Practitioners to become active and engaged “Data Citizens”. This is needed for modern DG because risk-averse Hard Governance is akin to “The Law”: it commands people what to do, or not do, under specified circumstances, but it cannot detail every possible set of circumstances and does not show how to go above and beyond to create a “Data Community” that exploits opportunity in new circumstances and requires risk-informed value-judgements. This is ideally achieved by Soft Governance, which empowers those on the frontline, with their wealth of experience and insight, through principles and guidelines, backstopped by traditional Hard Governance to formally manage the most common and significant risks.