Cloud Computing: Navigating Through the Era of Regulation

Big data and the cloud were originally conceived to make distributing applications and data easier for global banks and the wider financial services sector. A significant benefit was having a single location from which to run applications and store data, making the process cheaper, easier and consequently more efficient.

With the influx of regulations now coming into force, the process, and consequently its desired outcome, have had to shift. As a result, businesses will need multiple environments, based on country or regional requirements, or a hybrid cloud approach in which to store their data.

From a data perspective, the overarching goal of the cloud is data globalisation, where users are given access to a golden copy of the data regardless of where they are located. In reality, however, due to the obstacles imposed by many countries, data localisation is occurring instead.

Such developments have created uncertainty around the quality and accuracy of the data, which reduces its credibility and raises questions over the cloud’s ability to thrive in a sector that continues to become ever more regulated.

The Role of Personal Data

By design, data protection and privacy regulations are intended to impose tight controls on flows of personal data outside their respective countries, through requirements such as mandating that data centres be located within each country’s borders.

However, this fails to recognise that the physical location of the data has no inherent impact on privacy or security. For example, if a bank is subject to European laws such as the General Data Protection Regulation (GDPR), the privacy risks of storing Europeans’ data inside the EU are no smaller than those of storing it outside: the bank must treat the data according to the rules of GDPR wherever it resides. Mandating a particular physical location therefore creates inefficiencies in the technology infrastructure without improving protection.
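
To illustrate the point, consider the minimal sketch below. All names are hypothetical, not a real compliance API: the controls a record must receive are looked up from the data subject’s jurisdiction, and the storage region never enters into the calculation.

```python
# Hypothetical sketch: a record's required controls follow the data
# subject's jurisdiction, not the region of the data centre holding it.

from dataclasses import dataclass

# Controls mandated per jurisdiction (illustrative subset only).
CONTROLS_BY_JURISDICTION = {
    "EU": {"encrypt_at_rest", "right_to_erasure", "breach_notification"},
    "US": {"encrypt_at_rest", "breach_notification"},
}

@dataclass
class Record:
    subject_jurisdiction: str  # where the data subject resides, e.g. "EU"
    storage_region: str        # where the bytes physically live

def required_controls(record: Record) -> set:
    # Note: storage_region plays no part in the lookup. A GDPR-covered
    # record stored in us-east-1 carries the same obligations as one
    # stored in eu-west-1.
    return CONTROLS_BY_JURISDICTION[record.subject_jurisdiction]

assert (required_controls(Record("EU", "eu-west-1"))
        == required_controls(Record("EU", "us-east-1")))
```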

Challenge One: Artificial Intelligence

Regulations that lead to data localisation will come at a significant cost in stifled innovation for global banks that are actively pursuing machine learning and artificial intelligence (AI) capabilities to boost productivity.

This is because, for machine learning and AI to be successful, organisations need access to vast amounts of data; regulations that overly control the use of data in effect shackle AI. The core economic value of AI lies in its ability to automate complex processes, de-risk data environments and increase the quality of the data output. Localising data will make it much harder for banks to reap the benefits AI promises.

Challenge Two: Fragmentation of Applications

Another challenge sparked by the influx of regulations is the fragmentation of cloud implementations. Many new cloud-based infrastructure strategies have a distinctly region-centric or country-specific flavour, which fragments implementations and limits the benefits of data globalisation. For example, housing data behind a firewall in a country-specific data centre places a massive burden on central infrastructure teams through significant maintenance and support costs. It is, in essence, the exact opposite of why cloud computing came about: to enable databases or applications to be set up wherever and whenever they were needed.
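
To make that maintenance burden concrete, here is a rough sketch, with entirely hypothetical endpoint names and an illustrative apply_migration helper, of what a routine database change looks like once data is fenced into per-country data centres: one logical change becomes one deployment per country, each needing its own change window, monitoring and rollback plan.

```python
# Hypothetical sketch: one logical change, N localised deployments.
# Endpoint names and the apply_migration helper are illustrative only.

COUNTRY_DATACENTRES = {
    "DE": "sql.de.bank.internal",
    "FR": "sql.fr.bank.internal",
    "SG": "sql.sg.bank.internal",
    "IN": "sql.in.bank.internal",
}

def apply_migration(host: str, ddl: str) -> None:
    # Stand-in for connecting to a regional database and running the
    # change; in reality each rollout carries its own support costs.
    print(f"applying to {host}: {ddl}")

ddl = "ALTER TABLE trades ADD COLUMN settlement_venue TEXT"

# With a single global environment this is one deployment; with
# localisation it is repeated once per fenced country.
for country, host in COUNTRY_DATACENTRES.items():
    apply_migration(host, ddl)
```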

Regardless of where the data is physically located, it must be handled through the appropriate processes and controls and be subject to the required level of security. If all of this happens, the need for the data to be stored in a particular country or region falls away.
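
One way to picture this is a single write path that applies the same controls before data lands in any region. The sketch below is hypothetical throughout; in particular, the encrypt function is a placeholder, where a real system would use a vetted key-management service and cipher.

```python
# Hypothetical sketch of a location-independent write path: the same
# controls run no matter which region the payload is destined for.

import json, time

def encrypt(payload: bytes) -> bytes:
    return payload[::-1]  # placeholder only, NOT real encryption

AUDIT_LOG = []

def store(payload: dict, region: str) -> bytes:
    # Identical controls for every write: encrypt, then audit.
    blob = encrypt(json.dumps(payload).encode())
    AUDIT_LOG.append({"region": region, "ts": time.time(), "size": len(blob)})
    # ...ship blob to the chosen region's object store...
    return blob

# The controls are identical for both calls; only the destination differs.
store({"client": "ACME", "balance": 100}, region="eu-west-1")
store({"client": "ACME", "balance": 100}, region="ap-south-1")
```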

A Look Ahead

Although cloud computing has been around for over a decade, public cloud onboarding has only just begun in the financial services industry. The original implementation strategies and intended uses of the cloud have already changed from the very early days, and significant further changes to the environment should be anticipated in the foreseeable future. Increases in security requirements are inevitable, as are more sophisticated ways of accessing the data. Additionally, as regulations mature, there will be further changes to how data use is monitored and measured.

There is no doubt that the use of cloud computing in financial services will continue to grow at an exponential rate. New cloud-based architectures will create efficiencies and innovations and allow firms to grow despite the influx of regulations. However, none of these efficiencies and innovations will happen unless such regulations start to align with the technology and allow for data globalisation.

Brickendon is an award-winning global management and technology consultancy specialising in innovative solutions for the financial services industry. Winner of the Best Overall Testing Project in the Finance category at the 2016 European Software Testing Awards, Brickendon was founded in 2010 out of an idea to “step change” its clients by radically changing the consultancy model, breaking the outdated consultant mould by providing real, lasting, transformational change. Brickendon consults on a range of areas and works with some of the largest global banks, hedge funds and asset managers. The firm is divided into five practices, focused on strategy, risk and regulations, data, quality and testing, and digital. With innovation and talent at its core, all of Brickendon’s consultants hold over 10 years’ experience, solving client problems through cutting-edge solutions and lateral thinking. Since inception, Brickendon has more than doubled in size each year and now has offices in Canary Wharf, London, and New York.

For more information about Brickendon, please visit www.brickendon.com or follow them on Twitter, Facebook, LinkedIn and Instagram.
