A fresh look at cyber practices can make high-value assets more secure

09 May 2018 Consultancy.eu

Data breaches happen when organisations fail at fundamental data protection practices. A fresh look at those practices can make your organisation and your high-value assets more secure.

Consider for a moment some of the most significant data breaches of the recent past:

  • More than 140 million customer records exfiltrated from a leading credit reporting agency, exposing highly valuable personally identifiable data, such as Social Security numbers, dates of birth and driver’s license information.
  • Half a billion user accounts compromised at a leading Internet service provider, revealing names, e-mail addresses, telephone numbers, dates of birth, password information and more.
  • 80 million patient and employee records breached at a health insurer, potentially exposing names, dates of birth, Social Security numbers, e-mail addresses, employment information and income data.
  • More than 50 million credit card accounts compromised at one leading retailer, and more than 40 million at another.

The list goes on. But when you take a step back to assess what these breaches have in common, you reach an inescapable conclusion: the numbers would be on a less staggering scale if the organisations involved had effectively practiced the basics of data-centric security.

Let’s start with the obvious. Data breaches of the scale in the examples cited are incredibly costly. Estimates put financial losses from a severe event into the tens or even hundreds of millions of dollars. Add on to that damage to brand and reputation, and ongoing financial and legal exposure. The pain can be immense and long lasting, both for the victimised organisations and for their partners and customers. Even in everyday breaches of more manageable scale, the financial and reputational damage takes a toll; research by the Ponemon Institute sponsored by Accenture estimates that the cost of cybercrime to the average organisation has increased by nearly 23% in the last year to $11.7 million.

Data breaches can be incredibly costly and run into the tens or even hundreds of millions of euros

A related similarity is that organisations victimised by breaches have not fully appreciated the value of data as the lifeblood of business. In the intelligence community, loss of data means loss of life. Hence there is an absolutely urgent focus on protecting data to save lives. In business, losing data may also cost lives in sectors like energy, chemicals and healthcare, but it is currently more likely to lead to competitive disadvantage, damage to brand and reputation, and significant legal and financial consequences. Business runs and depends on the secure processing of data, and protecting data deserves a commensurate level of attention, respect and investment. In the digital era, data is value. Those who guard that value have significant advantage over those who do not.

The third characteristic shared by organisations victimised by breaches is multiple points of failure. The issue is not whether criminal attackers exploited a known website vulnerability the victim organisation failed to patch, or instead launched a zero-day attack. The issue is that multiple processes and procedures had to fail for tens of millions, or hundreds of millions, of customer records to be exfiltrated, and for that exfiltration to go undetected for days, weeks or months.

Then there is also the unexpected disrupter, the proverbial dark horse that is legislation. With new legislation such as GDPR coming into effect, it has become vital to understand what data you have, where it is, and how it is being processed. In this case, not just to put a tick in an audit box, but to be able to demonstrate to the regulators how you are effectively managing the data that you are a custodian of at all times.

All of which adds up to straightforward, prescriptive advice: organisations need to put their data protection fundamentals in order. To fend off and minimise the impact of data breaches, they need to “harden” their data assets and be brilliant at practicing data-centric security basics. All this, of course, alongside adhering to other good security practices.

1. Identify and harden your high-value assets
These are your “jewels”: the data most critical to your operations, subject to the most stringent regulatory penalties, and most important to your trade secrets and differentiation in the market. “Hardening” a high-value asset means making it as difficult and costly as possible for adversaries to achieve their goals, and limiting the damage they can cause if they do obtain access. Some added guidelines:

  • Adopt the attacker’s mind-set. What do they want most? Design and execute your threat and vulnerability program, and overall security solution, to deny it.
  • Consider and use multiple techniques including encryption, tokenisation, micro-segmentation, privilege and digital rights management, selective redaction, and data scrambling.
  • If your high-value assets are on legacy systems, do not try to harden those assets all at once. Instead, add additional protection and increase visibility over control points or points of access until you migrate or modernise the legacy systems. If you have legacy systems that cannot be suitably hardened, look for opportunities to restrict access and up-level your monitoring. Be laser-focused on timely detection at your weakest links.
  • Remember that with all the focus on securing data, encrypting it, keeping it in the safest of systems, if the same controls are not applied to people who have access to the data, you have simply moved the point of failure. To fully protect your high-value assets, it is critical to keep “the people dimension” in mind.
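To make one of the hardening techniques above concrete, here is a minimal sketch of tokenisation: a sensitive value is replaced by an opaque token, and the mapping lives in a separate, access-controlled vault. This is purely illustrative; a real deployment would use an HSM-backed vault service, and the vault itself becomes a high-value asset to protect.

```python
import secrets

# Illustrative in-memory "vault"; in practice this would be a separate,
# hardened service with its own access controls and audit logging.
_vault = {}

def tokenise(value: str) -> str:
    """Return an opaque token that stands in for a sensitive value."""
    token = secrets.token_hex(8)
    _vault[token] = value
    return token

def detokenise(token: str) -> str:
    """Resolve a token back to the original value (a privileged operation)."""
    return _vault[token]

ssn = "078-05-1120"
token = tokenise(ssn)
assert token != ssn              # downstream systems only ever see the token
assert detokenise(token) == ssn  # only the vault can reverse the mapping
```

The point of the technique is that a breach of any system holding tokens yields nothing of value unless the vault is also compromised.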


2. Build up your defenses through network enclaves both on-premises and in the cloud
The perimeter is no longer the perimeter; it has become too easy for adversaries to breach. And the enterprise that the perimeter is intended to protect now extends well beyond “the four walls” to the cloud, the field and the control rooms. Consider creating enclaves: environments, both on- and off-premises, where you can better monitor the comings and goings of users and the behavior of applications, and which limit an attacker’s maneuverability. When the perimeter is breached, the enclaves remain safe. Think of a ship: if the hull is breached, hard partitions in the compartments below prevent the ship from sinking. In the same way, hard-partitioned enclaves in your network prevent a breach from moving laterally through the entire enterprise.

3. Build and execute a hunting program
There was a time when organisations felt they only had to activate their incident response plans in the event of a breach. Not any longer. Today, the best approach is to adopt a continuous response model: always assume you have been breached, and use your incident response and threat hunting teams to look continuously for the next breach (“find them before they find you”).

4. Catastrophe scenarios
Develop, run and test scenarios that simulate business catastrophes end-to-end, so that you can verify and validate that you can detect an adversary, and that your people are prepared and ready.

5. Map your environment
Create an understanding of your data landscape by identifying the business applications, processes, information usage patterns, systems and platforms in the environment, their business value and associated risks. Understand the flow of information within and outside your organisation and the communication channels it follows. Identify the different data repositories and their respective asset owners. Knowing all of this means you know exactly where to spend time and energy on protecting your data.
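As a minimal illustration of such a data landscape map, the sketch below records repositories with an owner, a business-value rating and a risk rating, and ranks them so that protection effort goes to the highest value-at-risk first. All names, ratings and downstream flows are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str
    owner: str
    business_value: int   # 1 (low) .. 5 (critical)
    risk: int             # 1 (low) .. 5 (severe)
    flows_to: list = field(default_factory=list)  # downstream systems

# Hypothetical inventory entries.
assets = [
    DataAsset("customer_db", "CRM team", 5, 4, ["billing", "analytics"]),
    DataAsset("hr_records", "HR", 4, 4, ["payroll"]),
    DataAsset("public_site_logs", "IT ops", 2, 2, []),
]

# Rank assets so hardening effort targets the highest value-at-risk first.
for a in sorted(assets, key=lambda a: a.business_value * a.risk, reverse=True):
    print(f"{a.name} (owner: {a.owner}): value={a.business_value} risk={a.risk}")
```

Even a simple register like this answers the two questions the section poses: what data do you have, and where should protection effort go first.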

6. Limit, monitor and segment access
Use two-factor authentication as much as possible, and use role-based access to make automated decisions about who is allowed to see what data and systems. Move toward micro-segmentation in your access control, recognising that when sensitive data needs to be handled by different people for different reasons, none may need to see the data in its totality. Micro-segmentation can show each person what he or she needs to see based on his or her roles and responsibilities, while obscuring the rest. This also limits damage in the event of a breach: if any one user’s credentials are compromised, only a portion of the data is exposed, and the adversary’s job of exfiltrating whole objects or larger swaths of data becomes much more difficult.
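The field-level effect of micro-segmented, role-based access can be sketched as follows; the roles and fields are invented for illustration, not taken from any product.

```python
# Each role sees only the fields its duties require; everything else is masked.
ROLE_FIELDS = {
    "support": {"name", "email"},
    "billing": {"name", "card_last4"},
    "auditor": {"name", "ssn"},
}

def view_record(record: dict, role: str) -> dict:
    """Return a copy of the record with non-permitted fields masked."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: (v if k in allowed else "***") for k, v in record.items()}

record = {"name": "A. Jansen", "email": "a@example.com",
          "card_last4": "4242", "ssn": "078-05-1120"}

print(view_record(record, "support"))
```

A compromised “support” credential thus exposes only name and e-mail, never the card number or Social Security number: no single credential unlocks the record in totality.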

Being brilliant at the basics can help organisations fend off breaches

7. Monitor for anomalous and suspicious activity
Monitor continuously and vigilantly not just for unauthorised access but also for undiscovered threats and suspicious user behavior.
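A toy illustration of behaviour-based monitoring: flag a user whose daily record-access count deviates sharply from their own baseline. Real user-behaviour analytics tooling models far more signals; the three-standard-deviation threshold here is an arbitrary, illustrative choice.

```python
from statistics import mean, stdev

def is_anomalous(history: list, today: int, threshold: float = 3.0) -> bool:
    """True if today's count deviates from the user's baseline by more
    than `threshold` standard deviations."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > threshold

history = [40, 55, 48, 52, 45, 50, 47]    # records accessed per day
assert not is_anomalous(history, 60)      # within normal variation
assert is_anomalous(history, 5000)        # consistent with bulk exfiltration
```

The principle is the same one the large breaches above violated: an exfiltration of millions of records should stand out against any reasonable baseline long before days, weeks or months have passed.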

8. Develop both strategic and tactical threat intelligence
Have a sustainable threat intelligence program that collects and curates both strategic and tactical threat intelligence. Strategic threat intelligence is human intelligence coming from a variety of both closed and open sources; for example, an e-mail explaining that certain versions of Apache Struts are vulnerable to attack, and how that vulnerability is exploited. Other forms of strategic intelligence can provide insights on campaigns targeting certain industries or technologies, or geo-political trends that could change the incentives of attackers. Tactical threat intelligence includes machine indicators of compromise that feed automatically into your systems; for example, an automatic feed from Palo Alto Networks or Qualys directly into your tooling. Stay as current as possible on both the broader threat landscape and the specific threats posed by adversaries as they relate to your organisation.
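Tactical intelligence consumption can be as simple as matching indicators of compromise against logs. The sketch below uses made-up feed contents and log entries (IP addresses from the reserved documentation ranges); a real feed would arrive via STIX/TAXII or a vendor API rather than a hard-coded set.

```python
# Hypothetical feed of known-bad IPs (documentation-range addresses).
ioc_feed = {"203.0.113.7", "198.51.100.23"}

# Hypothetical connection log: (timestamp, source IP, destination IP).
connection_log = [
    ("2018-05-09T10:01:00", "10.0.0.4", "93.184.216.34"),
    ("2018-05-09T10:02:13", "10.0.0.9", "203.0.113.7"),   # matches the feed
]

# Flag any connection to a host appearing in the indicator feed.
hits = [(ts, src, dst) for ts, src, dst in connection_log if dst in ioc_feed]
for ts, src, dst in hits:
    print(f"ALERT {ts}: {src} contacted known-bad host {dst}")
```

The value of the program lies in keeping the feed current and the matching automatic, so the strategic picture and the tactical alerts reinforce each other.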

9. Build a security ecosystem
No organisation is an island. Supplement internal talent and skills with a diverse vendor support system. When necessary and appropriate, take advantage of the assistance that managed services organisations can deliver.

10. Prepare for the worst
Transform your incident response plan into a crisis management plan that can be enacted if the worst-case scenario materialises. Make sure legal and corporate communications teams are on “stand by” and prepared to take action. Exercise the plan so that the business builds the muscle memory and identifies areas for improvement before the next issue arises. Be ready for a catastrophic cyberattack where e-mail, voice-over IP, and other communication systems used on a day-to-day basis are unavailable. For such catastrophic emergencies, consider storing critical contact information in the cloud and being prepared to use the cloud as a secondary out-of-band platform for e-mail and voice communication.

Conclusion

Any organisation intent on avoiding serious data breaches owes it to itself to review how well it is putting the fundamentals of data-centric security into practice. Closing any gaps will help fend off breaches and minimise their impact.

An article from Jaco Jacobs and Kimberley Zwaart, both consultants at Accenture.

Governments can reap major benefits from the data economy

01 April 2019 Consultancy.eu

In today’s digitised economy, data is everywhere. This is allowing companies of all sizes to use data science techniques such as data analytics, artificial intelligence and machine learning in order to gain a competitive edge. Likewise, the public sector can also reap major benefits from data – Zoltan Tanács, a partner at Horváth & Partners, reflects on how governments can thrive in the new era of ‘dataism’.

In the brave new world of digital disruption, big data and emerging AI, traditional values, beliefs, worldviews and even religions are changing, and there is a new, emerging “religion” out there called “Dataism”. According to it, whoever has the data will be able to understand and manage the world around them. Is government a possible domain for “dataists”?

Definitely; governments have no other choice. The power of a ruler has always been secured by his information and data processing capabilities. In ancient times, the most powerful man was the one with the most social connections and the best allies in his tribe. In medieval times, a well-organised kingdom with centralised administrative processes could outperform less organised ones. In modern history, liberal democracy has also proved its merits through advanced information processing capabilities, with distributed and transparent information exchange between the government and its constituencies.

We don’t know yet what kind of democracy or other governance model will be the winner of the future, but it must have good and continuously improving data processing capabilities. Otherwise, the real owners of the data, like Google or Amazon, will soon make governments irrelevant.

Data is one aspect, but not the holy grail. At all times, it was not only the facts (were they called data in the Middle Ages?) that led to decisions and were used for the good of the people. Manipulation is as old as mankind. And nowadays the technology out there is so powerful that we need to control it. Even the best data will not help if it is in the wrong hands.

What kind of data is important for governments?

Well, a nation’s data asset covers a very broad set of data. It consists of Public Sector Information (PSI) and non-governmental data as well. Both can consist of personal data (e.g. names, addresses, personal IDs) or non-personal data (e.g. statistics or business data). Part of the nation’s data asset belongs to its citizens: their personal data.

It is easy to imagine the basic, traditional data types, like headcounts, social insurance IDs and GDP numbers. Today, this legal and statistical economic data dominates our understanding of data, and even now governments struggle to utilise it. However, in the future, governmental data management will go through a radical change. Data quantity will explode, quality will dramatically change, and the question of data ownership will be critical. New types of data will emerge, like health and medical data delivered through biometric sensors operating 24/7, or data about human behaviour measured through advanced camera systems with face recognition. Those governments that can utilise these new types of data will gain a competitive advantage.

What are the best practices for government data asset management?

China, for example, is testing a so-called “social credit system” based on continuous measurement of its citizens’ financial, social, moral and political behaviour. “Good citizens” in this system are those who have no negative financial credit records, who take care of senior family members, who donate blood, and so on. They may get better credit conditions from banks and privileges in social benefits such as housing and hospital treatment. It is an advanced way of using data, but also an intimidating way of using citizens’ private information. Is this a best practice? Maybe George Orwell could tell us whether 2019 is his “1984”.

I am convinced that all governments should develop their data management strategy. They should define what kind of data they have and want to have, and how they want to store, transform and utilise these data. Governments should consider how far they can open up non-personal data. Research has shown that an open data policy supports business activities and can improve the economic competitiveness of a country.

Regarding personal data, governments and citizens have to come to an agreement on how far citizens are willing to give away their own personal data for governments to realise the advantages of a centralised, nationwide data ecosystem.

On personal data and privacy, the European Union has a new legislation, the General Data Protection Regulation (GDPR). How would you evaluate the first experiences with the new GDPR regulation?

The protection of personal data is getting more and more important. Facebook’s Cambridge Analytica scandal has shown that the USA will, in the future, have to consider some kind of privacy regulation. GDPR is a big step forward in protecting online privacy, but also a big competitive disadvantage for the EU compared to China or the USA, where much looser legislation exists.

Our first experiences show that companies pay much more attention to privacy issues than before. They have started and implemented big GDPR projects to comply with the new law. Customers’ sensitivity regarding their own privacy is improving. Still, regulation often lags behind real life; in many cases it hinders business and puts administrative burdens on normal operations. A new, emerging segment of consultancies and lawyers who specialise in privacy issues will appear.

“Governments should define their dataism path – they have no other choice – and embrace action.” 
– Zoltan Tanács, a partner at Horváth & Partners

GDPR is about ensuring high standards for privacy – but it does not cover all aspects of data security. How do you see the importance of data security topics in government?

Information security will get to the top of the agenda of government CIOs in the coming years. Compared to the number and casualties of “traditional” armed conflicts, the number of cyberattacks is increasing rapidly. The example of the past US presidential elections showed that cyberattacks can influence global politics. Next to the traditional tools of cybersecurity (building redundant IT systems, applying the latest firewalls, antivirus systems, encryption tools, biometric identification, etc.), artificial intelligence will gain importance in detecting and preventing online attacks.

Let us imagine that a government has a solid data management strategy and is able to implement it and ensure the necessary level of security as well. What can be the benefits for the government and citizens?

Increased competitiveness for the country and a better life for citizens through less administrative work, as well as better, cheaper public services. And, of course, more effective political decision-making. If we know more about health, trade, traffic, crime and so on, systems can give us accurate scenarios, options for measures and their achievable impacts: less reliance on personal estimates and more fact-based decision-making.

However, this improvement has also its price: we have to share an increasing amount of our own personal data with the government if we want to enjoy these benefits. That is not possible without trusting the government.

How will “dataism” shape the future of successful government models? What do you think will be the winning government model of 2050?

I truly believe that governments of today face a big challenge that threatens their very existence. The capability to govern data could directly affect decision-makers’ ruling power. Governments seem to lag in this race. If governments do not speed up, tech giants like Facebook, Google, Apple or Amazon might challenge the government’s ruling capabilities. Also, if a government cannot regulate such companies on how to use data properly, it fails to protect citizens’ basic civil rights. The big question is how to protect and own national data in a global digital world.

Governments still have the political power to do so – at least for another couple of decades. Nevertheless, what kind of governments will be more successful in this? I think the traditional model of liberal democracy will transform into something new.

Yuval Noah Harari, the famous author of the best sellers ‘Sapiens’ and ‘Homo Deus’, describes one such model in his latest book, ‘21 Lessons for the 21st Century’, as the “digital dictatorship”. Imagine a state where all citizens are monitored 24/7, and not just in the ways we know today, by using cameras or checking phone calls or e-mails, but through wearable biometric sensors and advanced cameras that measure blood pressure, heartbeat, emotions, even thoughts. Part of this is technically possible already today, or will be soon. In this digital dictatorship, a citizen who looks at a picture of the prime minister in an angry manner could be detained immediately.

A much more favourable option for tomorrow is the further development of the current democratic model, also known as “data-enabled democracy”. In this world, both citizens’ and the government’s “data consciousness” are improved, and both come to a joint agreement about the utilisation of the personal and non-personal data assets of the nation. Although part of our personal freedom might dissolve, it is compensated by the benefits of a more centralised data ecosystem and the better services it enables.

We don’t know yet which model will succeed – maybe something in between. One thing we know for sure: governments should define their dataism path and embrace action.

The interview with Hungary-based Zoltan Tanács is part of a series of interviews with leaders from Cordence Worldwide on the digital future of government services.

Related: The European governments with the best digital services.