Cyber Security: Director responsibilities in a changing legislative environment

For many years, cyber security risks have been signposted as a critical and emerging priority for Boards. Now, the Federal Government is calling for views on cyber security governance standards – including potential regulatory changes – designed to incentivise businesses to invest in cyber security.

This is a critical juncture for directors, one which could have serious ramifications for the operations of Australian Boards. This discussion was led by:

  • Rachael Falk, CEO of the Cyber Security Cooperative Research Centre (CSCRC) and member of the Cyber Security Industry Advisory Committee
  • Dr Rosemary Langford, Associate Professor with the Melbourne Law School, University of Melbourne, specialising in corporations law and corporate governance
  • Cameron Whittfield, PwC’s Head of Digital and Technology Law and Honorary Senior Fellow, Melbourne Law School

Key points for Directors

  • Understand your current cyber resilience / maturity baseline and measure yourself against respected standards
  • Uplift cyber resilience to ensure appropriate maturity, and consistently measure and retest
  • Stress test incident response and business continuity plans (at both a board and company level)
  • Confirm and stress test cyber compliance
  • Consider a cyber (tech) audit / advisory committee (or specialist Board support)
  • Know your data holdings and confirm they are appropriately managed

Cameron Whittfield, PwC’s Head of Digital and Technology Law

Relevant context

When it comes to cyber, most non-executive directors are well aware of the risks and challenges. But recently, cyber security attacks have become incredibly sophisticated. Cyber criminals are agile and operate in lawless environments that afford them a speed corporate Australia struggles to match. The cost is around $29 billion a year to Australian businesses and $7 trillion globally. When the CEO of one of our largest banks announced 260 million cyber attacks in just one quarter, that illustrated just how precarious the situation is.

There has been some fatigue around the issue. It’s been in the top three issues for CEOs and boards for a number of years. COVID-19 has exacerbated the situation as we moved to remote working and compressed multi-year transformations into days or weeks. That’s made us particularly vulnerable at this time.

Without being alarmist, the cyber security incidents we see each day create a pretty sobering picture. We've seen companies experience significant disruption, IT systems shutting down for material lengths of time, customers and suppliers leaving, litigation, major transactions and deals falling through midstream or at least getting delayed, regulatory investigations and executives losing their jobs. The threat is real and it’s personal.

The past year has been tumultuous for high profile cyber security incidents. There has been a shift in the government’s approach to naming and shaming state-based actors. We’ve also seen more coming out of the US on the back of the Colonial Pipeline ransomware attack, with the Biden administration investing heavily in cyber resilience.

The regulatory landscape is complex. The Office of the Australian Information Commissioner is largely focused on breaches of privacy and personal information. The Australian Competition & Consumer Commission (ACCC) has also become active, looking at misleading and/or deceptive conduct. ASIC has commenced proceedings under the Corporations Act. APRA has released statements that it will be holding boards and management directly accountable.

Australia’s Cyber Security Strategy 2020

Released in August 2020, Australia’s Cyber Security Strategy is a positive step. The Government is looking to invest $1.67 billion over the next 10 years to uplift cyber resilience across business, government and the community. However, there was one line of text, easily missed, which refers directly to director duty reform. The prospect of reform of directors’ duties has caused a little bit of angst and regularly gets a run in the media. But it’s not new and has been foreshadowed for some time.

Strengthening Australia’s cyber security regulations

The issue is coming to a head through the work of the Department of Home Affairs’ discussion paper, Strengthening Australia’s cyber security regulations and incentives, with submissions due on 27 August. The underlying tone makes it clear that the Government has lost confidence in our ability to build cyber resilience on our own. Whether that's misplaced in the current environment or not, that's the position which underpins potential reform.

The Government is also introducing critical infrastructure legislation. However, there's a big gap between the companies covered by that legislation and most other businesses. The Government wants to ensure all businesses uplift their cyber resilience, so the key focus is on boards. It wants to find a way to incentivise directors and management to uplift the cyber security of the companies they serve.

Some of the statistics that are driving the discussion paper are dated (2017-18), at least in terms of cyber awareness. We have witnessed a material change since then in cyber understanding and resilience at the board level. We expect some robust discussion around this consultation paper. PwC will be making a submission, as I’m sure many of you will be as well, either individually or through your director communities.

Dr Rosemary Langford, Associate Professor with the Melbourne Law School, University of Melbourne

Directors’ duties and the reform agenda

The increasing frequency, scale and sophistication of cyber security incidents and costs, not just in pecuniary terms, but to reputation as well, have become increasingly important in terms of directors’ duties and legal standards. Managing cyber risks is now a core governance concern.

In terms of the relevant legal duties, the most relevant is Section 180(1) of the Corporations Act - a duty of care and diligence.

(1)  A director or other officer of a corporation must exercise their powers and discharge their duties with the degree of care and diligence that a reasonable person would exercise if they:

(a)  were a director or officer of a corporation in the corporation's circumstances; and

(b)  occupied the office held by, and had the same responsibilities within the corporation as, the director or officer.

When this duty is applied it takes into account the circumstances of the company and the position and responsibilities of the relevant director or officer. I refer to it as a shifting objective standard. So the standard will vary depending on the type of company - that could include whether it's proprietary or publicly listed or unlisted, the size and nature of the business, the risks it faces, composition of the board and the way it's set up. It also varies depending on the position and actual responsibilities of the director or officer. This includes whether a director is executive or non-executive and the director’s experience and skills, although there are minimum standards and directors are required to take active steps to guide and monitor the company and to keep informed. There's no one size fits all standard, but there are minimum standards required.

How might this be relevant in the context of cybersecurity? If, for example, directors failed to set up proper standards of cyber security to be implemented by management, for the protection of the company's business, there could be exposure under Section 180. ASIC has issued statements on cyber guidance, emphasising the importance of active engagement by the board in managing cyber risks. Further, judicial expectations of what a reasonable director might do to oversee the management of cyber risks are likely to increase.

A second way directors might be potentially liable under Section 180 is via what's known as stepping stones. Stepping stones liability arises where a company breaches or potentially breaches the law, particularly the Corporations Act, and the director is found to have failed to exercise reasonable care and diligence under Section 180 in causing or allowing the breach or failing to prevent the company from breaching the law. In my view, stepping stones liability is just a straightforward application of the duty of care in Section 180. It's not a separate mode of liability, and this view has also been taken by the Full Federal Court in the Cassimatis litigation.

How is stepping stones liability relevant to cyber? It is plausible that such an action could be brought in sufficiently serious cases involving cyber security incidents. For example, if directors turned a blind eye to cyber security management, and the company was in serious breach of data security or privacy laws, then it could be argued that the directors had failed to comply with Section 180 of the Corporations Act in certain circumstances.

ASIC favours stepping stones liability. Studies show it has been the most frequent approach in recent Section 180 actions, particularly in the context of disclosure and misleading and deceptive conduct involving executive directors and public companies. Even without the proposed governance standards, there is some exposure to breach of duty under Section 180, whether in relation to not setting up cyber security standards or via stepping stones. At the same time, it must be recognised that directors cannot prevent every breach by the company.

In terms of the reform agenda, the discussion paper, Strengthening Australia’s cyber security regulations and incentives, presents two options (in addition to a “do nothing” option) for governance standards.

The first is a mandatory governance standard (or legal duty) which would require large businesses to achieve compliance within a specific timeframe. The other option suggested by the paper is a voluntary principles-based governance standard, co-designed with the industry and aligned with international standards. It is not clear to me from the discussion paper whether these standards are to be imposed on companies or on directors. It seems to me that the intention is to apply them to companies, given that the paper talks about ‘if not, why not’ compliance requirements in the ASX Principles of Corporate Governance. Even if the standard is imposed on companies rather than on directors, there will be implications for directors’ duties.

I'm not in favour of a mandatory standard. I agree with the discussion paper that a mandatory standard is likely to be too onerous and costly. Directors are already overburdened by regulatory requirements and the spectre of personal liability, particularly when we measure that against comparable jurisdictions. Directors are subject to liability under so many different heads of legislation, which makes it increasingly difficult for directors to stay on top of and comply with their legal obligations.

The second option is a voluntary standard, which I would favour, although I don't think it's necessary if the aim is to give rise to potential liability for directors. But it's not clear what the voluntary standard would entail and how effective it would be. Also, a voluntary standard wouldn't be without legal significance and could be considered by a court when determining whether failures relating to oversight of cyber risk constituted a breach of directors’ duties. It would also serve to increase expectations as to what a reasonable director would do to manage cyber risk. Finally, if a voluntary standard is enacted, it's important that it be developed in consultation with the industry and then subject to further consultation.

Rachael Falk, CEO of Cyber Security Cooperative Research Centre (CSCRC)

What is driving the Home Affairs Discussion Paper in terms of changes to corporate governance and cyber security?

In terms of a risk, cyber is just like every other business risk. But it's far more agile and dynamic. The risk you were measuring six months ago might have significantly evolved in the last few months. With ransomware, for example, the modes of operation have changed considerably over recent months. Boards could be finding themselves dealing with a slightly different risk from meeting to meeting.

How can boards better assess that risk and what is reasonable? From my perspective, with Section 180 and the inquiring minds we all bring to the boardroom, reasonableness frames everything. Whether it's cyber security risk, an OH&S risk or a people risk, it's what you did and whether that response was reasonable. You are aware of the risk, you assess the risk, and you make an informed decision as a board about an action or no action.

Some of the change (and I certainly don’t speak for Home Affairs) could, in part, be driven by a growing impatience with what we call contributory negligence or negligent victims. This isn’t talked about publicly, but when a company gets hit by ransomware, it's often dressed up as a highly complex or sophisticated cyber attack. But I think there could be some frustration around that narrative when there's no mandatory reporting scheme for how companies have to disclose breaches and consequences.

Andy Penn, who also chairs the current and the previous industry advisory panel, talked about the need for directors or companies to become more responsible for cyber, and flagged that again in his recent National Press Club address. Karen Andrews, the new Home Affairs Minister, said cyber security is very important to her, particularly ransomware. There’s very much a view of wanting to take action, but the devil is in the detail around what that means in the boardroom.

Over the last 10 years, having witnessed lots of cyber breaches, I must admit having sympathy for some of the victim organisations. But then I shake my head at the way some victims have also contributed to their own breaches. In some sense, it does stop with the board and management. But having worked in larger organisations, I'm very aware that what happens on the shop floor and the dynamic nature of an operation versus the PowerPoint reports that go up to the board are often quite different. That’s partly because of a sanitisation process where management wants to respect and minimise demands on the board's time. But that also strips out the key details the board needs to hear and see in order to be enquiring and make reasonable decisions.

How is cybersecurity risk different from other risks that NEDs need to consider and assess?

Many would consider the Centro case to be an outlier decision - with the greatest respect to the Federal Court - but it does emphasise the need to bring an enquiring mind when you're assessing risk. With cyber risk, a lot of directors get intimidated by the perceived technical nature of the subject. They can get a little nervous asking questions. In my experience, when directors feel intimidated or remain silent, that’s when you can get into dangerous territory. In the Barings Bank collapse, one of the key findings was that management in Singapore and London felt they didn't understand the futures business and couldn't really ask questions for fear of being seen not to understand the business. The bottom line is understanding the business that you're a director in. It’s only slightly different from other risks because you have to know what questions to ask.

Doing what is legal and what is right - the practicalities of boardroom dynamics

In my experience as a lawyer, if the board makes a decision based on what is right, it will inevitably be heading in the right direction. I’ve never seen companies given a hard time for having contemplated and discussed what is right. A lot of this is not prescribed. It may not be easy to work out the legal position. But if a board has considered a complex issue and asked management to implement it so that something is better, doing what is right will always trump, if not correlate with, what is legal. There is a bare minimum required for what is legal, and some of that feeds back to Section 180. But from what I’ve seen as a litigator, if you’ve implemented training programs and yearly reviews, that’s always taken into account when examining what went on in the boardroom.

In the Equifax breach in the US, the Senate report highlighted a ‘culture of complacency’. Equifax was a credit reporting agency holding an incredible amount of valuable and highly sensitive data. But (for the most part) it didn't take cyber security seriously, particularly at a senior level, and that trickled down into how it managed important operational risks that ultimately impacted the organisation, its customers and sometimes even non-customers.

Best Practice Guidelines as a way to “better characterise” voluntary standards, assessed in light of Section 180

My personal view is we are living with voluntary standards now. The law is the mandatory part, and Section 180 will deal with the most egregious and negligent cases. Rather than voluntary, I would suggest ‘best practice guidelines’ as a better way to describe it, because everyone wants to be on the pathway to best practice. Think about a CISO asking for budget to meet legal obligations: if someone questions an item that is merely voluntary, the CISO quickly finds themselves separating what was voluntary from what was mandatory in order to cut their budget accordingly. In my view, voluntary gets a bit of a bad rap, but best practice guidelines are something we all want to pursue. I would be advocating best practice guidelines that feed back into Section 180. They're not too onerous. They're not a great impost. They will help boards and management better understand the risk and what they should be doing to manage it effectively.


Should listed companies consider having a technology sub-committee in the same way as we have audit and risk or remuneration committees? Is this a way to better understand this risk and then bring the board on this journey?

RF: Audit and risk sub-committees usually handle this kind of risk. I know there would be some that would argue to bring the whole board along with you. In a large, complex, listed company, it could be part of the audit and risk committee to include somebody who is conversant with the risk, remembering that all directors should be cyber literate to some degree. For organisations with complex cyber risks, I don't think it would hurt, particularly if we end up having best practice guidelines. A sub-committee could help a board understand what the risks might mean for the organisation and shepherd some of that work through and/or help fellow directors. It can't hurt, even if ultimately it just becomes part of audit and risk. Also, having that sub-committee or individual doesn’t let the rest of the board off the hook.

Best practice guidelines - one size fits all or a maturity model approach?

RF: I would say a maturity model, because you can’t place the same onerous requirements on a small business that you would on a listed company. Bearing in mind the Privacy Act has a $3 million turnover threshold, I don’t think there should be thresholds for small businesses. Even a small business can hold a lot of valuable data. And a director is a director, regardless of the size of the company. We need a sliding scale that depends on the nature of the business - for example, critical infrastructure.

CW: Boards need to understand what their company's current cyber resilience is. It's impossible to make investment decisions if you're not aware of your current standing. Whether you’re reviewed against the Essential Eight, an ISO standard or the NIST framework, objective assessments are increasingly likely to come out of this review. That will not only enable boards to identify gaps, but to be more prepared for when reforms come through, because we expect they will try to have a one size fits all baseline to start from.

Once you know what your resilience is, the next thing is to hold management accountable to bring that resilience up to speed. Quite often, the temptation is to compare yourself against colleagues or competitors. That approach is fraught with danger, because your competitors may not be running at best practice either. This is why Home Affairs is driving for objective standards, because one size fits all is tough given the diversity of companies.
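As a purely hypothetical illustration of what a baseline assessment might produce, the sketch below scores an organisation's controls against a target maturity level and reports the gaps. The control names loosely follow the ACSC Essential Eight, but the maturity levels and target are invented for illustration, not drawn from any framework's actual scoring:

```python
# Hypothetical sketch: comparing current control maturity against a target
# baseline so gaps can be reported to the board. All figures are invented.

TARGET_LEVEL = 2  # assumed target maturity on a 0-3 scale

current_maturity = {
    "application control": 1,
    "patch applications": 2,
    "restrict admin privileges": 1,
    "multi-factor authentication": 3,
    "regular backups": 2,
}

def maturity_gaps(current, target):
    """Return the controls sitting below target, with the size of each gap."""
    return {name: target - level
            for name, level in current.items()
            if level < target}

gaps = maturity_gaps(current_maturity, TARGET_LEVEL)
for control, gap in sorted(gaps.items()):
    print(f"{control}: {gap} level(s) below target")
```

The point of a report like this is exactly the one made above: the board sees an objective gap per control, rather than a comparison against competitors who may themselves be below best practice.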

RL: When the discussion paper looks at governance standards, it looks only at large businesses. I don't think it defines the term. There’s this concept that once you get to a certain size the standards apply, but the chapter says little about anything below that size. There might be more coming on that, but the chapter on governance standards only talks about large businesses.

Director’s perspective: John Green, Deputy Chair QBE, Non-Executive Director of the Cyber Security Cooperative Research Centre and Challenger Limited

“I think the legal approach that's being taken by many, in particular the government, is wrong. Section 180 is a good instrument, it moulds with what's reasonable at the time having regard to the nature of your company. But as directors, there are three reasons we should be challenging the government if they want to impose narrower duties and hit us with sharper sticks:

  • Cyber is not an emerging risk. It's here and now. But the perpetrators' weapons are constantly emerging and changing. If you were full bottle good yesterday, you might be empty bottle bad tomorrow, depending on the tool that's parlayed against you.
  • Unlike many of the other risks we face in corporate Australia, cyber is completely asymmetric. It's like a war, we have to keep the bad guys out 100% of the time, they only have to get in once. And often the way they get in, no matter how good your systems are, is through human error.
  • Not all of us can be cyber experts as we sit in the boardroom. The government and the regulators need to recognise that no matter what we do, we are likely to fail. Even the best companies in the world fail. Only a few weeks ago, Microsoft failed with the SolarWinds hack. If Microsoft can fail, any company can fail. We have to accept that this is a risk, we have to mitigate against it intelligently, as best we can, but don't impose extra legal obligations. We should be focusing on reputation risk, which is what we do all the time.

Hitting directors with sharp sticks is not the way to go, we need better carrots. From ‘how did they get that wrong?’ to ‘was the board asleep at the wheel?’ - the name and shame response is a bad way to deal with this issue. Instead we should share our experiences with each other. When something happens, we let everyone know what's going on, so others can know about it. I’m all in favour of voluntary codes.

Further, if you accept the thesis that an incident is likely to happen to some of us, no matter what we do, the question is: what do we do as directors with that piece of information? You need to engage on your board differently than perhaps you are doing now. Talk about the issue frequently. When you see a new incident ask how that would impact us; would our systems have caught that; what damage could it do? I was in a board meeting recently discussing ransomware and management said they were planning an off-site, to run through it and develop a plan. But what about the board's view? Paying a ransom is a major decision - illegal in some countries, and arguably unethical in terms of moral hazard. The board needs to get its hands a little bit dirty in these decisions. They need to be more engaged. And directors need protection in the form of carrots and not sticks.”

RL: In terms of submissions, if the aim is to increase resilience and capabilities of boards, then it's not just by sticks or carrots, it's by better training. So if people are making submissions, that's a good point to make.

We're talking about directors, but a lot of this relies on executives. Aside from CISOs, do you think the general executive community is up to speed on cyber?

RF: My experience is we still have a long way to go. In most companies, they want to get the deal done. Cyber can slow down the deal. You're often fighting your own procurement department, which is looking for the lowest unit price. We don't talk about training our procurement departments to recognise cyber risk or to consider whether the lowest-cost vendor is also the safest for your organisation. I ran the first cyber influence team at Telstra and the biggest obstacle was the internal people who didn't want to change what they were doing. And as you would all know, this all comes from the chair, the board, and the CEO. If the executive sees that, then they will start to understand that it matters. If there's a complacency or an aim to ‘just get the deal done’, that spreads right through an organisation.

CW: I think sometimes these interrogations can feel personal, but they’re just like any audit. The moment you're subject to some sort of auditing everyone gets their back up a little and self preservation comes in. How do we encourage people to be more open around these issues? Management looks after your third party arrangements, and that’s where security can be a lot weaker than your own. There are very large companies with incredibly robust systems and complicated supply chain ecosystems. You only have to look one step removed from the inner sanctum and security settings can start to degrade.

The ASX corporate governance principles have been helpful in driving behaviour. Is there an expectation for directors on that journey to have a greater appreciation of the risk and of addressing it?

RL: I actually missed it at first when I read the discussion paper and it's going to depend on what the standards are. The ‘if not, why not’ approach is trying to support the voluntary standard, but if the emphasis is on reporting, there are problems with that. It’s not appropriate for companies to disclose sensitive information in relation to their cyber security measures and it’s not a good idea to have reporting for reporting's sake. We need to wait for more detail, there was only one sentence in the discussion paper.

On the basis that the cyber risk management framework is in place, one of the greatest exposures is people doing something inadvertently silly (opening links etc). How can you best get a handle around culture because regardless of where on the maturity model you sit, this is one of the biggest areas of exposure?

RF: In the past, I have advocated for a cyber audit obligation, much like a financial audit obligation. If you are wondering what is going on in your organisation, commission a third party audit by a cyber security incident response firm. Often an independent audit will come back with some quite alarming or disturbing findings, but that can be very useful. As good as your CSOs and CISOs are, they often don't see what's on the shop floor, because they're busy doing their own role. Doing an annual audit also goes to the reasonableness test - that the board has gone to lengths to satisfy itself that it understands its corporate risk. It will show how well you are training your people, that they are not clicking on links, not using USBs. But don’t let it turn into another tick and flick exercise. It has to be a meaningful review with a clear scope, and it can’t be controlled by the CSO and the CISO. It should help the organisation better understand the level of risk and exposure it has.

Director’s perspective: David Fairman, CSO Netskope

“In terms of the external audit, I agree with that, to some extent. The trouble with external audits is they are a point in time. How do you get a real time continuous control assurance model? You also need to make sure that you have confidence in that data. One thing I would caution: security teams and operational staff are already being audited by controls teams, external auditors and regulators. The more audits you have, the busier they will be responding to audits versus doing the job at hand. You need to find the right balance.

Benchmarking is an interesting data point, but that's all it is. Every organisation is different, with different levels of investment and maturity. Use the benchmark as a target state but don't use it necessarily as a measure of how effective your program is operating. That will take time to build out and you're never done in this space, you should never think you're done. Recognise the constraints in resources and budget that you need to balance. So use it as a data point, but not the be all and end all.

When you look at it in the broader operational risk context, quantify the risk to a quasi dollar value, not a high/medium/low. If you're getting reports of high/medium/low risk for cyber, what does that actually mean for your operations? Bring it back to something that's impactful and commercially meaningful for you to have a conversation around. And you can balance that against the other operational risks; it is a trade off conversation.”
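Fairman's point about quantifying cyber risk in dollar terms rather than high/medium/low can be sketched with the standard annualised loss expectancy (ALE) calculation - a simplified illustration under invented figures, not a method the panel prescribed:

```python
# Hypothetical sketch: converting a cyber risk from a high/medium/low label
# into an annualised dollar figure using annualised loss expectancy (ALE).
# Both input figures below are invented for illustration.

def annualised_loss_expectancy(single_loss_expectancy, annual_rate_of_occurrence):
    """ALE = expected cost of one incident x expected incidents per year."""
    return single_loss_expectancy * annual_rate_of_occurrence

# e.g. a ransomware scenario costing ~$2m per incident, expected roughly
# once every four years (rate 0.25), gives an annualised exposure of $500k
ale = annualised_loss_expectancy(2_000_000, 0.25)
print(f"Annualised exposure: ${ale:,.0f}")
```

A figure like this can then be weighed directly against other quantified operational risks, which is exactly the trade off conversation described above.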

Our read of the room

By way of summary, the key themes that we are hearing in our conversations with directors and stakeholders include:

  • There is very little desire for mandatory governance standards coming out of this consultation process
  • There is also little desire for a ‘do nothing’ approach; we can expect something in the middle ground
  • There are conflicting viewpoints between directors and management; some CISOs are pointing towards more board accountability and more investment
  • Voluntary standards are getting the most traction; the general consensus is that current director obligations are sufficient and agile (but boards would benefit from guidance)
  • Some consideration is being given to different standards for different director types, e.g. executive versus non-executive
  • Relative resilience is becoming less relevant; an objective standard is being considered by the government but yet to be decided
  • Standards will need to be complementary to duties; it's not going to be a new journey and we don't expect amendments to the Corporations Act

Contact us

PwC Australia

General enquiries, PwC Australia

Tel: +61 2 8266 0000
