In February 2026, the NSW Parliament passed the Work Health and Safety Amendment (Digital Work Systems) Act 2026 (NSW) (the Act), marking a significant shift in how workplace health and safety laws apply to technology in the workplace. The Act amends the Work Health and Safety Act 2011 (NSW) to make it explicit that employers must ensure their use of digital work systems—including artificial intelligence, algorithms, automation and online platforms—does not put workers’ health and safety at risk.
While employers have long been subject to broad WHS obligations that extend to systems of work, the reforms remove any remaining doubt that harm caused by technology is firmly within scope. Of particular significance is the Act's focus on digital systems that allocate work, monitor performance or track workers, and the psychosocial risks that can arise from their use. For employers increasingly reliant on automated rostering, AI-driven scheduling and digital performance management tools, the changes are likely to increase regulatory scrutiny and elevate compliance expectations.
The Act does three main things: it makes explicit that the primary WHS duty extends to digital work systems; it introduces specific obligations where work is allocated through digital systems; and it gives WHS entry permit holders new powers to access and inspect digital work systems.
Employers should make sure their WHS risk assessment processes clearly cover digital work systems—particularly systems that allocate work—and that any identified risks are properly controlled and regularly reviewed.
The Act builds on the existing WHS framework. Under section 19 of the WHS Act, businesses already have a duty to ensure workers’ health and safety so far as reasonably practicable.
The Act now makes it explicit that this duty includes ensuring workers are not put at risk by the use of digital work systems. These are defined broadly and include algorithms, artificial intelligence, automation and online platforms.
In practice, this captures most modern workplace technologies, including systems that allocate tasks, monitor performance, track workers or manage workflows. Risks linked to these systems can include psychosocial issues, such as anxiety about job security or constant performance monitoring.
The Act also introduces specific obligations where work is allocated through digital systems, such as automated rostering tools, gig economy platforms or AI-driven scheduling systems.
While these changes are aimed largely at gig economy work, they apply much more broadly. Employers must ensure that work allocated through digital systems does not put workers’ health or safety at risk. These risks may arise even where systems are designed to improve efficiency or consistency.
In doing so, employers must consider whether digital work allocation itself creates risks to workers' health and safety, including psychosocial risks and the risk of unfair or biased allocation of work.
This means employers need to ensure automated or AI-driven systems, including rostering systems, operate fairly and without bias. A strong AI governance framework is essential. Where third-party software is used, employers will need to carry out appropriate due diligence on those systems to, amongst other things, ensure that they are free from potential bias.
The Act gives WHS entry permit holders new powers to require reasonable assistance to access and inspect digital work systems where a breach of WHS laws is suspected.
At least 48 hours’ notice (and no more than 14 days) must be given before these powers are exercised.
This is the most controversial change. Allowing access to digital systems may expose sensitive personal data and commercially confidential information, and could also raise cybersecurity risks. Employers will need to carefully manage the tension between this access right and their obligations under privacy legislation and commercial confidentiality requirements.
The expanded right of entry will not commence until government guidelines are released. The government has indicated it will consult with businesses before finalising those guidelines.
The Act does not create a new standalone regime. Instead, it clarifies that existing WHS duties extend to digital systems of work.
Employers can no longer argue that an AI system or algorithm falls outside WHS obligations simply because the system is automated and not controlled directly by the employer (including where AI solutions are provided by a third party). Digital systems must be treated like any other part of the work environment, and employers must be able to demonstrate knowledge of how digital systems work and of their associated risks.
Risk assessments should be updated to expressly cover digital work systems, particularly those that allocate work. Given how quickly technology evolves, these assessments should be refreshed regularly and whenever new systems are introduced.
Employers should: identify the digital work systems currently in use, particularly those that allocate work, monitor performance or track workers; update WHS risk assessments to expressly cover those systems; put in place an AI governance framework and carry out due diligence on third-party systems; and prepare for the expanded right of entry ahead of the release of the government guidelines.
Sally Woodward
Partner, Legal Leader, PwC Australia
Bryony Binns
Legal Partner, Workplace Law, PwC Australia
Natalie Perrin
Legal Partner, Workplace Law, PwC Australia
Jackie Ntatsopoulos
Managing Director, Workplace Law, PwC Australia