Lyria Bennett Moses

Disruptive technologies pose new ethical concerns and unforeseen risks. Lyria Bennett Moses is helping policymakers design effective legal frameworks that protect us from harm without hindering innovation.

Autonomous weapons, gene editing of human embryos, and mass surveillance of internet communications are all examples of emerging technologies balancing on an ethical tightrope.

While innovation in robotics, medicine and communications could yield major benefits to people around the world, it also poses significant risks to our safety, security and privacy.

Legislators need to draft effective laws to protect our rights, with enough flexibility to keep us safe from risks and new technologies that might arise in future, without being so prohibitive as to thwart beneficial innovations.

This is a major challenge, however, as laws and regulations are not designed to look ahead but are crafted for the technological possibilities at the time they’re introduced.

“We need flexible legal and policy responses. It's important not to pretend we can forecast the future,” says Associate Professor Lyria Bennett Moses from UNSW Law, who studies the complex relationship between the law and technology. 

“What we can be absolutely sure about is we will see new technologies emerging, and some of these will pose challenges.”

Bennett Moses is developing innovative guidelines for drafting legal frameworks that govern new technologies and industries while effectively managing their potential risks.

“We have to move beyond the idea that legislation should be technologically neutral, or the opposite, that we need to always regulate specific technologies,” she says.

Rather than a simple matter of law keeping pace with technology, Bennett Moses says the challenge is for lawmakers to engage with new technologies and ask appropriate questions about new capabilities, the pace of change, and the potential problems and risks that could arise over the short and long term.

“People often talk about technology and law as the hare and the tortoise – that law has to constantly play catch up – and that’s not a helpful metaphor for what is going on,” she says. “Different circumstances require different responses.”

One of her current projects, conducted in collaboration with the Data to Decisions Cooperative Research Centre, is looking at the way big data is being used by intelligence agencies and law enforcement to investigate persons of interest, prioritise threats, identify crime hot spots and inform the deployment of personnel.

This raises a range of concerns about our privacy, the legitimacy of surveillance practices, the people making decisions about our security, and what happens when those decision-makers become too reliant on algorithms, without fully understanding the technology themselves, she says. 

“Our laws don’t need to enable or prohibit these technologies, but we need to engage with their new capacity and think very carefully about how they’ll need to change,” she says. “Do we want to amend existing rules? Repeal laws? Do we need new laws? And should these regulate the technology or deal with a broader problem?”