
Possibilities of the Humanitarian Robot


26 November 2018 at 8:52 am
Maggie Coggan
With technology playing a bigger part in our lives every day, Lizzie O’Shea is combining her work with Digital Rights Watch and as a humanitarian lawyer to ensure the digital world is as ethical and democratic as it can be. She is this week’s Changemaker.



The automation of machines, welfare, government programs and whole industries is moving at a pace the average person finds hard to comprehend.  

Every day, people around the world give away their personal details when signing up to services that make their lives easier, trusting that this information and these programs will serve them in the way they promise.

But how this information is used is coming under scrutiny, as unethical and unfair practices by tech companies target some of the most vulnerable in society.

O’Shea believes there is a way to use technology for good, however, and is on a mission to expose and educate people about the humanitarian side of the digital world.

In this week’s Changemaker, O’Shea discusses the human influence on technology, holding people to account, and the digital world’s democratic underpinnings.

You’ve got a background in humanitarian law. What sparked your interest in digital rights?

I’ve been interested in issues around digital rights and privacy for a long time. The idea of privacy has taken on different layers of meaning in the digital age, becoming synonymous with freedom, autonomy and self-determination, and we need to be very determined about how we try to protect it. I was a founding board member of Digital Rights Watch, which formed in the wake of the passage of the Metadata Retention Bill, under which Australia now has a scheme that retains huge amounts of data about people. The role we played was scrutinising how that data was being used by government agencies.

How do digital rights and human rights interconnect?

I see all the time how important digital rights are in all the different work I do. It’s something that sits above almost the entirety of civil society, because all sorts of civil society organisations are also digital organisations, so the protection of digital rights is really critical to all of them.

It therefore makes sense to work with others from this space who are advocating for human rights as they exist in the digital world.

I’ve also always had an interest in whistleblowers and public interest journalism, and in watching how the digital age has shaped that through the whistleblowing of people like Chelsea Manning and Edward Snowden. It highlighted to me how critical it is for the health of our democracy to make sure their digital rights are protected, or that we at least understand human rights in digital terms as well.

In your work, when have you seen technology impact people’s lives in an oppressive way?

The way online ads are delivered has the potential to be hugely oppressive. For example, payday lenders use a sophisticated process to target their clientele, based on search terms that you might use in Google. That’s used to build a picture of the demographic that makes use of payday lenders, down to race, income, and geographic location. All these things that you might innocently provide in your profile, to be able to go about your daily business, can then be repurposed to push a predatory financial product at you.

Part of what really alarms me about this is that when public interest journalists go to the companies and try to expose this kind of misconduct, their answer is usually that these are automated categories their systems have created. It’s careless and thoughtless, and they haven’t planned for how we could use this space as a way to fight racist assumptions, or open up interaction between people across social and political divides. It’s like we’re treating it as a ghost in the machine, something that has its own autonomy, when in fact it’s built by people, so we should build it with intention and consciousness. To let these opportunities slide is to forgo an incredible chance to incorporate democratic and non-discriminatory principles into technology.

Is it hard to get people to take these issues seriously when human rights abuses via technology are sometimes harder to understand or pinpoint?

I think there’s an assumption it’s difficult, which is gradually being disproved. We ran a campaign at Digital Rights Watch encouraging people to opt out of My Health Record, as well as putting forward a number of changes to the program. Around 1 million people opted out because they were concerned about how their information would be used.

Obviously health information is some of the most sensitive information that could be held about a person, so it makes sense people are cautious about it. But they took the time to opt out, which is really encouraging. We are starting to see people engage in this debate.

The other one was our campaign around the proposed encryption bill, which creates a bunch of powers for law enforcement agencies to circumvent encryption. That sounds very nerdish, and hard for the public to understand. But we worked with The Juice Media to create an explainer video around that, and why it was such a risk to our digital infrastructure, and we managed to get 14,000 signatures from the public off the back of that. I think if you ask the question and put it in a way that’s easily digestible, you can have a really big impact.

What are some emerging issues you are seeing?

We rely very heavily on digital infrastructure protected by encryption to do things like shopping, run the power grid, and operate mass transit. There’s a movement by the surveillance state to weaken that encryption, which puts us at risk of nefarious actors attacking our systems, including state-sponsored terrorists. There’s a sense that the surveillance state is putting its own interests above the interests of the public, so that’s a concern. We are talking with government about that at the moment. There are other things like discriminatory treatment by automated decision-making processes, and how we can find ways to properly regulate them, and machine learning, so it uses principles of non-discrimination and equality rather than blindly transferring some of the oppression we see in the real world into cyberspace. That’s going to be a critical issue, and it’s really hard to detect, so it’s really important to work out how we create standards, and how we can import human rights into machine learning and automated decision making.

I’d like to think that within the next five years people will be making good use of the power of the digital age to collaborate and build a digital infrastructure for civil society organisations that doesn’t result in huge data transfers to private companies, and that allows people to communicate with their communities in ways that are ethical.

Do you engage both sides, humanitarian and tech, to work on this issue?

Of course. It’s very important to speak to the people this technology is going to affect at the pointy end, and I think that really informs how the advocacy will work. I talked to someone with a chronic health condition when advocating around changes to My Health Record. The woman I spoke to was torn, because she felt it would make doctors’ lives easier, but she was also extremely worried that insurers might be able to access her information at some point, and was terrified she wouldn’t be able to access travel insurance because of her pre-existing condition. We also talk to other civil society organisations that work with those people at the coal face, so we have relationships with groups like Amnesty International.

We did build alliances with tech companies around the encryption issue, and while they are not our usual allies, they shared concerns with us. Partnerships are really important for us, because you need to speak to a range of people to make sure your advocacy is informed.

Technology develops at such a rapid rate. Are you worried that you won’t be able to keep up in terms of advocating for regulation?

Of course it’s a worry. It’s about raising people’s digital literacy so they understand some of these proposals, especially in contexts where governments aren’t particularly interested in educating the public about changes. I think we are also going through a period of generational change, where a whole younger generation of digital natives has grown up fully alive to both the potentials and perils of the digital age. If you’re over 35, understanding the implications of technology can seem daunting and be very frightening, but people who are younger are totally immersed in it, and that gives them a perspective which is quite wise. We will see a generation of people who are much more sceptical of machines, and who don’t always assume they are right, which is a good perspective to bring into the 21st century.

I’m hoping that will change the debate for the better, and that we will have many more people who are fierce advocates for human rights in digital spaces and who understand the stakes.

Do you think we will see a movement where tech companies have to market themselves as more ethical, in the same way we are seeing with the corporate social responsibility movement?

I think that’s definitely a factor they consider when they build their business strategy, but there are extremely powerful companies who do everything they can to obscure the bad things they do, or the less appealing side of their business practices, so they can continue to retain people as consumers. At the moment there are plenty of companies that like to promote themselves as good corporate citizens while still doing things that contribute to an undemocratic and unfair society. Holding them accountable as consumers and users is one way in which we can resist that, but I also think the organising of workers is another critical way to resist, because those people know what’s actually going on within the company. We could also try to build a culture of technology for peace, for democracy and for justice, and build a community of people who work on those kinds of projects and draw them away from private enterprise. I think that would be an amazing thing. I’m all for companies being held accountable for bad practices, but we need to think more broadly than just looking to tinker with their PR messaging; we need to look into the deeper structures and the bad practices in the digital space.

How has working in this space changed you?

I think it has taught me that technology is an overlay that influences all our human relationships. The plight of people crossing borders is connected to the development of technology, as are national security issues, people who get targeted for terrorism offences, and people who are poor and subjected to robodebt.

Learning more about this through advocacy and writing has really helped how I understand human rights, and how they’re enforced, abused and exercised in the real world in the 21st century. It’s allowed me to engage with people who I think are quite critical to our future, who are interested in digital technology but haven’t had a tradition of thinking about human rights or progressive traditions, and who might benefit from cross-pollination with people who’ve spent lots of time working in the human rights space.

We need to build a bridge between people who are interested in technology but don’t have much background in human rights, and people who have lots of experience in human rights, in particular communities trying to exercise them, so we might be able to improve the technology overlay that exists over that work. I feel like I belong in both those worlds, and I can see both sides. By encouraging greater collaboration across the divide between humanitarian issues and technology, a better understanding of human rights is definitely possible.


Maggie Coggan  |  Journalist  |  @MaggieCoggan

Maggie Coggan is a journalist at Pro Bono News covering the social sector.

