The robot-led solution helping marginalised communities find legal help
1 March 2022 at 8:07 am
An AI project is aiming to dismantle the barriers getting in the way of vulnerable people seeking legal help
Unless you’re a trained lawyer, understanding the legal jargon you need to find help can be tricky.
For marginalised groups who struggle to articulate their legal problem in formal terms, those barriers can make finding representation nearly impossible. But it’s hoped a new AI tool will be able to help tackle the problem.
The project, led by the not-for-profit legal service Justice Connect, involves building an automated language processor that can understand people’s everyday language and correctly diagnose their legal problem.
Roj Amedi, Justice Connect’s head of engagement, told Pro Bono News that one of the biggest barriers that vulnerable people seeking legal services faced was finding the right help.
“Usually, when someone applies for legal help, they’ll use the language that is most natural to them, but currently, they’re facing application processes that are super jargonistic and complicated,” Amedi said.
“We believe it’s the responsibility of organisations like ours to close that gap and understand people in their own language, rather than force them to try and decipher a really technical legal language.”
The voices of marginalised groups to be front and centre
AI language processors are often developed without marginalised communities in mind, overlooking the different ways these communities might engage with technology.
To combat this, the AI specialist team behind the project is working alongside diverse groups, including older people, people with disability, First Nations people, people without tertiary qualifications and people from culturally and linguistically diverse backgrounds, collecting language samples that show how people from different backgrounds use syntax, grammar, shorthand and slang to describe their problems.
They are also incorporating the ethical AI and inclusive technology best practice principles released by the Australian Human Rights Commission, which focus on eliminating bias in decision-making AI algorithms and embedding human rights principles into AI by design.
Around 250 pro bono lawyers have so far worked through more than 9,000 of these language samples, making over 90,000 annotations that are checked in several ways to ensure accuracy.
This means that when a person in need of legal help makes an inquiry, the AI tool will translate the request and direct them to the service they need.
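At its simplest, the approach the article describes, matching a person’s everyday wording against annotated samples to diagnose a legal problem category, can be sketched as a tiny bag-of-words classifier. The samples, categories and scoring below are purely illustrative assumptions, not Justice Connect’s actual data or method:

```python
from collections import Counter

# Hypothetical annotated samples: everyday phrasing paired with a
# legal problem category, standing in for the 9,000+ samples the
# article describes. Wording and categories are invented for illustration.
SAMPLES = [
    ("my landlord wants to kick me out", "tenancy"),
    ("got an eviction notice on my door", "tenancy"),
    ("boss hasn't paid me for two weeks", "employment"),
    ("they sacked me with no warning", "employment"),
    ("debt collectors keep calling me", "debt"),
    ("i owe money i can't pay back", "debt"),
]

def tokenize(text):
    """Lowercase a message and split it into simple word tokens."""
    return text.lower().split()

# Build a word-frequency profile for each category from the annotations.
profiles = {}
for text, category in SAMPLES:
    profiles.setdefault(category, Counter()).update(tokenize(text))

def diagnose(message):
    """Score each category by how often the message's words appear
    in that category's profile, and return the best-scoring one."""
    words = tokenize(message)
    return max(profiles, key=lambda c: sum(profiles[c][w] for w in words))

print(diagnose("landlord gave me an eviction notice"))  # prints "tenancy"
```

A production system would use a far richer model trained on the full annotated corpus, but the core idea is the same: the annotations let the tool map informal language onto the right service, so the person never has to phrase their problem in legal jargon.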
Amedi said this means that, no matter their background, ability or level of education, people will be able to access legal help.
“People should be able to use whatever type of language that they feel most comfortable to reach out for help,” she said.
“So whether it’s to understand which service that they are able to access within an organisation, or to understand the services they’re not eligible for… it’s really about lessening the burden and the weight and the delay and the unknown of applying for legal help.”
While the project is still in its early stages, once developed it will be made freely available to anyone and can be rolled out across any legal intake process, whether it’s an online form or a phone line.
A prototype is expected to be released in 12 months’ time.