
ChatGPT, generative AI and the future of the for-purpose sector

18 January 2023 at 5:44 pm
Ruby Kraner-Tucci
For-purpose organisations are no stranger to artificial intelligence, but the capability of powerful new online chatbot ChatGPT is threatening to rewire how the entire sector operates.


The for-purpose sector is no stranger to artificial intelligence (AI); it’s been used to interact with donors, find new employees, solve fundraising challenges and better support marginalised communities.

But the capability of the latest in AI technology is threatening to rewire how the sector operates, stirring both excitement and fear from industry leaders.

ChatGPT is a powerful new online chatbot that uses generative AI to produce sophisticated conversational text in response to just about any prompt, whether it’s to write a letter to a minister outlining the challenges of the NDIS or to prepare a speech on the future of philanthropy.

Professor Jane Farmer, head of Swinburne University’s Social Innovation Research Institute, which uses new technologies to solve complex social problems, warns the sector not to be tempted into using this new wave of AI just yet.

“There probably are uses of generative AI [that] not for profits might be really interested in, but don’t start there. Not for profits have to have their house in order and get their data capability together before getting into using AI,” Farmer told Pro Bono News.

“But by the same token, everything’s happening really quickly, AI is going to be everywhere, so they need to keep up-to-date.

“It’s a massive issue for the sector, and it’s a space that needs a lot of investment. Consequently, it’s potentially another force for corporatisation in the not-for-profit sector, which is not good.”

The future of work

ChatGPT uses generative AI technology – a type of AI that involves creating original content – and is trained through an extensive dataset of text, including books, articles and websites, to learn the patterns and structure of natural language, which helps it generate a human-like response.
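The idea of “learning the patterns and structure of natural language” from text can be illustrated with a deliberately simplified sketch. ChatGPT uses a large transformer neural network, not a lookup table; the toy bigram model below (all names and the sample corpus are invented for illustration) only shows the underlying principle: count which words tend to follow which in training text, then generate by emitting likely continuations.

```python
from collections import Counter, defaultdict

# A tiny, made-up training corpus; a real model trains on billions of words.
corpus = (
    "the sector is no stranger to artificial intelligence "
    "the sector is changing fast"
).split()

# Learn a "pattern": for each word, count which words follow it.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def generate(start, length):
    """Greedily emit the most common continuation at each step."""
    words = [start]
    for _ in range(length):
        nxt = following.get(words[-1])
        if not nxt:
            break  # no known continuation for this word
        words.append(nxt.most_common(1)[0][0])
    return " ".join(words)

print(generate("the", 4))  # → "the sector is no stranger"
```

Real generative models differ in scale and mechanism (they weigh long-range context, not just the previous word, and sample rather than always picking the single most common continuation), but the training signal is the same kind of statistical regularity.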

Within a week of launching in November 2022, ChatGPT had one million hits and has since taken the world by storm, with some using the software to predict the future market value of Bitcoin, support people with poor mental health, and even write songs in the style of legendary musician Nick Cave.

Worryingly, it’s also caused many to question the future of creative roles and educational institutions, and Farmer adds the for-purpose sector to the list of industries vulnerable to change as a direct result of this type of AI.

“It will rewrite things, but it’s open as to how long it will take, who is going to do it, who will afford it and who will push back,” she said.

“I think that what is likely to happen is that organisations looking to cut costs will look at what AI can take care of, and there is a danger that [the workplace] will end up potentially with a few people in the office doing horrendously full-on, intense jobs. That has a lot of implications.

“We need to be thinking about what this is going to do to jobs and education, and the kind of people that we need to produce. Organisations need to be having this discussion without scaring themselves.”

AI and philanthropy

Philanthropy Australia’s executive engagement director Adam Ognall is more positive about the impact of generative AI when it comes to giving practices, arguing it could help to streamline grantmaking applications and better assist the scalability of programs, among other outcomes.

“There are a myriad of ways these technologies could potentially impact giving,” Ognall told Pro Bono News.

“Whether [it] is to administer and track the results of funding; introduce new ways in digital spaces for participatory giving; and new and more digitally-embedded core infrastructure across all arenas, from governance and regulation to delivery of services.

“One way the sector may change is in its ability to use AI to test different approaches… The costs of funding pilots that test different language [that is] tailored to different audiences, for instance in health awareness campaigns, becomes more possible.”

A question of ethics

Yet like many, Ognall is sceptical about the ethics of generative AI and the limitations of ChatGPT.

For example, while the chatbot has been designed to refuse inappropriate questions and avoid making up answers, it has no concept of truth, which makes it difficult to assess whether its responses are factually correct. ChatGPT’s training data also does not extend beyond 2021, so it cannot account for more recent information.

“Trust is a big question,” continued Ognall. “We can see in the early days of ChatGPT, accuracy and credibility is a bit wobbly. If we want to think big beyond what’s currently happening… there needs to be trust and a way for this to truly create better outcomes.

“There are a whole range of ethical and inclusion-related considerations that the sector needs to be thinking about… There is a role that philanthropy is uniquely placed to play in championing a values-led approach to AI. Donors and those supporting giving need to have sufficient resources to understand the potential application of AI and what benefits and risks it entails.”

For Farmer, the question of ethics is focused more on equality of access. Right now, ChatGPT is a free, public service; however, this is tipped to change, with a waitlist open for a paid, upgraded version tentatively called ChatGPT Professional.

“There’s questions of equality and power in there as well. Will people who have the cash buy into authentic, human-delivered, human-curated systems and if you are poor, then you’ll just have to put up with a Centrelink [version of] AI?” said Farmer.

“I think it comes down to – is this something we should actually do or is it all just too expensive and complicated? Is there a bulk of [work] you can do with AI that’s going to make it cost effective, such as trying to target donors all over the world? Sure, that’s a quick and easy bit of AI. But if it becomes more complicated, if you’re a smaller organisation or it’s more of a niche problem, you’re probably not going to use it. 

“Is that going to affect which organisations survive in the long term? That’s the question.”

Ruby Kraner-Tucci  |  @ProBonoNews

Ruby Kraner-Tucci is a journalist with a special interest in culture, community and social affairs.


  • Bruno says:

As a Victoria-based career counsellor, I have tried ChatGPT out on several questions I or my clients would be expected to ask:
    What is the average income in the US? Then in China, then in Australia?
CG gave me different metrics (weekly, monthly and/or annual) and differently worded, hedged answers for each (hard to determine, more factors needed, etc.).

I then asked a set of questions about the Morrisby profile, which is made available to most Year 9 students in Victoria. It psychometrically tests people’s underlying aptitudes, combined with other quizzes on career interests and personality type, to give a useful profile of a person’s suitability for different careers, their learning style, and lots of info on careers themselves, study plans etc.
    First Question:
    What are 10 ways I can use my Morrisby profile?
CG quickly answered with 10 generic suggestions about how the profile can help identify jobs and prepare for interviews that align with, and use, your particular strengths, interests and values.
Generic, but roughly correct.
    2nd Q: 10 other ways to use the profile?
CG answered with more specific uses (e.g. networking) that were not really all that useful.
    Both sets of answers had disclaimers, stating this was for general use of the profile and that users should consult with a career counsellor for individual advice.
    So far, not bad, not good.
3rd Q: Then I asked about the different tests and quizzes used by Morrisby to underpin the report.
CG gave two or three nearly right answers (skills, personality) and several incorrect answers (knowledge).
4th Q: I then asked CG to break down the skills assessment (actually aptitudes) itself: what type of questions, and what was actually tested. CG went right off: physical skills, technical skills, and understanding of texts. None of these factors are tested in the Morrisby assessment; CG was drawing on very basic facts about career assessments in general and applying them to a very specific assessment.
I think it could be very useful as a prompt for ideas or conversations on a topic. What are the 10 safest places to go in Asia, etc.?

Myles McGregor-Lowndes says:

    There is a bigger story here 🙂

There is a nonprofit lesson in its story… we should carefully regulate nonprofits turning into for-profits, and enforce the nondistribution constraint and the asset lock to charitable purposes only. (But does this stultify progress? At the least, taxpayers’ money in the form of exemptions and gift deductions should be repaid.)

Our Attorneys-General (as the protectors of charities) are the prime regulators, yet they rarely do anything active in relation to charities unless forced to, and even then they mostly do nothing meaningful. The ACNC does not regulate organisations that cease to be on its register; it just takes them off.

OpenAI (which owns ChatGPT) pivoted from nonprofit to for-profit status in 2019, a mere four years after it was founded with $1 billion of donations from Elon Musk and others. It received tax exemptions and deductible charitable contributions that will never be recovered by the state (taxpayers).
OpenAI is now worth upwards of $30 billion.
    OpenAI is now reportedly in talks to raise $10 billion from Microsoft, much of which is likely to go straight into OpenAI shareholders’ pockets.
    Everybody who funded it while it was a nonprofit has the right to feel a bit aggrieved that they were unwittingly providing seed capital to make OpenAI executives even richer than they already were.

