
We built a disability AI assistant

5 min read

Simply Disability banner

For the last couple of months we've been burning some serious midnight oil so we can share with you an awesome project we've been working on. In partnership with YouAI, we have built, and are releasing to you today, an AI Assistant designed specifically to help people with a disability, their families and their support networks.

Our story

Over the last 8 months or so we have been neck deep dealing with the NDIS through the AAT. We found ourselves spending hours researching and reading an enormous amount of information, doing our best to make sense of the NDIS Act, rules, policies, processes, interpretations, public freedom of information releases, published AAT cases involving the NDIS and other relevant legislation, both federal and state (WA) based.

We are both career knowledge workers and, in spite of our skills and experience, we often felt overwhelmed by it all. To help us get quick answers, we built an app that uses AI to answer questions based exclusively on the documents in our repository.

Whilst not perfect, it helps us to be more efficient. We believe other families should also have access, so today we are making it available to you. You can try it for free and, if it meets your needs, continue using it for a small fee that covers the cost of the third-party AI service providers we use (OpenAI, Anthropic, Mistral AI).

What does it know about?

Our focus has been on the NDIS Act and some associated documents. We have also loaded in the NDIS Review that was released on 7 December 2023, as well as a curated set of 128 documents published via freedom of information requests covering internal NDIS policies and procedures. This means you'll be able to ask questions about the NDIS Act, rules and/or the NDIS Review, and get responses based exclusively on the content of these documents. You can also aim the same questions at the FOI documents, which has been really helpful for getting answers that are more specific than the general/ambiguous information available on their websites.

Our next priority is to expand the scope of information available to our AI Assistant so that it is more generally useful. In the interim, we have built in a set of features that allow you to upload your own file, scrape content from a website using a URL, or simply paste chunks of text into a text box... and then ask the AI Assistant questions about the content. It's a great workaround while we progressively add more source information to our knowledge base.

The Technology

The technology that underpins the Simply Disability AI Assistant uses an advanced technique called Retrieval Augmented Generation (RAG). It is a multistage process where your question is used to find likely related information from a database containing a collection of documents we have curated. The extracted content is then combined with a set of engineered instructions along with your question, and sent to a Large Language Model (LLM) service provider (e.g. OpenAI, Anthropic, Mistral AI) for inference. Their LLM interprets the supplied content based on our instructions to answer your question and then generates the response you see in our AI Assistant's chat window.
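To make that pipeline a little more concrete, here is a minimal sketch of the general RAG pattern (illustrative only, not our production code): embed the question, find the most similar pre-embedded document chunks with a simple cosine-similarity search, then send those chunks and the question to an LLM. The model names and the in-memory store are assumptions for the sake of the example.

```python
# Minimal RAG sketch (illustrative only).
# Assumes chunks of the source documents have already been embedded.
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def embed(text: str) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(resp.data[0].embedding)

def retrieve(question: str, chunks: list[str], chunk_vectors: np.ndarray, k: int = 3) -> list[str]:
    # Cosine similarity between the question and every pre-embedded document chunk.
    q = embed(question)
    scores = chunk_vectors @ q / (np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(q))
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

def ask(question: str, chunks: list[str], chunk_vectors: np.ndarray) -> str:
    context = "\n\n".join(retrieve(question, chunks, chunk_vectors))
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; any chat-capable LLM provider works here
        messages=[
            {"role": "system", "content": "Answer the question using only the supplied context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```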

To refine the specificity and quality of LLM responses, our prompt engineering instructs the LLM to cite and reference wherever possible, and to say "I don't know" when the supplied information does not contain an answer to your question. This comes as a trade-off against being able to have long, fluid conversations. LLMs have a limit on the amount of text that can be included in each prompt, so only content related to your initial question is supplied, and any follow-up questions will be answered using that same information. To expand the scope again, just start a new chat using the "new" button.
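As a rough illustration of what instructions like these can look like (a hypothetical template, not our actual prompt), notice how the retrieved content is injected once alongside the instructions, and follow-up questions are simply appended to the same conversation:

```python
# Hypothetical prompt template: the instructions, the content retrieved for the
# first question, and the running conversation all have to fit in one context window.
SYSTEM_INSTRUCTIONS = """You are an assistant helping people navigate the NDIS.
Answer using ONLY the context provided below.
Cite the source document and section for each claim wherever possible.
If the context does not contain the answer, reply: "I don't know."
"""

def build_messages(context: str, history: list[dict], new_question: str) -> list[dict]:
    # The context retrieved for the first question is reused for follow-ups;
    # starting a new chat clears the history and triggers a fresh retrieval.
    return (
        [{"role": "system", "content": f"{SYSTEM_INSTRUCTIONS}\nContext:\n{context}"}]
        + history
        + [{"role": "user", "content": new_question}]
    )
```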

A key benefit of RAG over chatting with LLMs like ChatGPT directly is a significant improvement in the quality of the responses that are generated. Responses are generated from the supplied information rather than invented from whatever information the LLM was trained on. This allows for responses that are much more specific, accurate and contextually relevant. RAG takes advantage of an LLM's ability to interpret information whilst limiting the scope of information that is used to generate a response.

How we use it

We have a few ways we use the tools, but generally speaking the workflow is straightforward:

  • Throw your questions at the AI Assistant. Answers to your questions often inspire new questions. Eventually you may feel the need to read the source documents yourself.
  • Using the citations and references provided during your conversation with the AI Assistant, jump over to the Resources page, find the relevant source document and go straight to the relevant section to refine your understanding.

Given the responsiveness of the assistant, we have also found it useful to have a device with the app open nearby when having discussions with official representatives or highly opinionated individuals. If something doesn't sound right, or seems a bit wishy-washy, just ask the assistant. Sometimes it doesn't know the answer; other times it throws out a killer quote you can use to help everyone see the forest for the trees again.

We hope you find it useful and we welcome any thoughts or ideas you might have to improve the utility of our AI Assistant.

To jump straight into it go here: Simply Disability

Kind Regards,

FP