The Evidence Portal

Frequently asked questions

About the Portal

What is the Evidence Portal and why was it developed?

The Evidence Portal is a publicly available website that contains high-quality research evidence.

It was established so that organisations delivering human services in NSW can access and apply evidence in the design and delivery of their programs. For further information see About the Portal.

What is in the Evidence Portal?

The Portal currently contains four types of evidence:

  1. Evidence reviews - undertaken with a very high evidence threshold to allow the development of core components (see below).
  2. Evidence-informed programs - programs identified in our evidence reviews that have been shown to work for specific groups of people.
  3. Core components - drawn from the evidence-informed programs, these are the fixed elements that make up those programs. They can be used as the foundational building blocks for evidence-informed design of new services, or in the review of existing services.
  4. The Aboriginal Cultural Safety and Wellbeing Evidence Review - a standalone review undertaken to explore available evidence and gaps in relation to building culturally safe human services.

In the future the Portal will also contain other evidence and evaluations of emerging programs.

How is the Evidence Portal meant to be used?

The research evidence available on the Portal can be used in a number of ways, including the following:

  • To find information about programs that have been shown to achieve positive client outcomes.
  • To identify an evidence-informed program.
  • To find evidence for core components that can be used to design tailored and flexible services, and ways to implement them in a program (evidence on implementation is called 'flexible activities').
  • To use core components to identify the key activities and services that need to be delivered to specific target groups to achieve certain outcomes.
  • To review and adjust a service model that doesn't seem to be working.

What about practitioner expertise and client voice?

Research evidence is only one form of evidence.

Experienced practitioners have vital knowledge about the families, communities and the service systems within which they work.  Effective services will incorporate practitioner expertise, in both design, and particularly implementation.

Likewise, effective service design and implementation will reflect clients' lived experiences, values and preferences.  Incorporating client voice helps prevent avoidable harm and leads to better client outcomes.  Creating a space where clients' voices are heard, and can directly influence both service design and the services they receive, ensures services are tailored and more likely to be accessed and effective.

Populating the Evidence Portal

How did you identify the evidence-informed programs that are on the Evidence Portal? 

The evidence-informed programs are identified through evidence reviews conducted on specific topics using a high evidence threshold. Each evidence review follows a strict process to search for, screen and assess research and evaluations to identify high-quality, evidence-informed programs.

See the Technical Specifications for more information about the process followed.

These specifications provide detailed guidance, explanations and examples to ensure our evidence reviews are systematic, rigorous and transparent.   

How do you rate a program’s effectiveness? 

The rating a program receives depends on: 

  • the number of studies that have evaluated the program
  • whether the outcomes reported in those studies are positive, negative or neutral. 

The approach for rating evidence considered and adapted the methods of other publicly available evidence rating scales, including the Early Intervention Foundation Evidence Standards and the What Works Clearinghouse Procedures and Standards Handbook (Version 4.0) (United States Department of Education, 2017).

See the Technical Specifications for more information about the Evidence Rating Scale.  

How do you identify core components?

Core components are extracted from evidence-informed programs. 

After an evidence review has identified and rated evidence-informed programs, core components can be extracted from them. This involves closely examining and grouping the types of activities (core components) that are undertaken as part of each program. The way these activities are implemented is also captured. These are the flexible activities within each core component.  

See Section 2.7 of the Technical Specifications for further detail and examples of identifying core components and flexible activities. 

See Using a core components approach for more information. 

Where can I find more information about the evidence reviews that were conducted?

Detailed information about each evidence review we have conducted is on our evidence reviews page.

Alternatively, you can email us: EvidencePortal@dcj.nsw.gov.au 

How will the evidence on the portal be updated? 

We want to make sure the Evidence Portal includes the most up-to-date research. We plan on updating the information about evidence-informed programs and core components periodically to include new research and evaluations that have been conducted and published.

Core components

Why does the portal use a core components approach?

When working in diverse communities with complex circumstances and changing needs, it’s important we implement services that are flexible and tailored to local needs. An evidence base of purely manualised ‘off the shelf’ programs may inhibit such flexibility. In addition, the cost of manualised programs is often prohibitive. 

As such, a core components approach has been taken to organise the evidence in a way that is meaningful and easily applicable to existing programs and services.  For further information see Using a core components approach.

Evidence-informed programs

How does the ‘strength of evidence’ rating work?

Each program that meets the criteria for inclusion is given a ‘strength of evidence’ rating. This helps us understand the quality and volume of evidence behind each program.

There are 7 different ratings a program can receive:

  • Well supported by research evidence
  • Supported by research evidence
  • Promising research evidence
  • Mixed research evidence (with no adverse effects)
  • Mixed research evidence (with adverse effects)
  • Evidence fails to demonstrate effect
  • Evidence demonstrates adverse effects

For more information see: Understanding the Evidence Rating Scale.

Programs that are determined to have a positive effect on at least one client outcome are included on the Evidence Portal as evidence-informed programs.

What does ‘effectiveness’ mean?

Effectiveness refers to the ability of a program to achieve positive client outcomes. Each program in the Evidence Portal is identified as having a positive, negative, neutral or mixed effect on client outcomes.

A positive effect means the program was able to improve client outcomes – positive changes occurred.

A neutral effect means the program did not impact client outcomes – they stayed the same.

A negative effect means the program had an adverse effect on client outcomes – they got worse. 

A mixed effect means the client outcomes were a combination of positive, negative and/or neutral. 

The Evidence Portal only includes evidence-informed programs – that is, programs identified in the review that were found to have a positive effect on at least one client outcome.

Programs rated as ‘evidence fails to demonstrate effect’ or ‘evidence demonstrates adverse effects’ are not included in the Evidence Portal.

Some programs have a supported or promising evidence rating. Does that mean I should implement it?

If the evidence rating for a program is ‘supported research evidence’ or ‘promising research evidence’, it means the evidence for those programs shows they can have a positive impact on client outcomes. We can be reasonably confident in programs identified as having ‘supported research evidence’ as these programs have had multiple evaluations that all show the positive impact of the program.  

However, it is important to remember that ‘one size does not fit all’.  Look at the target group for each program and see if it is similar to the clients you work with.  For example, do not assume that a program that has been evaluated with Anglo-Celtic families will work for Aboriginal families. 

Before you implement a program you should review the assessed needs of your clients, their goals, and the resources you have available. 

Some programs have mixed research evidence. What does that mean?

Programs can have mixed evidence if at least one client outcome was positive, another was neutral and/or another was negative. 

Programs with at least one negative outcome and a neutral or positive outcome are rated as ‘mixed research evidence (with adverse effects)’. Caution should be used in implementing these programs. This is because the program could have a negative impact on a particular outcome for your clients. However, we can still use information about these programs to understand what doesn’t work. 

Programs with a combination of positive and neutral outcomes are rated as ‘mixed research evidence (with no adverse effects)’. Caution should be used in implementing these programs also. You should carefully review the client outcomes the program can have a positive impact on and the outcomes it is unlikely to achieve.

Some programs are rated as 'evidence fails to demonstrate effect'. What does that mean?

A program is given the rating ‘evidence fails to demonstrate effect’ if an evaluation shows that the program did not have a positive or negative effect on client outcomes. These programs are not included on the Evidence Portal.

While the program may not be effective in the specific context in which it was evaluated, information about that program could still be useful to help us understand what does and doesn’t work for our clients. If the evidence shows that a program has no benefit, we recommend considering alternative programs or activities.

Some programs are rated as 'evidence demonstrates adverse effects'. What does that mean?

A program is given the rating ‘evidence demonstrates adverse effects’ if an evaluation shows that the program had only a negative impact on client outcomes.

It is not recommended that these programs are implemented. They have not been included on the Evidence Portal.

How do you keep the strength of evidence ratings for programs up to date if new research is published?

It is important that the information on the Portal is not static and is updated to reflect changes to the evidence.  As such, reasonable efforts will be made to ensure the program summaries and core components are reviewed regularly and updated as more evidence becomes available.

We will re-run searches on particular topics to identify newly published research and will incorporate this into the Evidence Portal over time. 

My program/intervention is not included on the Evidence Portal.  Does that mean I shouldn’t use it?

No.  We know that there are gaps in human services evidence. 

Not all programs and activities have had the benefit of an evaluation or have been included in a research study.   

It is hoped that the Evidence Portal can help to fill this gap, and build the understanding of what works for families and communities in NSW. 

What can I do to get my own locally developed program on the Evidence Portal?

In the near future, the Evidence Portal will include a category of programs called ‘emerging programs’.  This may include developing a set of criteria for determining whether a program is evidence-informed.

We will provide more information about this process over time. 

Why are most of the programs from the United States of America? Why are there so few programs from Australia?

The evidence reviews we conduct to populate the Evidence Portal identify programs and activities from all over the world. 

There are, however, few high-quality evaluations of relevant programs and activities that have been conducted in Australia. 

Over time, we hope to fill this gap by identifying locally developed programs and including them in the Evidence Portal as ‘promising programs’. 

Other

I am interested in the evidence on the portal but I’m not sure how to apply it to my service delivery.  Can you help?

The Evidence Portal aims to make research evidence readily available to busy practitioners in an easy-to-understand format.

In the near future, there will be resources to support organisations to design or adopt and implement new services and activities.

For help applying this evidence, see our Using evidence page or email EvidencePortal@dcj.nsw.gov.au

The research evidence on the portal is not relevant to my area of practice.  Will you conduct more evidence reviews?

We want to keep the information on the Evidence Portal relevant and useful.  

Over time, we hope to conduct more evidence reviews to expand the information on the Evidence Portal. 

If there are any specific research topics you would like to see included on the Portal, please email: EvidencePortal@dcj.nsw.gov.au


Last updated:

13 Jun 2022


