Evidence Centre seminar: July 2018

Published: October 26, 2018

What is the Integrated Data Infrastructure (IDI), and how can it help us measure the impact of Oranga Tamariki services on children and their families?

Our July seminar featured presentations about the IDI and how it can help us measure the impact of Oranga Tamariki services on children and their families.

Anna McDowell: Overview of IDI

Anna provides an overview of what the Integrated Data Infrastructure (IDI) is and how Stats NZ manages it.

She covers a brief history of how it began, how Stats NZ keep the data safe, the kind of insights researchers are gaining to help New Zealand, and where to next.

Seminar video

Transcript

Integrated Data Infrastructure — video transcript

Anna McDowell — Senior Manager, Stats NZ:

Kia ora koutou. Thanks very much for having me, it's great to have such a good turnout and I'm of course looking forward to the second presentation more than the first!

So, we talk a lot about the IDI both in New Zealand and internationally, so most of this I've worked through a number of times, but there's a couple of new slides and I'm going to make sure that I get the numbers right on a couple of things, especially when I'm talking about research that Oranga Tamariki is doing.

So, here we go: let's get started with what the IDI is. I think the key things to say about the IDI are that it is linked data at the person level, so it's as granular as we can get it, and it provides the ability to look at things longitudinally, so information across time. We have data from all across government included in there, so there's a lot of admin data and a lot of surveys, in particular the Census and social surveys from Stats NZ, and there is increasingly some NGO data in there as well. We update the IDI on a regular basis; that pattern is generally quarterly, although over the next 12 months it won't be quite as regular as normal. That's so we can get on with a bit more development work, but I can cover a bit of that at the end of the presentation. There is also a link at the bottom there, and we'll be able to share these slides. On our website there is a full list of the data that's currently available in the IDI, and that's pretty extensive these days.

Really, this snazzy little picture just shows one of the most valuable things about the IDI: it gives you the ability to look across people's lifetimes, at the interactions they have and the different outcomes they experience, and to look at the different relationships and how they work.

So, the IDI is a person-level dataset, so the analysis is done at that level, but what comes out of the IDI is about the characteristics of groups. It's all de-identified and confidentialised and private when it comes out, and when I talk about how we keep the data safe we'll talk more about that. This is probably as technical as I'll get (and it is actually "Spine", not "Spin-E"; that's what happens when you transfer between different systems), but it's really nice to be able to explain to people, in the easiest way we can, how we put the IDI together.

So, we have a spine, which is made up of the core linking datasets for the IDI. There are three datasets that make that up: the birth register from the Department of Internal Affairs, tax numbers from Inland Revenue, and what we call visa information from MBIE, so that's information about people coming in and out of the country. With those three datasets linked together we have greater than the population of New Zealand in our spine, obviously, because of people coming in and out, but that allows us to have pretty good coverage, so that we can link all the other datasets, and we refer to those as nodes to the spine.

We've got two ways in which we do the linking. One is what you call deterministic, where we do exact matches, and that's based on using unique identifiers that come from different parts of the system. As we all know, there's not one number that links us all together in New Zealand, but, for example, you have the National Health Index number from the health system, or you have your IRD number for taxes and things like that, so we use those where we can.

The second way in which we do linking is what you call probabilistic, and that is where we use personal identifiers (name, sex, date of birth, and increasingly address information) to be able to do further linking and make more links; clearly, if we had one unique identifier we wouldn't need to go to that level. In terms of precision around the linking in the IDI, the easiest way to describe that is in terms of false positives, where we've made a link and it's not a true link. Within the spine linkages (the three datasets in the top corner there) we keep that as close as we can to 1% false positives. With all the other datasets that link to the spine, that's at a 2% threshold, so you can see there's a wee bit of error that comes in, and that's to do with the probabilistic linking and because the quality of the data is a bit variable. Because the IDI is a dataset where you're looking at the characteristics of groups, and that's our output and that's what's used, that threshold for a bit of error is acceptable. If you were using a linked dataset to do case management, so to intervene in people's lives for good or because they're not doing things that fit the rules, you would need to look at your methodology again and make some pretty clear decisions about whether those thresholds are acceptable or not, because there are some pretty real-life implications for that. So those thresholds and that level of accuracy work for the context in which the IDI is used.
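
To make the two linking approaches a little more concrete, here is a minimal sketch in Python. It is not Stats NZ's actual implementation: the record fields, weights and threshold are illustrative assumptions chosen only to show the idea of an exact match on a shared identifier followed by a scored probabilistic match.

```python
# Illustrative sketch of deterministic vs probabilistic record linkage.
# Not Stats NZ's implementation: the field names, weights and threshold
# are assumptions chosen to show the idea, not the IDI's actual settings.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Record:
    ird_number: Optional[str]  # unique identifier, if the source supplies one
    name: str
    sex: str
    dob: str                   # "YYYY-MM-DD"
    address: str


def deterministic_match(a: Record, b: Record) -> bool:
    """Exact match on a shared unique identifier (e.g. a tax or health number)."""
    return a.ird_number is not None and a.ird_number == b.ird_number


def probabilistic_score(a: Record, b: Record) -> float:
    """Weighted agreement on personal identifiers: name, sex, date of birth, address."""
    weights = {"name": 0.35, "sex": 0.10, "dob": 0.35, "address": 0.20}  # illustrative
    score = 0.0
    score += weights["name"] if a.name.lower() == b.name.lower() else 0.0
    score += weights["sex"] if a.sex == b.sex else 0.0
    score += weights["dob"] if a.dob == b.dob else 0.0
    score += weights["address"] if a.address.lower() == b.address.lower() else 0.0
    return score


def link(a: Record, b: Record, threshold: float = 0.8) -> bool:
    """Accept a link deterministically if possible, otherwise probabilistically.

    In practice the threshold is tuned against clerical review so that the
    estimated false-positive rate stays within the chosen tolerance
    (e.g. about 1% for spine links and 2% for datasets linked to the spine).
    """
    if deterministic_match(a, b):
        return True
    return probabilistic_score(a, b) >= threshold
```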

So, Gail Pacheco from Auckland is one of our long-standing IDI users, and when I explain a bit of the history in a second you'll see that the IDI developed very much as a labour market-focused tool, so you'll see why that's where a lot of her work comes in. Gail is actually great at providing us with some really handy quotes, and this one is really nice where it talks about the point of difference, the greater links, and the questions that can now be answered that previously couldn't, when data was sitting in its various silos, so the potential value there is pretty high. And this one here is just a nice example as well, about how in New Zealand we've managed to do this: we're a pretty small country, and this is something that centrally, in terms of government, has had a lot of backing, and we've worked really hard across the system. Because although Stats NZ puts it together and is responsible for the service around providing access, we definitely can't do this without the data coming in, the suppliers, the users, and the willingness of people to work with that.

So, we have a lot of interest. Internationally, it tends to be more from different states or territories in other countries, but if my team was constantly working through all the requests we get internationally, we'd be pretty busy with that rather than the core job, so that gives you a bit of an indication. Earlier this year there was a D5 International Digital Showcase here in New Zealand, in February I think. We were asked to be part of that, and I think it's safe to say we were a little bit nervous, because we thought there were going to be some real whizz-bang digital things there, and we kind of forget sometimes how valuable and useful this actually is.

So, we went along, and in these photos here you can see that one there is our Government Statistician, Liz MacPherson, talking, I think, the delegation from Uruguay through different things. And they'd heard about us before they came along, so they were really keen!

And then the other is Andrea Blackburn talking to, I think, the Chief Information Officer from Canada, so we were quite blown away with the interest people had in that. And it is definitely a theme that what we've managed to achieve in New Zealand is quite world leading, and so others are definitely looking to us.

A bit of background now about the IDI and how it got started. It goes all the way back to the late 90s, when Cabinet endorsed Stats New Zealand as the right place across government to be doing data integration when data needed to be brought together from different agencies, and so we got on and we did that. The context, the environment, and also the leadership at Stats at the time was very much one where you would put two datasets together, link them, and they would sit in their own environment for the specific purposes of the project they were put together for. One of the first really important things that came out of that was in 2005, when the linking of employer and employee data was put together, and that's what we refer to as LEED. That LEED database is really the foundation the IDI grew from.

With that, people were starting to look at labour market-type questions. Really quickly there were questions about tertiary education and how that fed in, so that information was linked. Then there were questions about what's happening if people aren't working (they might be on a benefit), so there's some benefit information. And then in 2011 the then Department of Labour received funding to have migration information included and, at that point, for those somewhat siloed linking projects to be pulled together so that all that integrated data could be used together, because a lot of the pull for the migration information was that a lot of people are overseas and we wanted to know what was happening, and it rounded out the questions people could look at even more. That's really the point where the prototype for the IDI came into being. And then really quickly after that we saw some pretty rapid change in the priorities and needs across government.

In 2012, the Better Public Services measures were launched, and they very clearly needed more data sharing across government; there were some bigger and gnarlier questions. And in 2013, under the Analysis for Outcomes funding, it was decided the IDI was the data-sharing solution. Under Analysis for Outcomes we worked in partnership with the analytics and insights team at Treasury, so we were the data-sharing solution and they provided the capability and the know-how for how to use the data and what needed to go in, and things like that, as we expanded.

So, we worked really hard and got a lot going across the few years afterwards, and it's very true to say that the growth and demand continue. This year I think we've got double the number of new projects coming into the IDI that we had this time last year, and we've grown in the last few years from, you know, one or two hundred users to 750.

A lot of that is exponential growth that we weren't necessarily prepared for in the timescale we've had, so we're working through that. There are some pretty clear enabling factors behind that rapid growth and the success we've had, and also enabling factors for why the data-sharing solution would sit at Stats in the IDI. The first, although it's getting old and it is up for review, is the Stats Act 1975, which has been a good enabler: it provides really good guidance and outlines for safe access to data and maintaining good processes, and it meant that because we had the Act we already had a process in place for how access to unit record data was worked through. The Act also gives the Government Statistician a certain level of independence, which ties into the perception that Stats is perhaps a safe pair of hands: we're not an operational agency, our operation is making sure of quality in terms of data, but we're not out intervening or working hands-on with the public in the same way. So, that again can be quite helpful.

Ministerial champions are, I think, not to be underestimated in this context, so definitely the Minister of Statistics, but if you look at the timeframes in which that growth and expansion happened, Bill English was the Minister of Finance as we were starting on that journey and then became the Prime Minister, and he was particularly enthusiastic and influential, and really hot on data getting in there. So when internationally people talk to us about how we've achieved things, we do talk really clearly about having good levers at the right levels as well. And definitely the environment around government, with the emphasis on data sharing and evidence-based policy through the likes of Better Public Services, Analysis for Outcomes, and social investment, was a pretty key enabling factor, and with the change of government and the focus more on wellbeing, child poverty and those pretty gnarly social questions that we're all dealing with, that emphasis hasn't really changed either; we're still growing. Social licence is also really important.

I mentioned that Stats is seen as somewhat independent, and there is a reasonable level of trust there with the public. That doesn't mean we should be relaxed or take that for granted, so we work really hard at maintaining it, understanding what the pain points and thresholds are at given times, and that sort of thing. Really important: how do we keep the data safe in the IDI? Privacy by design is something we take very seriously, and I think it's really important to mention that we are responsible for producing the IDI and we're accountable for it, and we're the trustees of this data for you, me, everybody in the room, and everybody in the country. It is a national asset; we are not the owners. So we take our responsibilities around accountability and trusteeship really seriously, but there's a difference between outright being the owner and having that role. The framework we use to keep the data safe is what's called the 'five safes', and it's not one we built from scratch ourselves; we stole it from the UK, but we have probably developed it a little bit more and differently, and have it being used in slightly different ways. These five elements work together as a team, and in some ways the settings will be higher or lower depending on the context.

The one that tends to be a little bit lower all the time is Safe Data, but I'll explain why. The first is Safe People: we do reference checks, and researchers sign declarations of secrecy and undertakings about their behaviour, and the rest. We have the ability to sanction behaviour that is outside the rules (people can be banned, or even prosecuted), although we have not had to do that. We have a good trust relationship that we work on with the researchers. So, Safe People: we're really clear that we understand who is getting in and using the data. Safe Projects is also really important.

The data has to be used for statistical or research purposes, and there has to be a public interest element to the work that people are doing. All projects are assessed on a case-by-case basis; we are currently working through how we can change that process to make it more transparent and a bit more timely, because it can be a little bit cumbersome for researchers, so it's about finding a good balance there. All decisions around access are made by the Government Statistician or a delegated person, and nine times out of ten that falls to my role at the moment, so we work through that really carefully. Then there's Safe Settings.

We ultimately supply the safe setting, and the IDI is run through a virtual private network, so we take care of all the security and things. People cannot email from the terminal when they're on the IDI, they can't print in there, and there are rules around the places in which you access the IDI: we have some 30 remote data labs around the country and three in our Stats New Zealand offices, and there are quite strict physical requirements for how those data labs have to be set up. Safe Data refers to the data itself. When researchers are in data labs working within the virtual private network they are using de-identified data, that is, data where names, dates of birth, and exact addresses have been removed and any unique identifier has been encrypted. That data is not yet confidentialised at that point, but there's a very good reason why: one, Safe Data works alongside the other four safes to ensure the right environment; two, researchers need the ability to really get in and use the data to get the best insights they can; and three, Safe Data works really closely with Safe Output, where before any outputs are released from the virtual private network or the data lab environment, the confidentiality rules (which we circulate, and they're all available on our website) need to be applied to the data, and we make checks to make sure they've been applied appropriately.
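
As a purely illustrative sketch of what applying output confidentiality rules can look like, the Python below suppresses very small cells and randomly rounds the remaining counts before release. The rounding base and suppression threshold here are assumptions for the example only, not the IDI's published rules, which are set out by Stats NZ.

```python
# Illustrative sketch of confidentialising a table of counts before release.
# The rounding base and suppression threshold are assumptions for this
# example only; the actual IDI output rules are published by Stats NZ.

import random
from typing import Optional


def random_round(count: int, base: int = 3) -> int:
    """Randomly round a count to a multiple of `base`, unbiased on average."""
    remainder = count % base
    if remainder == 0:
        return count
    # Round up with probability remainder/base, otherwise round down.
    if random.random() < remainder / base:
        return count + (base - remainder)
    return count - remainder


def confidentialise(count: int, base: int = 3, suppress_below: int = 6) -> Optional[int]:
    """Suppress very small cells, then randomly round everything else."""
    if count < suppress_below:
        return None  # released as "suppressed"
    return random_round(count, base)


if __name__ == "__main__":
    raw_counts = {"Group A": 4, "Group B": 17, "Group C": 230}
    released = {group: confidentialise(c) for group, c in raw_counts.items()}
    print(released)  # e.g. {'Group A': None, 'Group B': 18, 'Group C': 231}
```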

So, before any data leaves our environment it is confidentialised to protect privacy. I won't talk through all the points on this except to say that the key point is: if people are wanting to use the IDI, the best thing to do is come and have a chat to us before you put an application in.

There are a few things that would be a red light, and they're the things you'd expect: case management, some pretty complex legislative issues meaning the data can't be shared or used in that way, research for regulatory purposes, things like that. There are a number of things that might fall into the sort of 'you need to check this with us' category, and we have those conversations frequently and work them through with researchers, and then you've clearly got your green-light category. So it's really, really uncommon that a project is just straight-out red and won't move through. The conversations we have with people help them understand why the rules are the way they are; it's not bureaucratic for the sake of it, there are really good protections in place for really solid reasons. I mentioned social licence earlier, and what the public thinks is really important to us.

There are a couple of pieces of research that we have had done. One, around the public's attitudes to data integration, we contracted Opus International to do, and it showed quite clearly that there were some clear settings, processes and values that were really important to people, and the IDI and its processes fit nicely into that, at different levels of the spectrum, but we were largely in step. Then some further work, done in conjunction with the University of Auckland and funded by the Data Futures Partnership, confirmed that and showed that people actually have a baseline expectation that data will be used in different ways and that it should be used for good. The further step with this one was that most of the public still don't have good visibility of the IDI, and there are many reasons for that, but it's really good for us to understand more about that, about what our messaging needs to look like and how we put things out there. So that's something we're constantly working on, and checking social licence is something we need to do regularly as well; you don't just check it once and go 'okay, we're good', because context changes and different things are happening in the environment.

So, this is a definition of social licence that the second piece of work I talked about came up with. Basically, we're saying here that there are different norms, and those norms dictate what sorts of actions are okay and what's off limits, and that's where your social licence comes from. In terms of the way the IDI is being used, I've got just a few examples in here; we could probably talk about this for half an hour, and I wouldn't be the best person to do it either, it would be better to get the people up who actually do the research, which is why I'm looking forward to the second presentation. Nonetheless, I'll give you a bit of an overview, and really that slide's just saying there are lots of things. I think the key point in this slide, because there are lots of scientific methodological words in there, is that the key uses of the IDI at the moment are not driving new methodological or statistical ways of doing things; rather, because the data is for the first time being pulled together in one place, people are able to apply the statistical and analytical practices they've been using for a while in quite a different setting, to really hone their skills and answer questions they hadn't been able to before.

There are really three major ways in which the IDI is currently being used. Evaluations and quasi-experiments: as opposed to a true experiment, where an intervention group and a control group are allocated randomly, in a quasi-experiment the allocation is non-random, so you have to work through how you can nut that out, and propensity score matching is quite often used. Micro-simulations are increasingly being used in the IDI; we have a number now, and Oranga Tamariki has a pretty hefty one going, and there are more coming. And then there's profiling of sub-populations, which is funny because it sounds intense the way that's written, but it's actually your basic statistical work around counts and averages and things like that. I would add that there is also research into statistics that goes on in the IDI; Stats New Zealand does a lot of that, and it's about how we might be able to use admin data more than surveys and the different ways we can do that.

And so the key message there really is that having the data in one place like this allows people to be able to test the methods that they've been using with different types of research questions.

This example, the outcomes of tertiary study work that the Ministry of Education and the TEC do, is probably an oldie but a goodie, I think is the best way to describe it. It's a really nice example of where integrated data can be used in real life: they look at the outcomes of tertiary education, and on the Careers New Zealand website there's the ability to compare different degrees and have a look at what you might be earning one, two, and five years out from study, whether you might be in the country, how likely you are to be employed, and things like that.

The Social Investment Agency have focussed on investing for social wellbeing and in this example they were looking at the impact of social housing on people’s lives, and you can see that there was some good positive impact for that.

They were originally looking at a full return on investment, but the quality of the data that was available didn't really allow for that; nonetheless, you can see there's less time in prison, more time in school, and better access to support. This is a really nice example, very current, exploring the gender pay gap; in particular, this piece of research was looking at the impact on wages for women.

The main finding was that the hourly wage difference between similar men and women without children was 5.7%, but 12.5% between men and women who were parents, with high-income women being most affected. What's really cool about this is that the researchers teamed up with the Ministry for Women, and the Ministry for Women have produced a booklet providing practical advice for employers about how they can apply the findings of this research in their own work settings. The researchers have been working in conjunction with them, so Gail Pacheco, who we talked about earlier, goes out and talks about the research, and the Ministry for Women go along with her and talk about the booklet. So that's been a really nice example.

This one is a little awkward for me to talk about because it's Oranga Tamariki's work, so there will be plenty of people in the room who probably know much more about it than me, but it's a really nice example. The big curve represents the real numbers of children, and the data is used to measure whether they are in need or not, from the blue "not apparently in need" moving through to the orange, where there's a high level of need and Oranga Tamariki is probably intervening in some way. The five measures at the bottom are the domains of wellbeing against which this is measured. And I think it's really important to say here too, and we often have this on the slide, that the IDI was used, as an important part (not the only part), in the establishment of Oranga Tamariki as well, and that was off the back of some work the analytics and insights team at Treasury had done looking at the key risk factors for children at high risk of poor outcomes.

NGOs and iwi: I mentioned we've started to get some NGO data coming into the IDI, and the IDI has, due to the complexity of the data and the way it's developed, a pretty high baseline threshold of analytical expertise to get in and be able to use it. That's one of the realities of where it's at right now, and clearly that is a barrier for different groups, so we need to do some more work with iwi and NGO groups, but also with our partners across government, to decrease some of those barriers. That's not a nut we've cracked at this stage, but we are working on it. I think the other important thing to mention is that we are looking at better ways to make sure cultural perspectives are included in how access to data is considered. I talked about the 'five safes' that we use, and we are working with various people and groups to provide a cultural lens across the 'five safes', rather than adding in a cultural safe, so we're actually embedding those values and principles within our access protocols as well, and we're really excited about that.

What have we learned? Benefits and limitations. This is just some of them, and I think in the blue box it would also be good to mention that it's pretty difficult to get outputs out of the IDI these days as well, and it's not as timely as it could be; it can take a while to get new data in, so we have a development project in play where we're looking to address some of that as well.

If I've got time I'll whip through a couple of slides on that. There are some pretty clear lessons we've learned. You've got to be really aware of your enabling factors and grab the opportunities when they come, and that's really important: "if you build it, they will come." And that's true! In hindsight, we probably weren't as prepared for a lot of that as we should or could have been, and that's meant some of our processes and policies probably haven't evolved at the same rate as use, or we perhaps didn't anticipate some of the maturing demands on what people would want to do with the data, so we are working on that now. Data providers are generally willing to make their data available. Understandably, there are a lot of questions that come up and we need to work through them with people, and we need to be really transparent about what our processes are, what protections we have, how we work through a privacy impact assessment, and how that is done before data goes anywhere near our system, and things like that. So we've learned a lot about how you work through that. Technical skills are a barrier to extracting value, which I've touched on, and we will continue to work on that.

We had some early issues with system access and capacity. I wouldn't say they've necessarily all gone away, but we understand what they are a lot better now. You must always, always check in on your social licence and not do things in a vacuum, and really the key goal for us is to move from a naive sense of trust to a more informed sense of trust. We feel we'll have a much more stable foundation if we can move to that, and that's something we're constantly working on as well. Flexibility is key; I mean, you've got 750 people, and some of that is entire teams working to budget cycles and things relying on you, so you need to work some of that out.

I mentioned that we're developing at the moment, and that's really imaginatively called IDI 2. Because of the way the IDI has taken off and grown, I often liken it to the fact that we've just kept building more storeys onto our skyscraper, and where we're at now is that we actually need more of a city infrastructure and some arterial routes, to be more sustainable and scalable and responsive. So we've been out and talked to customers, and that list there gives you some idea of the things that are still difficult and tricky for people. It's not all about technology; it's also about our need to interact and have a better network. "I need more metadata, and I need the metadata really close to the data": some really practical things as well. "I need to be able to work iteratively and at pace, and sometimes your processes just really get in the way of that; how can we find a good balance?" And, as a data issue rather than just an IDI issue, the quality of the data isn't always well understood, which makes it difficult to work with, or "I'm not sure if I'm making the right assumptions here", things like that. But because of the nature of the IDI, those things become really apparent really quickly.

So, we are looking to redesign, and we've started at a high level with an across-government and wider working group, working through workshops and then sprints in between. The emerging direction is the need to ensure our platform is stable and flexible to enable growth, and the two key value streams that will be a focus are enabling research and insights to be done better, so making life better for the people who are already in there, and more focused, more deliberate work around partnerships.

So I think we've been running to keep up so much that we've probably missed some opportunities to do that more, and that's with people using the data, people wanting to use the data, people producing different products off the data, and the like. The idea is that by making life better for the people who are in there already, the value will flow wider again, so that for the people for whom it's actually a really high barrier to even get in there, there will be more coming out in a two-way flow and we'll hopefully be able to work together more. Where we're getting to now is that we'll be working more with our technical lead group around the detail: what the requirements really are for what you need on a daily basis when you're going in and using this, that sort of thing, and then making decisions about how we update the IDI in future.

So, sorry I had to rush a bit at the end there, but I think, I don't know, we probably don't have time for questions now; we'll do it at the end... Yeah, cool.

End of transcript.

Davina Jones and Paul Arts: Evidence used to evaluate Family Start

Davina and Paul talk about what we know about Family Start to date and the different types of evidence used to evaluate it.

They also discuss the tangible impact linked data is having on the Family Start programme.

Transcript

Evidence used to evaluate Family Start - video transcript

Davina Jones - Manager of the Evaluation Team, Oranga Tamariki Evidence Centre:

Tēnā koutou katoa

He mihi nui, he mihi mahana ki a koutou katoa

Ko Davina Jones tōku ingoa

I whānau mai ahau i Leicester i Ingarangi, kei kororio

e noho ana, kei Oranga Tamariki

e mahi ana. Kia ora, huihui mai tātou

Welcome everybody.

I'd like to welcome my co-presenter today, Paul Arts, who's the business owner for Family Start within Oranga Tamariki. I'd like to welcome our partners from Allen and Clarke, who have recently won the contract for our current evaluation of Family Start; they'll be working closely in partnership with the Evidence Centre on the current evaluation and have put together a multidisciplinary team for this work. I would also like to welcome the Family Start providers themselves who have made it to our presentation today. Thank you for being here.

So capitalising on big data.

As Anna said, we're really lucky in New Zealand that we've got a dataset as rich as the IDI; it is the envy of many other countries. And at the same time I'd like to make a call for first principles: that we start with the key questions we want to answer, the key research questions, the key evaluation questions, rather than jumping too quickly to the method or the dataset.

So yes, we're very excited about capitalising on the IDI and the possibilities it allows that were not there a few years ago. And at the same time there are other things we want to know: not only is Family Start effective, but how is it effective and why is it effective, or not, as the case may be.

So, a quick rundown of what we're going to talk about today. Paul, first of all, will give us a rundown on the history of Family Start and what the programme actually is. We'll then reflect together on the evaluation history of the programme and what we already know so far, which brings us to the current juncture and the rationale for evaluating now: understanding what works, whether we are making a difference with this programme for tamariki and whānau, and adding rich understandings from the ground into those insights.

So over to you, Paul. 

Paul Arts - Pa Harakeke team leader, Oranga Tamariki:

I'm just going to give a bit of an overview of and an insight into what the Family Start programme is, and I'll do that from three angles. Firstly, who are the families that are accessing the Family Start programme: a bit of an overview on that. Secondly, what the service is, so how families experience the Family Start programme. And thirdly, I've got a little video clip that I wanted to share with you.

So who accesses the Family Start programme?

Well it is a targeted programme. So families that are expecting a baby or have an infant under 12 months of age can come on to the programme and then they can stay on the programme until the child turns five.

The programme is designed to support and assist families who are facing significant barriers to providing that nurturing and caring environment for their children.

So the sorts of issues or obstacles that families are facing are things like mental health difficulties, family violence, addiction problems, and housing insecurity. These are the things families are facing that are impacting on their ability to care for their baby or infant.

So then what does the service look like for the families?

Well, families receive intensive home visiting support: weekly or fortnightly home visits, so a fairly intensive service. Together, the Family Start worker and the family put the baby (the infant, the toddler) right at the centre and look at everything that's impacting on the welfare and wellbeing of that infant.

So essentially anything in the environment that's inhibiting healthy child development is potentially in scope for the Family Start programme. Central to that is what the family are doing, what the parents and caregivers are doing: how they are interacting, how they are providing, how they are creating that nurturing environment that all of the evidence and science shows a baby needs to develop optimally. So core to the programme is specialist information and advice on parenting and child development.

So what I'll do now is I'll have a short video.

So this one is part of a kete of resources that are available to support the front line. I think it's quite nice because it will give you a bit of an idea and a bit of a feeling for the programme and what it is, and you'll see spread through it the sorts of resources and tools and prompts that front-line staff use.

(Video starts playing. It shows an animation of a carer visiting a family at home. Soft music plays)

Woman:

Before I try to develop an effective plan with whānau I work on building trust and establishing a positive relationship with them. Setting goals is much easier then. I talk about what I've seen during my time with the whānau, particularly describing the child's behaviour. This keeps things real.

Any plan should relate to the strengths and needs assessment of the whānau. The plan should look at practical ways to meet any outstanding needs or risks for pēpē. While some of the goals in the plan might need to be focused on issues like stopping family violence, getting help with addiction problems, or finding a house, I always make sure to include a goal purely focused on parenting. I talk with parents about things they can do with baby every day that are fun and free, such as talking, singing or reading.

(Resource booklets show on screen)

I check out the Whānau Says and Ngā Tohu Whānau sections in the Whakatipu booklets. They're full of ideas about how parents can help.

(Animation of the carer using a website)

Let's have a look at the age-appropriate child development tiles in the parenting resource. These are designed to help us understand what's going on for baby and how parents and whānau can help. I love using the Ara Matua parenting pathways too. They can help us to set goals that have pēpē at the centre.

I encourage whānau to choose things that they are interested in and include these in the plan. If I am struggling with whānau who are reluctant to work on a plan I talk to them about their dreams or moemoeā for their children.

(Shows a resource booklet)

The Thinking About Parenting booklet from SKIP can be a good way to start this kōrero. I encourage them to share their thoughts about what kind of people they hope their children will be at five, ten or 30 years old. I ask how would they like their child to describe their childhood when they grow up. Most parents are motivated by wanting a better life for their children. 

Reviewing the plan at every visit is a good idea. It gives me a chance to check how things are going and celebrate the whānau's successes, even if they're small. Most important is to keep in mind ngā tohu whānau: six things kids need to grow up into happy and capable adults. If our plan addresses these things then we're on the right track.

(Video ends)

Davina Jones - Manager of the Evaluation Team, Oranga Tamariki Evidence Centre:

So as you can see from the slide, Family Start was developed in 1998. Previously there had been Early Start in the Christchurch area, which started in 1995, and Family Start followed after that with three pilot sites; in the next couple of years it ramped up to a further 13 sites.

In the mid-2000s it was expanded to a further 14 territorial local authorities. In 2011/12 the programme was refocused more towards the high-needs end of the spectrum, with families facing more complex challenges being targeted for the programme. Over that course of time a number of reviews and evaluations were carried out as well.

So by 2015 there had been a number of studies, and those previous evaluations were very useful in terms of honing the programme, how it operated on the ground, and improving it, and we certainly know that it was aligned with the international literature on best practice for home visiting programmes. But what we didn't know was whether it was effective or not.

The previous evaluations were not able to establish effectiveness. So the Ministry of Social Development at that point commissioned a quasi-experimental impact study, a collaboration between the Ministry of Social Development, AUT, the University of Auckland and George Washington University in the States, to carry out that piece of work and fill that gap.

The study used newly available de-identified linked data, and ethics approval was sought both for the linking of that data and for the Family Start study itself.

The key method was individual propensity score matching, which Anna mentioned previously: basically, comparing outcomes for children who received Family Start, born 2009 to 2011, with a comparison group of children with similar characteristics who didn't receive Family Start. There was also an area-level study conducted alongside this that looked at outcomes for all high-needs children in the areas that newly got Family Start in the mid-2000s.

One of the things that facilitated this greater choice of methodology in the 2016 study was that the FS Net data capture system had been implemented in 2008. So with the individual-level study we had 3,291 children who enrolled in Family Start in territorial local authorities where it was available, and they were compared to matched controls in areas where Family Start was not available.

The matched controls had the same ethnicity, benefit status, birth year and a number of other characteristics, which meant that the impact of Family Start was assessed as the difference in outcome between the two groups.
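
To give a flavour of what individual propensity score matching looks like in practice, here is a minimal sketch. The covariates, the logistic model, the caliper and the matching rule below are illustrative assumptions, not the specification the researchers actually used.

```python
# Minimal sketch of 1:1 nearest-neighbour propensity score matching.
# The covariates, model and caliper are illustrative placeholders, not the
# specification used in the 2016 Family Start impact study.

import numpy as np
from sklearn.linear_model import LogisticRegression


def match_treated_to_controls(X, treated, caliper=0.05, seed=0):
    """Estimate propensity scores, then match each treated unit to the nearest
    control on the score, within a caliper, without replacement.

    X       : (n, k) array of pre-treatment covariates (e.g. ethnicity,
              benefit status, birth year, encoded numerically)
    treated : (n,) boolean array, True where the child received the programme
    Returns a list of (treated_index, control_index) pairs.
    """
    # Propensity score: modelled probability of receiving the programme.
    scores = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

    treated_idx = np.where(treated)[0]
    control_idx = list(np.where(~treated)[0])
    rng = np.random.default_rng(seed)
    rng.shuffle(treated_idx)  # avoid order effects in greedy matching

    pairs = []
    for t in treated_idx:
        if not control_idx:
            break
        dists = np.abs(scores[control_idx] - scores[t])
        best = int(np.argmin(dists))
        if dists[best] <= caliper:
            pairs.append((int(t), int(control_idx.pop(best))))
    return pairs


# The programme's estimated impact is then the average difference in outcomes
# between the matched treated and control children, e.g.:
# effect = np.mean([outcome[t] - outcome[c] for t, c in pairs])
```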

The most striking finding was around mortality: there were decreases in mortality, evident especially in sudden unexplained deaths. These decreases were found in the results overall and for both Māori and Pacific children, and the impact size was largest for Māori children.

The mortality decreases were found for Māori children both with by-Māori-for-Māori providers and with mainstream providers. A number of tests were carried out by the researchers to check that those results were not reflecting other efforts going on in those communities at the same time.

The researchers took these results to indicate improvements in the children's home environment and level of care and we hypothesized that the mechanisms could include the input of those Family Start workers who were encouraging families and delivering education around protective factors such as safe sleeping practices, reduced tobacco in the household and breastfeeding.

There were also positive results in the health and education domains which suggested increased service engagement due to Family Start.

So there was an increase in those who were fully immunised at age one and up to age two. There was increased ECE attendance at age four, and there was increased maternal use of mental health services in the first year post-birth, so mothers were getting the help that they needed.

However, we did find a concerning decrease in PHO enrolment at age one but interestingly there was no difference by age two.

So again this starts to raise questions for us. We wondered what might be going on, and whether in some instances families took the Family Start providers and the allied services they were being linked to as providing that medical advice, rather than actually going to the doctor.

An interesting finding was that the immunisation and PHO enrolment result for Māori children was found with by-Māori-for-Māori providers but not with mainstream providers, and that's something we'd certainly like to learn more about.

And again, it raises questions so we were asking ourselves, "Does this reflect improved coordination of the services?"

Where the same organisation maybe has the Family Start contract and also has some Well Child Tamariki Ora services going on alongside. We also found that there was increased contact with Child, Youth and Family (as it was then, prior to Oranga Tamariki), and that increased contact was for the Family Start children compared to the matched controls.

We knew that one of the referral pathways into the programme was CYF referral, so there could be some circularity in the findings there, but the researchers conducted some sensitivity testing and removed from the analysis those children who had had previous contact with Child, Youth and Family, and the result still held.

So again this raises more questions for us really. The findings highlight the difficulty of using administrative data to try to measure whether maltreatment of children is reduced, which is obviously a key question for us here in Oranga Tamariki. The questions we had around that were whether Family Start just brought forward contact with CYF that would eventually have occurred in any case, or whether increased early contact with Child, Youth and Family was in fact preventative. Was it working to reduce harm in the longer term? We would need a study with a longer follow-up, following children through to a later stage of their childhood, to really be able to answer that question.

Another reflection we had was whether Family Start encourages families to seek hospital care, so that where we're not seeing a reduction in injury, increased help-seeking as a result of Family Start could be what's going on there.

A safeguarding effect in terms of contact with Child, Youth and Family is something that has been seen in international studies, for example the Nurse-Family Partnership in the United States, where there are "eyes in the house": an eye is being kept out for families along with those home visits and the relationship that's being formed.

So again, these require further research to fully unpack and understand what's going on in the mechanisms. I'd now like to hand over to Paul for him to explain what sort of utility this sort of study has had for a service design and operational team.

Paul Arts - Pa Harakeke team leader, Oranga Tamariki:

An interesting academic exercise without a doubt. Fascinating results.

But what I just wanted to point to, as the business owner, is the real-life and tangible impact that this linked data analysis is having on the Family Start programme. We've got many examples; I've chosen just one or two in each of these areas.

Firstly, I just wanted to make the point as well: if you are the business owner for a programme, you know the programme intimately, and when you commission a piece of evaluation you might hazard a guess at what its conclusions are going to tell you, right?

Well, except when that evaluation includes some linked data analysis. I can tell you from experience you should be prepared to be surprised, and that was certainly the jump-out lesson from our linked data work so far that Davina's spoken to.

Importantly, it informs investment decisions. So firstly, just those findings of positive impacts from the quasi-experimental evaluation gave the Government confidence to invest more in the programme.

It also informs other investment decisions. We've talked about the early Treasury linked data work that identified risk factors for poor outcomes; well, earlier on, when that was first released, we used that linked data exercise to make decisions about where we would invest.

So that gave us information about where the vulnerable children were, so how much to invest in Hamilton vis-à-vis Invercargill. Linked data has been crucial to making those sorts of investment decisions.

Also within those bigger investment decisions are things like this: you just heard from Davina that we found the treated group had increased rates of ECE participation, and on the back of that finding we were comfortable increasing our investment.

A particular part of Family Start is a subsidy to families to meet the costs of ECE, and on the back of that finding we actually increased that subsidy, which had still been in a pilot phase, and rolled it out nationally. So those are the sorts of tangible investment decisions the analysis has informed. Service design: the analysis has also answered a number of service design questions.

This one here: internationally, in other jurisdictions, there are examples of similar programmes that are targeted particularly at first-time parents, and the question was, should we or shouldn't we be targeting first-time parents only? Sub-group analysis found that the positive findings held true for subsequent parents as well as first-time parents, so that's a fundamental service design question that was answered for us.

It guides strategic relationships.

We heard Davina talking about the reductions in SUDI rates at a statistically significant level. That's really guided us in being much more purposeful in our engagement with the national SUDI and safe sleeping strategies.

The front line: we've got some Family Start providers here today, and for years the people working in the programme have been saying, "Well, we know we're making a difference. We can see the difference we're making, hand on heart."

But it's been really motivating and validating for our service providers to actually have that black and white impact evidence.

Core business intelligence. One of the questions we had as a business was are we reaching the right clients? Who are the families that we're serving?

Anecdotally, we know. The linked data analysis provided us with this picture of the Family Start families. I won't go through it because you can all read it.

This is core business information: who is your client group, who are you reaching? And it was the power of linked data that was able to provide us with this sort of information.

Davina Jones - Manager of the Evaluation Team, Oranga Tamariki Evidence Centre:

So what do we still need to know about Family Start?

Obviously we still want to know whether it is making a difference in its current form. We want to look at outcomes over a longer period of childhood, particularly around that finding of increased contact with child protection agencies (CYF as it was, Oranga Tamariki now); we want to understand, over a longer period of childhood, what those outcomes look like as a result of receiving this programme.

We also want to test the effects of expansion and whether the positive findings would still hold in a new study for the newly served areas. We want to unpack that impact, if it occurs, to understand how and why it occurs. What do parents and caregivers think about the effectiveness of Family Start? How do they experience the programme? How can providers be supported to maximise their chance of making a difference?

And we'd like to develop more in-depth and holistic understanding about how the programme works, particularly for whānau Māori.

So you might be asking: okay, the 2016 study was only a couple of years ago, so why do we need another evaluation now?

I think it's really important to remember that the children in the previous study were born before 2012. Since then a number of things have changed. As I pointed out previously, there's now tighter targeting towards the more vulnerable end of the spectrum, with families with a greater number of challenges. There's a new education resource that has replaced the previous curriculum: it's an online parenting resource, and in the earlier video you saw an example of how it's used in the home with the family. And now we have national coverage.

So all these are new dimensions, and with it being a large spend we need to keep making sure that it's working. The current evaluation has three key purposes. One is accountability for the $7 million per year; we want to be confident that it's a good spend that's making a difference.

There are learning goals as well. We want to learn what's working for families. We want to learn from different cultural perspectives what works and how it works; we want to value those worldviews and make sure those knowledge bases are brought into this evaluation. And we want to work internally with our operational teams to make sure there's continuous improvement to the programme. Holistically, all these branches together add up to the evaluation having real utility, not only for the service design and operational people but also to support our leadership team in making decisions about where to invest and how to invest.

The previous Child, Youth and Family had a much narrower remit, at the statutory end of the spectrum, than the current Oranga Tamariki, where we have responsibilities not only for those in statutory care but also for those experiencing multiple and complex needs who do not require care at the statutory level, and who may need intensive interventions or early interventions.

And the outputs of the evaluation can help input not only to those decisions, but to the development and the honing of home visiting services and practice within New Zealand.

So we're using mixed methods to develop a rich picture in this evaluation: a quantitative stream with a quasi-experimental design using the IDI. We're at the early stages of setting up our partnership with Allen and Clarke and the multidisciplinary team, so we're just developing the plan now.

We've got a qualitative stream with case studies, journals and a look at implementation on the ground. We're incorporating a te ao Māori worldview into this project, and a Pasifika worldview.

Just going back to the te ao Māori worldview: forthcoming shortly in the ANZEA journal Evaluation Matters is a paper on the He Awa Whiria (braided rivers) approach, which takes a retrospective look at the 2016 study to incorporate a te ao Māori worldview and consider how that study can be interpreted and understood using that frame.

In this evaluation our ambition is to embed that approach from the outset, across all streams, and to make it part of the fabric of the evaluation in both the qualitative and quantitative streams. Where we hope to get to is holistic, synthesised conclusions where the evidence streams bounce off each other, there's co-construction of meaning and insights, we braid the streams from the different knowledge bases, and we create the space for dialogue and new understandings to arise.

And on that note I'm just going to finish up with the voices of some of our families who have received the programme.

(A video plays. It shows animations of families with text on the screen. Soft music plays.)

Woman one:

I didn't know the right from the wrong, the discipline, how to teach your children, the lot. Because I've never had that in my life, I never knew there was a better way to do it, a better way to approach it.

Woman two:

Now I'm definitely more confident with my baby. And working with my whānau worker Nicole weekly, I'm able to enjoy life. I have made some new friends and increased my social network. And when Nicole comes out to visit me, we read books and play games with Nirvana.

Woman three:

I learned about the wellbeing of my daughter, about the effect of domestic violence on the family, about my daughter's growth and development, about the effects on the growing brain.

Woman four:

Before starting Family Start, I thought that the kids just grow on their own and stay at home. But I realised that kids need to go outside of the house and socialise with other children.

End of transcript.

Our next seminar is Friday 28 September 2018. Details are still being finalised, but if you would like to be included on the mailing list, please email research@ot.govt.nz