Changemakers are the architects of transformation, turning challenges into their life's calling. They passionately seek solutions to create a meaningful impact on the world. In our interview series, Human Rights Changemakers, we share the stories of individuals and organisations actively shaping human rights, championing marginalised groups and fighting fearlessly for equality.
In October 2023, more than four and a half years after the Christchurch mosque attacks, a coronial inquest began. The inquiry brings the shootings into the public sphere once again, with the aim of addressing unanswered questions and contributing to the healing of those impacted by the trauma of the massacre that claimed 51 lives. It unfolds against a backdrop of rising fear and hate-fuelled threats directed at Muslim and Jewish communities, influenced by the atrocities in Gaza. Nearly five years on from Christchurch, the lingering question endures: how much has changed?
The Lovepost spoke with Allyn ‘Aliya’ Danzeisen, community leader, advocate, lawyer and educator, about the Christchurch terrorist attacks, social media, and the role of government and tech companies in paving a safer way forward. Danzeisen is the national coordinator for the Islamic Women’s Council of New Zealand (IWCNZ) and founder and leader of the Women's Organisation of the Waikato Muslim Association (WOWMA), a youth programme for Muslim women in the Waikato. Years before the 2019 Christchurch attacks, Danzeisen, along with other strong female advocates from the Muslim community, had been seeking to engage with politicians to address concerns about the safety of the Muslim community, the harassment and bullying of Muslim children in schools, the Muslim community’s inability to access services to which they are entitled, and the negative portrayal of Muslim people in the media. Danzeisen found it a slow and largely impossible task to be heard and to secure action. In the process, she experienced first-hand the way Muslim people are discounted, commenting that had she not worn a hijab, her concerns would have been heard.
Perseverance is bringing about incremental change. Of the current coronial inquest, Danzeisen says, “[I]t is important to know the truth—to know what can be done in the future to make sure it doesn't happen again.”
Kia ora Aliya, and thank you for the incredible mahi (work) that you have done. I want to start by looking at the Christchurch mosque attacks from a human rights perspective, and the radicalisation of extremist views on digital platforms. How can we learn from it effectively? If I'm talking with friends or family and I hear misinformation, how do I hold the conversations to address that?
Well, there is a lot that we need to do. We need to be proactive. Our household is the first place where we need to be proactive, and if you think about the Christchurch terrorist alone, we know he spent huge amounts of time on social media. Basically, his community was only on social media. So we know that his evolution, his change, came from that environment. What we also know, because he spent a lot of time online, is that his family were aware of it and didn't know what to do.
The amount of time that someone spends on social media is very important. We need to make sure that they have communities—not just virtual but real-life communities—and are spending time in those communities. So if they're online 24/7, it's a problem because they are not acting in real life, interacting with people in their daily lives, and therefore their view is only from that tech perspective. What we have with technology is that they pitch certain areas based on your clicks, your likes, your network; and as a result, you may be pushed or pulled or invited to areas and therefore your connections become very isolated. Whereas if you're interacting in society, in real life, and have communities, you will be navigating a range of issues and a range of thoughts, and it will cause you to reflect and think deeper into issues.
The first thing is making sure that you yourself have a community outside of the computer and outside of your phone, and that you are interacting with that community and contributing to that community, because your views will be challenged but also perhaps bolstered in a positive way. For example, if you think about the ‘can do’ Kiwi attitude that evolved in real life, it didn't evolve virtually. It was from people discovering together how to fix a bike or how to construct a building. Those real-life activities are the first aspect.
The second is to question. And even question yourself: “Is this a reality?” And I'll use an example of [Donald] Trump. I'm an American. I was born and raised there. But you'll get people who will say that he was being persecuted with all the cases, and they pointed to the [2020] election and said he should have won it. And I said . . . he had 65 cases in front of juries. There were juries judging the evidence, and some of them were judges. There were people reviewing not one, not two—I can get one or two being biased or being wrong, but how do you get 65 different people from different places going, “He's lying”? It gave them pause for the first time. This was an in-life conversation that I had when I was back in the States, and no one had ever said that to them because their network didn't pitch that. It just showed me that they hadn't questioned that evidence; they had only been listening to the headlines that were being kicked to them through social media and through those conversations. We've got to get away from the headlines and taglines and actually go deeper.
And we ourselves need to be reflecting and asking questions. And go, okay, what questions would I be asking if I was standing on the other side of the fence? Those are skills that need to be taught in school, in real life, in person. You need somebody outside of that realm to raise the questions, and you yourself need to be questioning. Higher-level thinking; deeper, inquisitive thoughts, and going, “What if? What if I'm wrong on this topic?” That's what reflection requires. A lot of people aren't taking the time to do it.
Our media need to look at it through this lens too. We're so pushed on the quantitative data, rather than looking at the qualitative, the quality of thought or the quality of life.
We're just looking at numbers: “How many clicks can I get?” Versus, “Was that really a good story that had an impact or saved one life?” If one story saved one life, I would say that was more impactful than if it got 100 clicks. But we're calculating data on the quantitative rather than the qualitative. We're dealing with the on-the-ground, smaller issues right now. But we really need to be able to take time and think about those bigger ones in our aspirations in this space.
Regarding tech, we need to challenge our governments and we need to challenge our tech companies to do better. When we say better, it takes courage to do the right thing sometimes. It takes courage to step back and go, “Shareholders, we're going to do the right thing. And we're going to have to pay for more moderators because we need to make sure that this is a safe product that we're producing. So for the next two years, we are not going to give you the same return on our product that we've offered in the past. But because our community and your community are not fully safe, we're going to invest now so that we have a highly profitable company in five years’ time.” People will flock to safety. But nobody's investing in that safe environment right now. I would point maybe to Bluesky, which some people are trying to invest in . . . they're trying to invest in a safe environment. But right now . . . if we're talking about the big ones, they've got people behind them expecting profits.
So right now, safety issues are not of concern to these companies?
The first cuts that Elon Musk made [to Twitter] were to all of the safety and moderation people. Those were the first places that he cut for profit, [to] generally save money. It's creating a Wild West in the tech environment. When we've talked to Facebook in the past [through the IWCNZ, which I'm the head of], they literally had one office in Singapore to do the moderation for the South. The whole Global South. We’re thinking [they have a] huge number of people, [but] they don't have that investment. Ninety percent of the moderation was for the US, and the rest of the world had 10 percent. They didn't have the people who could even handle the language, let alone moderate and understand the context.
In the past, in one of my lives, I clerked for a judge. I saw one lawyer 'homeschool’ a jury so he was able to talk to the jury without the out-of-town lawyer understanding what he was fully saying. We can speak in language where, if you're not from the area, you won't really understand what I'm saying even though you understand the words. You have to understand context, which means you've got to hire moderators from the areas where the content's coming [from], and the tech companies haven't invested in that. That costs money. But when you earn over a billion dollars, I'd say you could take 100 million and invest that in moderation to make a very safe product, and you’re still returning 90 percent of that back to your shareholders. And you've created a safe product. But they haven't been willing to give up even that 10 percent, and that's a shame.
When companies have produced unsafe products before, what motivated them to do the right thing once they knew? For example, if we take asbestos or things like that, which were causing physical harm for years, they were making profits and they didn't care that they were killing people. So it's not surprising that companies do this for profit. But what's surprising is the unwillingness of governments, including New Zealand's government, to step in more strongly on this and to go, “Nope, sorry, we're going to regulate that you have a safe product, safe services.”
I have to applaud [former Prime Minister] Jacinda Ardern for making the Christchurch Call. A call is asking for people to do the right thing. So she publicly called upon companies, upon everyone, to do what's better. But we need to take that call and actually make some of it regulatory, because not everybody is acting in line with that call. I would say that [with] the big tech companies, some of them have tried to do some good things with it. But they haven't gone far enough, in my opinion.
What else do you think governments need to be doing?
I think that there needs to be a duty to pull down content. So, if you know that this is a dangerous statement, or it's against the law even, they have a duty to take it down or suffer consequences. Not just asking [companies] to do the right thing, but actually making them do the right thing. I'll give an example again from my own personal experience. When a threat came in, the police contacted the company [but] the company wouldn't give the details. I had to take on a big global company to find out who [the threat was coming from], because the police couldn't figure it out and [the company] wouldn't tell the police. What individual has the money or resources [to do that]? A government, on the other hand, [does]. There are huge costs to the government, and there was a big cost to me. So the company is benefiting from the profit but isn't doing the right thing. So, in that case, there should be an automatic report of who it is.
[There are companies] contributing to harm inside the country, so why aren't we taxing them? They come in and they're here, and they're getting profit from us but we don't tax them. I get taxed on what I purchase from overseas but the companies themselves aren't paying taxes. Why? Why, when they're causing harm to society? I mean, there are so many issues here. But they could step in to [establish] a statutory duty of care [when] it's agreed it's harmful. It has to be a quick turnaround, though, so you've got to invest in somebody to monitor that as well to go, “Hey, this has to be taken down”. And if you don't, this is the consequence—there has to be some bite in that legislation. For example, you can sue them for this, emotional harm or whatever. And then if found guilty, this is the minimum penalty that the person pays. That's an example of something, consequences.
I would say that Europe is doing far more than the United States, Canada, New Zealand and Australia. There are good proposals coming out of Australia, but they haven't implemented them yet. Australia is further ahead than New Zealand in this area. States are putting in place some legislation on harmful [speech and] hate, and they're talking about the content, and right now there’s a case going on in Australia against Twitter. That might be a landmark case; we'll see.
So you have a duty to take it down. And if you don't, then any harm that results from it, whether it's emotional or physical, you're responsible. That would motivate the companies.
The social media space, without adequate moderation, can cause great harm and your work seeks to hold companies accountable. Will that be enough?
The other thing that people don't understand in this space—people talk about free speech, free speech, free speech, but they don't talk about a primal right before you get to the right to free speech. There's a primal right to be safe and secure and alive. The right to life, security and safety; that's a primal right. If you are dead, you don't have any rights. All your rights have been taken from you, including the freedom of speech. So there's a nuanced discussion of all of those rights. For example, we've utilised social media to augment our own voices, so we understand and we're not saying stop social media. What we're saying is [that] we want a safe environment to have quality dialogue.
We know the Christchurch terrorist was online almost full time, except for the times he travelled to so many countries to connect with his people who were online. So we know he was created from that, and we know that he killed, and we know the impact as a country. We know the billions of dollars that had to be spent because of one individual's act. So if we don't get in front of it, there's going to be ongoing harm.
I'll give an example that people aren't talking about. The Buffalo shooter was 15 when Christchurch happened. The Uvalde shooter was 15 when the Christchurch attacks happened. I can give you so many terrorist events [where] social media was the central thing. If we don't get in front of it, there are going to be more and more, and it's going to keep spreading. So we have to navigate and we have to trial some things. Nobody's willing to step in and try. I won't say nobody—Europe's trying, but at times it can probably be too harsh. Just trial to see, okay, where is that line? Because right now, no line has been drawn.
So the primal right to safety should come before the right to free speech, but it doesn’t. I hear people saying, “I have a right to freedom of speech and I can say what I want.” Social media can be so unsafe—so dangerous. How do we get people to place safety first, before freedom of speech?
That's where we have to be having those discussions. We have to have those community conversations. I would love to step into that space but there's risk even for myself to go into some places to have the community conversation. Online, it won't get to the people. So we've got to have a network of people who can have this conversation.
The other thing is [that] some communities are so [far beyond their] capacity [there isn’t] time to have those conversations. But we have to have them. You’ll see those of us at IWCNZ just pushing through as much as we can, but we're navigating terrorism content, we're navigating youth issues. I could list 25 things [and] each one of them on its own is a challenge, and we have to juggle them daily. I'm dealing with high suicide rates. But we do have to augment and give opportunities to voices. Governments could help with having those conversations.
I'll express disappointment in this last year, where the hate speech legislation was put out there for a second and then removed. It was narrowed to cover only religion, and then removed anyway. They didn't allow for a good conversation; they didn't even allow the bill to go through, to have the conversation, because they didn't want to have it in an election year. But they also didn't want to have it the year before—so they just don't want to have that conversation. But it has to be had, and it can't just be had with the minority groups that are being targeted. It has to be a conversation led by the majority, going, for example, “Okay, one person has caused billions of dollars of damage to New Zealand from this.” Is anyone having that conversation? At the political level? No.
We've had to change our legislation because of his act, and that took a lot of cost. So it would be nice if the government would step strongly into that space, but they probably won't. And part of it is—this is my suspicion but I can't confirm it—politicians also rely on social media to get their message out, and they're afraid of social media companies. And when you've got a company that countries are afraid of, or politicians who are afraid of retribution from companies, that is not a good global dynamic.
You've been advocating for years with the Islamic Women's Council of New Zealand. How do you continue to hold hope? How do you keep going in the face of so much that you've had to endure?
Well, first of all, I literally wake up happy every morning. I might not be happy by 7 am, but every day I wake up happy and that's just the way I was raised. But how do I have hope? Take a look at Whina Cooper, take a look at Titewhai Harawira, and what they were doing in the 70s to [begin] what is now a movement of language, a movement of culture and pride. It's incremental and it's individual, so I take hope in them. But . . . we have stories of David versus Goliath; those stories we've heard, and they aren't just stories: they give hope. There are examples of it happening.
I would say that we, the Islamic Women's Council, were the canary in the mineshaft. What we were telling people, tweeting virtually [as well as] from our own voices, [is that] we’re at risk. We were saying, “We are at risk. The nation is at risk. New Zealand is at risk. Something's going on.” Christchurch was a bit of a wake-up call, but it was towards the Muslim community, and there was a shift in how people interacted with the Muslim community.
But you will see that everything IWCNZ has said is that we don't want this to happen to any community. What's sad, but what people are recognising, [is that] the risk to our community [wasn’t seen] as a risk to other communities, like the disabled community, or any of the vulnerable communities, like Māori, who are targeted now in a big way. But if you look at 2014–2015, up to the attacks, we were the target. And we were saying there's a problem, but it's like the quote by [pastor Martin Niemöller]:
First they came for the Communists
And I did not speak out
Because I was not a Communist
Then they came for the Socialists
And I did not speak out
Because I was not a Socialist
Then they came for the trade unionists
And I did not speak out
Because I was not a trade unionist
Then they came for the Jews
And I did not speak out
Because I was not a Jew
Then they came for me
And there was no one left to speak for me
It's kind of like that in this space. It has spread. And everybody has to unify and realise the risk if we don't.
In New Zealand, they revoked most of the community education that was in existence. It was around 2008–2010 when they pulled that funding, and all those community programmes were pulled with it. We can't do that. We've got to invest in that as well. Now, where could we invest? Maybe some taxation of social media companies, [which we] can then invest in community programming. Then we have a strong kid who can get out of the computer, make good decisions, live a happy life and use that tool in a powerful way to advance society. So it's all investing.
Those possibilities must be key to holding hope. And for those of us who may not be part of the Muslim community but also want to be able to support or advocate, how can we also be part of the discussions?
You know how people are kind of gloomy right now? In the 1970s—I was young—but they were like this. And I remember worrying in the middle of the night, waking up and worrying about society. But I have also seen a stabilisation in my life. I'm old enough to have seen it. So I think we can get in front of it. But it can't just be Aliya. It can't just be the Byron Clarks online. We can, as a society, do it together, but there has to be some investment. Government has to step into this space. Tech companies have to step up their game, and shareholders have to take a little bit less profit to ensure that. Because at some point, otherwise, it will come back in a negative way to them.
Thank you so much, Aliya. I so appreciate your kōrero (discussion) and your time for this.