LibrePlanet: Conference/2019/Transcripts/australia-s-decryption-law-and-free-software


Australia's decryption law and free software

Amie Stepanovich, Danny O'Brien, Isabela Bagueros, Ladar Levison


https://media.libreplanet.org/u/libreplanet/m/australia-s-decryption-law-and-free-software/


Richard Stallman: Australia's terrifying decryption law. Well, people mostly think about what that law does to ISPs and people who pass messages around. That's bad, of course. But I see a potential danger to free software projects as well: that Australia may order people to sabotage their free software projects.

The question then is: what can we do to defend ourselves from this, to make ourselves safe from those orders, to make our projects safe? Either stopping the orders from applying or, perhaps, watching each other closely if someone goes to Australia. I don't know exactly what the problems are like, let alone what the solutions should be, but I hope that we'll analyze the problems and find solutions.


Danny O'Brien: OK, thanks, Richard. That paints a rather dark picture, and I'm afraid that almost everybody here on this panel will just be painting in more of the details of that dark picture, but with maybe some bright lights far into the future, with your help. So, I think the first thing we should probably do is introduce ourselves, and then, for those of you who don't know the details of the 'Assistance and Access' bill that passed last year in Australia, perhaps we can delve a bit into what it actually looks like, and then we'll talk a little bit about the ramifications. So, my name's Danny O'Brien, I'm the International Director at the Electronic Frontier Foundation. We've been watching this bill, and bills like it all around the world, for about three or four years now.


Amie Stepanovich: My name's Amie Stepanovich. I am the US Policy Manager and Global Policy Counsel for an organization called Access Now, which is an international organization working at the intersection of human rights and technology. I have been working on issues around encryption for ten years now, and have worked on the issue in Australia ever since the Prime Minister said that the laws of math don't apply in Australia, which was several Prime Ministers ago and quite some time back.


Isabela Bagueros: My name is Isabela, I'm the Tor Project's Executive Director. I was actually here two hours ago in this same room, giving a talk. I'm going to be talking mostly about... I mean, we are a project that relies heavily on encryption, so I'm going to talk about what we, as an open source project, are thinking. We also have developers in Australia, so I'll talk about how we are thinking about this whole thing, and we also have some questions.


Danny O'Brien: So I think, Amie, you volunteered to break the bad news to everybody, yeah, about the structure of it. Oh, I can do the setup, actually. So, I think one of the challenges with both the reaction to bills like this and also people's perceptions is that we're either always winning or we've completely lost. I think the first time people really realized how damaging this bill would be was when we'd already lost. The time between it being proposed and successfully voted in was really a matter of weeks rather than months. That meant that people could only really react to something that seemed to be a fait accompli.

The truth, however, is that the winds of change that have been building up to this bill being passed have been blowing for quite some time, really since the Snowden revelations. The Australian bill is part of a movement which in some ways started off being at least relatively positive in what we could have anticipated it doing, but ended up being pretty terrible. Edward Snowden described a set of practices by the intelligence services of the United States, the UK, and pretty much all of the Five Eyes, which are the Anglophone countries that cooperate in their intelligence operations. He described a set of behaviors and projects that I think probably came as very little surprise to many of the people in this room, but had never really been described to the general public or specified in statute, in law. They were just part of the black operations that the intelligence community had been conducting with almost no oversight since 9/11 and even before then.

So, in many countries a couple of things happened. In the United States and in the United Kingdom, those intelligence services were facing not only public outcry but also court challenges. The summary of the court challenges would be: you don't have the power, the legal power, to conduct this kind of mass surveillance, either on a constitutional or statutory basis in the United States, with similar kinds of arguments in the United Kingdom. So what's the response to that? Well, one of the responses could be just to stop doing it. The other one is, well, we're lawmakers and politicians, let's pass a law that makes this legal.

And I say that there's a positive element to that because of course it gives you an opportunity to at least curtail and limit what those processes are, and, you know, to a very small extent that did happen. There is more oversight in general, there are more limits on some of those programs, but those programs themselves have been legitimized. And also, you know, while you're doing a mass surveillance bill, you might as well throw in all the stuff that you want for the next ten or twenty years. So the Australian and the UK laws contained a bundle, a wish list as it were, of what the intelligence services really wanted in order to tackle the twenty-first-century challenges of conducting mass surveillance. At the same time, a bunch of other parts of government looked at the Snowden revelations and went, 'Goodness, we didn't even know this was possible, we'd like this power for ourselves!' This extended across law enforcement, from customs and excise to internal revenue; almost every enforcement arm of government perceived these powers as something that would be incredibly useful for dealing with, you know, parents sending their children to schools in the wrong catchment area, or some violation of the correct amount of chicken feed distributed on farms. So everybody sits there and goes, 'Mass surveillance would work really well.'

So there was a very broad widening of the people who had these powers. And finally, I think, the other part, which started in the UK and really came to a head in the Australian setting, was: how do we insert backdoors in a technical environment where the capability of offering a communication service isn't merely controlled by a few big companies or systems that are very strongly connected to government? How do we tackle the fact that almost anybody can write an encrypted messaging system, and that popular encrypted messaging systems and other secure forms of communication are proliferating? And I'll pass over to Amie to explain exactly how they are trying to do that.


Amie Stepanovich: So, real quick, are any of you from Australia, or do you think you're experts in all things Australia, Australian law? Anybody? Awesome, I'll just make all this stuff up. (laughs) If you're really interested in the bill, probably my finest piece of policy writing of all time was written in response to the Australia bill. It went up last month; it's "The Ten Things Taylor Swift Can Teach the Australian Parliament." Not kidding, it really is a fine piece of policy writing. It tells you the big things to be worried about in the law that passed in December. I'm going to go through a few of those right now.

As Danny said, this law passed very quickly. We had known that they were talking about it for about a year, and that the Australian government was very interested in this topic, but it looked like they were going to just study it on the side and might let it go. A coworker of mine and I testified before one of the committees, and it was looking actually quite promising, and then they dropped the text of this bill. I read a lot of laws, so when I tell you this is the worst law I have ever had to read in my life, that is saying something, and that includes the UK's Investigatory Powers Act, which had passed only shortly before.

I wrote down some notes about what to cover. So, the law creates three authorities, and because people in policy love acronyms, I'm going to use acronyms, but real quick they are: Technical Assistance Requests (request being the operative word; these are permissive), Technical Assistance Notices (these are not permissive, they are mandatory requirements), and Technical Capability Notices (TCNs). And Technical Capability Notices are the thing that we're most worried about, essentially: the orders that require changes in the way technology is built, not necessarily tied to any specific investigation or any specific incident, just in pursuit of ultimately helping either national security or fighting serious crime. Serious crime is anything with a sentence over three years, so, a lot. And all three authorities can be used to help a foreign government, with no limitations on which foreign government, you're probably sitting in one of them, to also deal with serious crime, again anything that government sentences with three years and above, any government around the world, at Australia's discretion.

It applies to basically anybody. If you have a technology or a company and it touches Australia in any way, you are subject to this law. There is a transparency piece: you are not allowed to say that they haven't asked you to do anything. If you say that they haven't come to you yet, you are technically violating the transparency pieces of this law; they are so bad, the gag orders are so broad. It can be used by any number of different Australian national security or law enforcement entities, and that power can be delegated downward. So it's granted to the top officials in each agency, but those people can delegate downward who can exercise this authority and who is allowed to go to the provider and ask. The things that they can ask you to do are known as... [starts laughing] there are times when I describe this law and I can't, it's just so absurd. They can ask you to do 'acts or things'; that is the technical term in the law. The list of acts or things is two pages long; it basically covers anything. So this isn't just an anti-encryption law; I also describe it as a censorship law, as a surveillance law. Basically they can ask a tool to build in any capability that they may want in pursuit of national security or serious crime. The TANs, the Technical Assistance Notices, at least have to be tied to a specific event, unlike the TCNs. And the Technical Capability Notices have a few more restrictions in place, including that they cannot be used to insert a systemic vulnerability.

A systemic vulnerability, just real quick and then I'll turn it over: a systemic vulnerability is a vulnerability that affects an entire class of technology. It is very unclear what that means [answers a question from Richard Stallman with a 'that's next']. A systemic vulnerability affects an entire class of technologies, and that may mean not just one phone, or all iPhones or all Android phones, but maybe all phones. And if all phones are not implicated, then this restriction wouldn't come into play.


Isabela Bagueros: To get into a little bit of what we've been thinking: Tor is an organization that's producing free, open source software, and we do have a developer who is not a volunteer but who has a direct connection with the organization, so someone with a direct connection to the project whom they could come to and serve one of these letters.

I'm going to start with good practices that we have in place already and that I recommend for any open source software. For instance, code review: the person who writes the code is one person, the person who reviews it is another, and the person who merges it is another, so three people go through this process. We are also starting to think about adding a second reviewer to that process. By itself it's not enough. We also follow good practices writing code, right, like unit tests, CI, all these kinds of things, where we are also trying to add tests for catching weird things in the code.

Also, we work a lot with researchers, so if it's not our people looking at the code, there are people from a diverse set of universities looking at the code as well. So, what is the goal here? It's how many people you can have watching your code to help you catch some change or something like that, and to increase that capability. We also created bug bounty programs to incentivize people to look at the code.

And finally, one of the things that can still happen is they can force a person to produce a build of Tor that is different from the build that we normally give to people to download. So it is extremely important to have reproducible builds on that front, so you can help people in that type of situation check and make sure that they're not getting a modified version.
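As an illustration of the kind of check reproducible builds make possible (a minimal sketch with hypothetical file names, not the Tor Project's actual tooling): if the build is deterministic, anyone can rebuild from the public source and compare hashes against the shipped artifact, so a quietly substituted binary stands out.

    import hashlib

    def sha256_of(path):
        """Hash a build artifact in chunks so large files don't exhaust memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical file names: your own rebuild vs. the artifact the project ships.
    mine = sha256_of("tor-browser-rebuilt-from-source.tar.xz")
    published = sha256_of("tor-browser-as-downloaded.tar.xz")

    if mine == published:
        print("Match: the shipped artifact corresponds to the public source.")
    else:
        print("Mismatch: the shipped artifact was not built from the same source and toolchain.")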

We also have our social contract, where we have our no-backdoor policy, kind of screaming to the world that we're going to fight this. So going back to the fight: with our lawyers, our first thought was, let's explore possible laws in the United States or in the European Union that we would be breaking if we were served one of these and had to comply. The GDPR is one that we are thinking about for that type of argument: if we modified something in Tor, you cannot make a modification targeting only the person who committed the crime that they want to investigate; it affects everybody, affects everybody's data, therefore GDPR, maybe, I don't know, I'm not a lawyer, but these are the high-level talks that we are having.

There is also the argument that she mentioned, the last one: it's almost impossible to make a modification that will affect only one person; it will affect everybody. So that's another thing.

We also have lawyers: our head of personnel is a lawyer, and we have three lawyers on our board. Those letters are slightly different from a national security letter, as far as I can understand. With a national security letter in the United States you cannot even talk with a lawyer, but someone in Australia who receives one of these letters can talk with a lawyer. So that person could reach out to one of our board members as their personal lawyer and have that conversation, so it's kind of a legal way for us to know what's going on, in some way, with a proxy, I don't know.

But what all of this means for Tor: in general there are different things that can happen to this person, or to the organization. If you don't comply you can pay a fine; for individuals it can go all the way to a hundred twenty-five thousand, for organizations all the way to ten million or whatever, in the millions. With the gag orders on those letters, if you somehow handle it wrongly, like if the person talks with a board member who is a lawyer and they let us know, and it turns out 'No, this is wrong,' they can get five years in jail just for doing something wrong related to that.

Another thing is that if they don't comply, on top of those five years they can get another ten, so around fifteen years in total for someone in that type of situation.

Also, another thing I'm going to throw out there, and I haven't made up my mind on this, is on the policy side: trying to educate policymakers about lawful hacking as a way to avoid requesting backdoors. If you have a target person, attack that person; don't ask everybody to put in a backdoor to investigate a crime or whatever. This is something that, at least from my point of view, is new; it's a new concept that I'm trying to understand better, that type of argument.

But like I said, we are researching everything, all the possibilities. We're not the ones who make those policies, we're not going to be the ones arguing that, we don't normally do that, we write software, but we are thinking of all the possible ways that we can educate people on solving the problems that they want to solve with these laws.


Danny O'Brien: So, um, I'll try to present a somewhat brighter view of this. We're giggling a little bit, but we're actually laughing because of the ridiculousness of this bill, and we're less sanguine about the ramifications. But I see, and I think a lot of people see, this as a moment where free software can shine, because if you feel that this is something that sounds incredibly oppressive and damaging to the free software movement and the developers within it, it's much worse for the proprietary software vendors, right, because essentially they now have no ability to guarantee that their software or hardware does anything that they promised to their customers.

So what's happening right now, and we know that this is the first step in what the Australian and UK governments are doing, is a feature called the ghost. Hands up, anybody who's heard of it? Okay, great, some more good news for you. The ghost is a proposal, for these powers which also exist in the UK, aimed at WhatsApp, Apple's iMessage, and other proprietary secure messaging tools. And the idea is: well, we don't want to create a systemic vulnerability, we just want to access bad people's communications. So what we're going to do is ask you to change the user interface of your software so that when you're in a WhatsApp one-to-one conversation, we can add another person to that chat and make it a group chat, but the UI won't show that. Similarly, iMessage has a private key that... (input from audience member about the UK government) ...so the ghost is the UK government. The UK government is your friend now. [laughs]

In the case of iMessage, Apple devices do their secure communications by each of your devices having a private key, and Apple has the public key for each of those devices, so when someone sends you a message, they encrypt it to all of your keys. Well, all that the UK government is asking people to do is to secretly add another key to that list of devices, and that's the ghost, and that key belongs to Her Majesty's government.
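A conceptual sketch of the ghost, not Apple's or WhatsApp's actual protocol (PyNaCl's SealedBox stands in for whatever hybrid encryption the real services use, and every name here is hypothetical): in a multi-device scheme the sender encrypts one copy of the message per registered device key, so silently appending one extra key to that list is the entire change being asked for.

    from nacl.public import PrivateKey, SealedBox

    # The recipient's registered devices, each with its own keypair (hypothetical setup).
    phone, laptop = PrivateKey.generate(), PrivateKey.generate()
    device_keys = [phone.public_key, laptop.public_key]

    # The "ghost": one extra key quietly appended to the recipient list by the service.
    ghost = PrivateKey.generate()
    recipient_keys = device_keys + [ghost.public_key]

    message = b"meet at noon"

    # The sender encrypts one copy per listed key; the UI only ever shows the real devices.
    ciphertexts = [SealedBox(key).encrypt(message) for key in recipient_keys]

    # Whoever holds the ghost key reads their copy, invisibly to both ends of the chat.
    print(SealedBox(ghost).decrypt(ciphertexts[-1]))  # b'meet at noon'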

Of course, there's no way that those companies can adequately claim that they've not done this, or prevent a government from coming and knocking on their door and compelling them to do it if they're in that country. With free software, the proof is in the code. If people are compiling the code, then they can at least see whether it's changed. If you have deterministic builds, reproducible builds, you get some protection there. And you can at least begin to make some assurances to your users; even if you can't tell them whether you're complying or not, the code speaks for itself, and there's no way you can avoid revealing that.

And of course there are a million ways in which people can submit obfuscated code and hide these sorts of practices. But free software does have this opportunity to say: we are ultimately the only software that you can trust in an externally verifiable way. The downside is that you're now on the front lines. So when we talk about, when Richard talks about, you know, what we can do about this, well, there's not much that most people can do, but there's a lot that free software developers can do.

Now, I work at the EFF. I am not a lawyer, and this is not legal advice, and I know that our lawyers would freak out a little bit if I were to say this, but what I anticipate is defiance, honestly. What I anticipate the next stage of this will be: they'll attempt, successfully or not, depending on how hard the big companies fight, to put these sorts of systems into proprietary code, and then when people move to free software, good people or bad people, they will come knocking to talk to free software projects. And then this is a test of your mettle, ultimately. This is a test of what happens next. I don't think any of us could ask anyone who is within the reach of the Australian government to make a stand in this kind of way.

But I think it behooves all of us, and you know, these are the things that we as organizations spend a lot of time thinking about, as you can see: how, when that kind of situation happens, people can actually stand up for their freedom. I would say that the one thing these governments don't want to get into is some incredibly public fight with a group of very sympathetic figures, where the real results of these laws are going to be played out in their most damaging and obvious way. Which means that, almost certainly, that situation where you get to make a giant stand is not going to happen, and if it does happen, I think there's a chance of winning.

What will happen instead is a slow drip, drip, drip of pressure in other ways. And one of those ways will be, not necessarily to discredit free software, but to shift the stance of governments who, for the last ten or so years, have actually been in many places relatively sympathetic to the ideas of free and open software: they've funded it, lawmakers have encouraged its adoption, and they've seen it as a bulwark against large tech companies or against the extraction of ridiculous costs from the public sector.

That may shift. All right, what will happen here, and we've seen this in other places, is that people who create software outside of these systems that can be controlled will find themselves described as outliers, described as people who are actively trying to undermine the regulatory environment that brings us so many benefits. And I think that's going to be an extremely testing time, right, because the arguments for these tools are going to be arguments that even the people in this room are going to be very sympathetic to. I'm sitting having discussions with ISPs in New Zealand who have set up censorship systems to remove the horrific video and manifesto of the gunman in New Zealand. It's very hard in that setting to point out that they're building a surveillance and censorship system that will be misused, and that this may in fact be the first step in that misuse. And you're going to find yourself in that situation as well; you're going to find yourself in a situation where people are going, 'Look, bad people are using free software, and maybe you made a mistake in the freedoms that you set out by allowing that to happen, and maybe you should be a bit more socially responsible about that.' And the important thing in that fight is to stick to your values, stick to your freedoms, and realize that you're going to have to make a stand, even when that's an unpopular stand.



Amie Stepanovich: We wanted to save the last fifteen minutes for questions, but I wanted to take, if I could, the next sixty seconds to give, as a lawyer but still not giving legal advice, a few policy things that you can do now.

When the law passed, it had a large enough minority of people saying they would only pass it if it was subject to change this year, that they are now re-analyzing the law. So there is still time to engage on it, if you want to file comments or talk to somebody. In fact, I recommend you talk to a lawyer very early, and that you don't get too cute with this law. When I say that the transparency and gag provisions are big: just, not legal advice, listen to Danny, but also know that what Danny just said is actually not allowed under the law. Specifically, you are not allowed to counsel somebody not to comply with the law; that is a term of the law. So just know what you could be subjecting people to if they're within the reach of the Australian government. Know what its reach is, engage on it now, and engage on other laws that create conflicts of law with Australia. The GDPR could. We're talking about data privacy in the United States for basically the first time ever. That law could really help, and if you get involved in creating this conflict-of-laws system, you could actually create a system in which the Australia law isn't able to stand any more.


Moderator: Please come down to this microphone to ask your questions. If you cannot come down, I'll try to bring a mic up to you, but it's better to come down.


Audience Member 1: Hi, can you better define 'within reach of the Australian government'? As a US citizen, when would this law apply to me?


Amie Stepanovich: So they definitely can get to anybody in Australia. So if you or your entity or organization or company has somebody there, absolutely. Now, the law applies to you regardless: if you have one customer in Australia, it applies to you. Whether they can get to you is another question; that's if you have somebody there. But, no, they are one of the five members of the Five Eyes, along with the US, Canada, the UK, and New Zealand, and they might have some sort of... I don't know what they could do in those five countries either; they're very close.


Audience Member 1 Followup: So do you have to be a business then... or, what if somebody in Australia downloads my free software?


Danny O'Brien: Yeah, yeah, and importantly you don't have to be a company. If there's one thing you want to read in this bill, it's the first table in it, which gives a list of people that they feel can be compelled to do this, and it's very specific in that it says actual persons rather than companies. I think this would be a much easier thing to fight if it was just targeting companies.


Audience Member 1 Followup 2: And then, real quick: within the US, asking somebody to implement those changes would be compelled speech. So how does that fit in? If Australia actually tried to enforce that law against, say, me, how would that work legally within the US?


Danny O'Brien: So, uh, it would be possible to fight it, but when there are disagreements between two different countries about what the law is, there's not an obvious way to resolve that. I think what it would possibly mean is that any attempt to enforce this bill through the US legal system would have some serious problems.


Amie Stepanovich: But they're trying to apply their jurisdiction to you, so they wouldn't be going through the US legal system; they're actually going through Australia, reaching into the US, which is a slightly different structure.


Richard Stallman: But how could they put you in prison just on the say-so of an Australian court? They'd have to extradite you. So they'd have to go through the US legal system. So the question, or one question, is: what can you do to make that extradition as difficult as possible for them, and make sure that it will blow up in a big public stink? And what can you do to arrange that they can't extradite you?


Danny O'Brien: Wait, Richard, come to us, come to EFF. I think it would be very [unintelligible]


Richard Stallman: But do you have to do that right now? Do you do that in advance, or only if it happens?


Danny O'Brien: Only if it happens, but we definitely want to think about it beforehand. It's almost my dream case. Right?


Richard Stallman: If they say 'somebody downloaded your software and is running it in Australia' can you say, 'I don't know if that's true, so I don't have to listen to you.'?


Danny O'Brien: No


Richard Stallman: What's to stop you?


Danny O'Brien: Well you could say it, but

Amie Stepanovich: If they can get to your program, if they can go on the internet in Australia and download your program, that would be enough.


Richard Stallman: Really, they claim that that's the only basis they need?

Amie Stepanovich: mm hmm (agreement)

Richard Stallman: Really. So what is the Ladar Levison way to defy this? I mean, suppose you work for a company, and they tell you to do this, and you quit your job, what happens?


Amie Stepanovich: I think you're fine. I mean, [I'm] not giving legal advice. The company is still subject.


Richard Stallman: Yeah, but you could quit.

Amie Stepanovich: You could quit.

Richard Stallman: And then maybe you could go to a reporter, or...? What would happen? How could they...

Amie Stepanovich: You are still subject to the gag provisions, which are extensive. So, you could go to a reporter but you would probably be breaking the law.

Richard Stallman: But you'd be breaking an Australian law...


Danny O'Brien: This falls under the category of being cute, right? Like, all of these things that you're going through don't actually change the ultimate result. The thing that you pointed out, Richard, which is never go to Australia again, is probably, in some ways, the clearest way, bearing in mind that I'm now advising and counseling people, which is itself a crime under the law, which means I'm not taking a holiday to Australia soon. [irrelevant exchange with audience member] But the other thing I'd point out, and it's not just me being patriotic, is that the UK has very similar laws, very similar provisions that are just written more loosely. They didn't spell out that they could do this; they just said, 'We can compel you to do anything, and if you do it, we can declare it's not against the law.' So they can actually get you to do things which might be illegal under the GDPR or under British law. So, Richard, you want one more?

Richard Stallman: Let me see if there was any more, most of it's been answered. Yeah I guess it's been mostly answered.

Danny O'Brien: Good, that was our goal; our stretch goal was to answer RMS's questions.

Audience Member 2: Hello. Thank you. I just have a question. Just kind of playing this out in my mind: if this happened in the US, there would be a lot of avenues to kind of walk it back, right? I imagine a lot of people would be upset. There would be corporations who, as much as we don't really trust them in general, would still very much be on our side for this. So, you know, they would take it up to the Supreme Court; there are lots of organizations that would oppose this, hopefully. The question I have is: what avenues exist in Australia for similarly fighting this, and are they being exploited as much as possible?

Amie Stepanovich: So we're [looks to Danny: 'March?'] three months into the law, and they're revisiting it. So, nothing that I know of yet, not that I would know, and if I did know I couldn't tell you anyway. [smiles] That said, there are a couple of things. One thing I think is really important to bring up here is that Australia has an amazingly intelligent group of civil society and tech experts and advocates that are working tirelessly to argue against this law, and the reason they are working tirelessly is because most of them are volunteers; they are unpaid, and people don't donate to Australian civil society organizations. They're not well funded, so these people are doing it in their spare time. If you do have resources to throw at this, fund Australian civil society. They're really out on the front lines trying so hard to engage to get this walked back. That said, there will be court challenges, I think. I'm already connecting people with lawyers, because they're asking to be connected with lawyers. Again, I have no idea why. And that is the thing: connect with your lawyer early and often, before you even receive one of these notices, not with the EFF. Know what you can do in the case that one of your employees does receive one, because under the law you can only talk to legal counsel; you can't talk to anybody else. So know who you're going to.


Danny O'Brien: And the other thing I'd point out is that one of the challenges with this is that it's quite hard to build up a case if you don't have a particular event that has caused it. And, barring the gag orders, what that often means is that the worst cases that we're painting here don't happen, because they don't have to happen. Because these people, companies and individuals, will comply with very small things that are sufficient in order to avoid the bigger issues. So you get a very broad set of powers, and that just has a chilling effect: no Australian venture capitalist will... a venture capitalist would think twice about funding a secure messaging service built in Australia, and that's great for the intelligence services. Just that chilling effect is enough.


Audience Member 2: Thank you.


Audience Member 3: I am a lawyer, but not in this jurisdiction. Fifty-four years ago I was putting myself through Columbia Law School as a senior programmer. My question has to do with the gag orders, which I haven't read about and don't fully understand, and their effect on the GPL. If I write a computer program, and I am doing messaging work, which I have never done, and I put something in that's a backdoor, and I label it 'This is a backdoor,' I presume that would be a violation of the Australian statute.

[nods of agreement from Danny O'Brien and Amie Stepanovich]


Audience Member 3: Whoops. Supposing I put it in my software in plain PL/I or some other clause-y, English-like language, and in fact it is a backdoor, even though I don't call it a backdoor. Does the fact that I published my code constitute a violation of the Australian legislation?


Danny O'Brien: So you're sort of saying, I've written some code, I've been compelled to put in a backdoor, I write a backdoor but I write it so obviously that it's almost like saying, 'Look this is the backdoor.'?

Audience Member 3: Yup


Isabela Bagueros: [unintelligible] ...I think there is a piece of this law which says that the change and the implementation that you're making need to be secret. So what you're saying, like when they force you to do something and you make it so obvious for others to find, is actually a good thing; we are trying to understand that piece of the law better, the change to the code and the implementation. Like, if I do put the backdoor in the code, it needs to be completely secret; nobody can know that this is happening. So if you make the code scream about what it's doing, it's actually a way to, I don't know, hack the law or something, yeah.


Audience Member 3: If I had a program that was published under the GPL, I'm required to distribute source, and that will typically include comments. Those comments will typically include change information. I could put comments in without change information, and that would be waving a big red flag.


Danny O'Brien: So in practical terms, the way that this is going to work is that there's going to be a negotiation, right? They're going to come to you and say, 'We would like this put in the software,' and then you're going to do something, and they're going to say, 'No, that's not what we want.' And in fact that's pretty much what we're seeing happening with Facebook and WhatsApp right now: they haven't built in this ghost because they're engaged in a discussion with the government.

Now the government is waving this big hammer at them, but it's very hard to understand what happens when that hammer falls. It's just the threat of that hammer, and that's what you will get into in this situation. It will be really about a constant to and fro.


Richard Stallman: What he just said seems to me to be a brilliant point, which is: if you're talking about free software, and you're supposed to put in changes, well, you have to publish them as source code, because no one will accept them if you try not to. So what it may mean is that there's no way you could possibly... there's nothing you could possibly do that would satisfy the law, or the demand.


Danny O'Brien: And this is why I mean it's free software's finest hour, and also why I encourage people to think about building software or systems like Tor, where it's very difficult to see how someone could get you to do the things that they want you to do.

Richard Stallman: Just because it's free software, if you tried to do what they're ordering you to do, it wouldn't be secret. So it's impossible, it's an impossible demand, impossible on its face. Maybe that means they're defeated somehow.


Amie Stepanovich: There are a lot of 'If I do this, will I be okay?' questions. I've noticed a trend. And so, coming back to it: if you're going to try and be cute, talk to a lawyer first. Please, this is the one thing I want to leave you with. Don't get yourself, or if you have people in Australia, don't get them, in trouble because you've decided that you are smarter than the Australian government. Because even though they think that they are better than the laws of math, and they're not, they are still very smart people, and they will come after you, so just don't do that to yourselves.


Audience Member 4: So I guess my question is about trying to be cute, trying to be cute. Basically, my question is, as a developer for free software projects: I can have, for a really important piece of code, a rule that there have to be two or three independent implementations before they will merge the changes, and in this way the Australian government would basically have to gag-order everybody on the project before they can compromise the security. I guess, maybe that would work?


Danny O'Brien: If it did, we wouldn't tell you. It's not necessarily the letter of the law, it's the effects that they're trying to achieve here. And so it may be that we can craft a situation, and as Richard says, I do see free software as a very powerful bulwark for freedom in this situation. But there will be other ways that the system will be attacked. So, we should bang through these.


Audience Member 5: Do we have any more time? Okay, so if I understand correctly, they're restraining our speech even if we're not being asked to comply by making a change to our software. You said that we still can't do a warrant-canary-type thing and say 'I haven't been told to make changes,' so there's obviously a First Amendment issue here, and my question is...

Amie Stepanovich: Australia doesn't have a First Amendment.


Audience Member 5 continued: Yes, but as a US citizen in the US who isn't planning to go to Australia any time soon now... Australia's great, so that's sad. There's a First Amendment component to this, and so my question is: if I judge it as very unlikely that Australia is ever going to come after me, because my software isn't something they're going to care about, why shouldn't I just put up the canary anyway, say I'm taking a calculated risk, and hope that many other people will also take the same calculated risk and put up similar warrant canaries, or not warrant canaries but Australian-law canaries, saying 'The government of Australia has not told me to change my software'? I'm taking a risk, but it seems like a very small risk.
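Mechanically, such an 'Australian-law canary' is just a dated statement that is re-signed and re-published on a schedule; a minimal sketch, assuming a hypothetical signing key published out of band, and illustrating only the mechanics rather than the legal risk the panel describes:

    from datetime import date
    from nacl.signing import SigningKey

    # Hypothetical long-term key; the verify key would be published out of band.
    signing_key = SigningKey.generate()
    verify_key = signing_key.verify_key

    statement = (
        f"As of {date.today().isoformat()}, the government of Australia has not "
        "told me to change my software."
    ).encode()

    signed = signing_key.sign(statement)

    # Users check the dated statement with the published verify key; if the canary
    # stops being re-issued, that silence is the signal.
    print(verify_key.verify(signed).decode())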


Amie Stepanovich: There's a problem with canaries. The problem with canaries in general is that they apply to little things as well as big things, so just be wary of the problems with canaries that have come up. I will not tell you what risks to take.


Audience Member 5 continued: Okay I understand.


Danny O'Brien: I think the other thing is to just think about it statistically: you're probably unlikely to be the person who will be targeted. And this is our final question, I think.


Audience Member 6: So this is a question from the would-be other panelist, via IRC. He's got two questions. Will they strongarm tourists with this law? That's the first question. And the second question is: last month Australian law enforcement said that they've already started using the law. Do you have any details on whom it was used against or in what form?


Amie Stepanovich: Some of the agencies put some information in their submissions for the revisit of the law. They have used teeny-tiny portions that basically clarified that they can do teeny-tiny things, and as far as we know they haven't been doing the big bad stuff yet, but they are preparing to, and they're in talks to get some agreements in place, some of these Technical Assistance Requests, the permissive ones, which are the scary ones because these are basically legal immunity for all of the really bad companies that want to do bad things anyway. So those are the ways it's being used and about to be used. On the first question, which was...?

Audience Member 6: Will they strongarm tourists?


Amie Stepanovich: I assume so.


Danny O'Brien: You could certainly do that. Again, you don't want a big fight, you don't want to target someone if there is an easier, more traditional way of doing it, but governments do target people who are just temporarily in their jurisdiction.