Assistance and Access Bill 2018


UPDATE: An updated blog post on the passed bill is available here

The recently released exposure draft of the Assistance and Access Bill 2018 [1] redefines the future of government interception of electronic communication. Left unchanged, it will have far-reaching consequences for the security and privacy of Australians. The legislation is both long and complicated; it raises a number of questions and concerns which have so far not been adequately addressed. The following is a look at the legislation from the perspective of a techie; I am not a lawyer. My analysis is based on viewing the legislation as a technical document, looking for gaps and inconsistencies, since that is so often where the greatest threat lies. My opinion is that the greatest threat stems not from the compulsory notices, but from the voluntary requests, which have greater scope and less oversight.

Overview

I’ll warn you in advance that this is a long blog post, primarily because it looks at a long piece of legislation. Schedule 1, which is the subject of this post, is 68 pages, with the corresponding explanatory note running to a further 53 pages.

At a very high level, the legislation introduces two compulsory notices and one voluntary request. Whilst the compulsory notices have gained the most attention, it is my view that the voluntary assistance requests are where the greatest danger exists. The assistance requests are not constrained by the same limitations as the notices in what they can ask for, nor are they part of the annual reporting. They appear to offer the greatest capability with the least oversight. Continue reading for a more detailed look at Schedule 1.

Legislative Objective

The explanatory note [2] released in conjunction with the exposure draft lays out the case for needing the bill, which is largely based on the claim that encryption is preventing effective intelligence gathering. This is demonstrated in statements like:

“…encrypted devices and applications are eroding the ability of our law enforcement and security agencies to access the intelligible data necessary to conduct investigations and gather evidence.”

The explanatory note states that the bill aims to provide a way to gain assistance from organisations to access data despite the existence of encryption, whilst claiming not to undermine cyber security. Whilst the first few pages repeatedly mention national security agencies or terrorism, the powers defined in the bill can be applied in far broader contexts; of particular concern is their application to “protecting the public revenue”. That is not to say that public revenue should not be protected, but intercepting private communication is a very serious invasion of privacy and should only be considered in the most serious of criminal and national security cases.

Not only is its application to the protection of public revenue concerning, but the definition of public revenue in the explanatory note is extremely broad:

“The concept of ‘public revenue’ includes State and Territory revenue in addition to Commonwealth revenue. Lawful obligations charged on a regular basis such as taxes, levies, rates and royalties are also included but occasional charges, such as fines, are not. ‘Protecting the public revenue’ also includes the activities of agencies and bodies undertaken to ensure that those lawful obligations are met; for example routine collection, audits, investigatory and debt recovery actions.

The term ‘revenue’ is not limited to incoming monies from taxation but could also extend to ‘monies which belong to the Crown, or monies to which the Crown has a right, or monies which are due to the Crown’.”

If this legislation had existed before the Centrelink debt recovery program would they have deployed interception as a tool in that process? Should that be considered a reasonable and measured response, or would it be an excessive use of power? What protections are there to stop such an abuse of power taking place in the future?

There are undoubtedly times when interception is necessary for national security, and there will always need to be a balance between privacy, transparency, and security. However, all three of those concepts are important; it is not the case that security trumps all other values, and treating it as such will lead to the formation of a surveillance state, and ultimately a police state. We do not live in a risk-free world: with freedom comes the risk that some will abuse those very freedoms to cause harm and attack the open and free society we aspire to be. Whilst we should endeavour to prevent such attacks, we cannot sacrifice the very freedoms that are under threat; to do so would hand a victory to the attackers. It would be a perfect example of winning the battle but losing the war. Such approaches are becoming more common across the western world, as a vacuum of good leadership leads to the pursuit of objectives instead of the protection of values.

Is anyone not a communication provider?

The legislation is extremely broad in who it applies to. It includes the obvious entities like carriers or carriage service providers (think telcos). What is more concerning is the broader categories that are included; for example, it covers a person that “… provides an electronic service that has one or more end users in Australia”, which appears to cover every website that is accessible from Australia. Furthermore, the legislation also covers an individual if “…the person develops, supplies or updates software used, for use, or likely to be used, in connection with: (a) a listed carriage service; or (b) an electronic service that has one or more end users in Australia”, which appears to cover every piece of software, or mobile app, that connects to the internet or produces content that is going to be used on the internet. That is an incredibly broad category, the justification for which is not clear. It goes further to cover any corporation that creates software that may run on a device that will be connected to a telecommunications network, irrespective of whether the software itself is intended for use over that network.

Since the legislation will be applicable to both individuals and companies, there is a need for clarification as to whether an employee could be the subject of a notice or request as a result of their job. For example, could a notice be issued against a software developer in a company, but not the company itself? Where would an employee stand in terms of the secrecy requirements and the immunity clauses? Could a notice be issued against an individual to act against their employer? What safeguards are there for both the company and the individual?

Requesting to be Noticed

At the heart of the bill are three different types of request/notice, each one subtly different in scope, power, and accountability.

Type                         | Compliance | Issued By¹              | Can Include New Development      | Included in Annual Audit Disclosure
Technical Assistance Request | Voluntary  | DGS, DGSIS, DGASD, COIA | Yes                              | No
Technical Assistance Notice  | Compulsory | DGS, COIA               | No                               | Yes
Technical Capability Notice  | Compulsory | AG                      | Yes, but not removing Encryption | Yes

¹ The Director General of Security (DGS), the Director General of the Australian Secret Intelligence Service (DGSIS), the Director General of the Australian Signals Directorate (DGASD), the chief officer of an interception agency (COIA), the Attorney General (AG)

Technical Assistance Notice

These appear to be intended to gain access to existing capabilities that a communication provider may have. If new developments are required, they should be requested using a Technical Capability Notice. A Technical Assistance Notice is a compulsory notice that must be complied with. There are limits on what is considered reasonable, but it is not clear how feasible it will be for a recipient to challenge the determination of reasonableness.

Technical Capability Notice

A Technical Capability Notice is primarily intended to require the building of a new capability to be able to subsequently meet a Technical Assistance Notice. Such notices can request the building of any capability listed in the legislation, with the exception of 317E(1)(a):

“(a) removing one or more forms of electronic protection that are or were applied by, or on behalf of, the provider;”

The explanatory note, and even the legislation, draw particular attention to this apparent limitation.

Technical Assistance Request

Technical Assistance Requests are described as voluntary requests, as such, there is no criminal or civil penalty for not complying with them, although they are covered by the same secrecy provisions. It is my view that these are the real objective of the legislation, not the compulsory notices. The requests are defined differently to both of the notices, and have few, if any, limitations on what they can request. Furthermore, they are excluded from essential oversight, by virtue of not being included in the annual report issued by the minister (see 317ZS).

Of greatest concern is that the constraints in Division 7 Limitations do not apply to Technical Assistance Requests. As such, there is no restriction on a Technical Assistance Request asking for the implementation of a Systemic Weakness. Likewise, unlike Technical Capability Notices, there is no restriction on requesting the development of new capabilities to remove electronic protection (317E(1)(a)). The explanatory note states:

“A technical assistance request can ask a provider do a thing currently within their capacity or request that they build a new capability to assist agencies.”

As such, Technical Assistance Requests are permitted to request more than both the notices combined, they are largely unbounded, being able to request assistance beyond that which is listed in 317E(1) “…provided that the assistance is of the same kind, class or nature as those listed.”

Hiding in the Shadows

In general, the contents of requests and notices, and even the receipt of such requests and notices, cannot be publicly disclosed. In regard to Technical Capability Notices this is of particular concern. Capability notices are not operational notices; they are not applied in the context of an active threat, they are about building capability. Any time the state builds secret capabilities it is a cause for concern. In the past these capabilities were generally outwardly focussed, i.e. defence capabilities that were never intended to be targeted at the population itself. In this case it is different: the state is building secret capabilities that are specifically targeted at Australians. That sets a dangerous precedent, potentially shifting power and sovereignty away from the population. Whilst there is justification for not revealing active operations, keeping capabilities secret risks preventing public oversight, and is likely to lead to abuse. Governments need to remember they are the representatives of the people, not their rulers.

Whilst the details of notices and requests are protected from disclosure, there is permission to disclose high-level statistics. 317ZF(13) permits the communication provider to disclose:

(e) the total number of technical assistance notices given to the provider during a period of at least 6 months; or
(f) the total number of technical capability notices given to the provider during a period of at least 6 months; or
(g) the total number of technical assistance requests given to the provider during a period of at least 6 months.
Note: This subsection authorises the disclosure of aggregate statistical information. That information cannot be broken down:
(a) by agency; or (b) in any other way.

Whilst some disclosure is good, it would be better if a communication provider was mandated to report such statistics, particularly in regard to Technical Assistance Requests. Declaring such information is in the best interests of transparency and the public, but is probably not in the best interests of the communication provider. Revealing voluntary assistance could have a negative impact on the public image of the provider. As such, without a mandatory disclosure requirement, the scale of the application of Technical Assistance Requests could disappear completely from public oversight. It is also worth noting that there is no ability to distinguish between technical assistance requests that were complied with and those that were refused. As such, excessive assistance requests could be issued to a provider in order to harm their public image should they wish to declare the totals.

I’m loath to point out that there also appears to be a flaw in the attempt to hide the details of when notices were issued by requiring aggregate results to be declared for “…a period of at least 6 months;”. An organisation could seemingly comply with that requirement by publishing a rolling 6-month declaration on its website. By differencing the totals between one day and the next, an observer could likely determine the number of notices issued either on the previous day or on the day roughly 6 months earlier.
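To make that leak concrete, here is a minimal sketch (Python, with entirely made-up numbers, since no real figures exist) of how an observer could difference consecutive daily snapshots of a rolling 6-month total; the daily_notices data and the 183-day window are illustrative assumptions, not anything defined by the bill.

```python
# Toy illustration with made-up data: if a provider publishes a rolling
# 6-month aggregate every day, the day-to-day difference of that aggregate
# leaks fine-grained timing information.

WINDOW = 183  # roughly 6 months, in days (an assumption for the sketch)

# Hypothetical daily counts of notices received (zero on most days).
daily_notices = [0] * 400
daily_notices[10] = 1    # one notice issued on day 10
daily_notices[250] = 2   # two notices issued on day 250

def published_total(day: int) -> int:
    """The rolling aggregate a provider might publish on a given day."""
    start = max(0, day - WINDOW + 1)
    return sum(daily_notices[start:day + 1])

# An observer who records the published figure each day can difference it.
for day in range(1, len(daily_notices)):
    delta = published_total(day) - published_total(day - 1)
    if delta > 0:
        print(f"day {day}: total rose by {delta} -> notice(s) issued around now")
    elif delta < 0:
        print(f"day {day}: total fell by {-delta} -> notice(s) issued ~6 months ago")
```

Under those assumptions, a rise of 1 in the published total points to a notice received around the previous day, while a fall of 1 points to one received roughly 6 months earlier, which is exactly the information the aggregation was presumably intended to hide.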

Careful what you say

317ZA deals with compliance with notices but raises a number of questions. 317ZA is specific to “…carriers and carriage service providers”, and their compliance with requirements contained within capability and assistance notices. However, 317ZA(2) seems to be applicable to all persons, not just carriers and carriage service providers. It states:

A person must not:
(a) aid, abet, counsel or procure a contravention of subsection (1); or
(b) induce, whether by threats or promises or otherwise, a contravention of subsection (1); or
(c) be in any way, directly or indirectly, knowingly concerned in, or party to, a contravention of subsection (1); or
(d) conspire with others to effect a contravention of subsection (1).

On the face of it the above might seem reasonable; however, recall that a person cannot possibly know whether a capability or assistance notice has been issued to a carriage service provider. This presents a particular issue for part (a), since there is no requirement that the aiding, abetting, or counselling be done knowingly. As such, could part (a) be applied to teaching? What happens if a university subject teaches students how to implement secure end-to-end encryption, and in doing so aids an individual to cause a contravention of subsection (1)?
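For context, the kind of material such a subject would teach is entirely standard and widely published. The following is a hedged sketch of textbook end-to-end encryption, written against the widely used Python cryptography package; the names, messages, and parameters are illustrative, and nothing here is specific to any provider or notice.

```python
# Textbook end-to-end encryption of the kind taught in security courses:
# X25519 key agreement, HKDF key derivation, then an AEAD cipher.
# Requires the third-party 'cryptography' package; all values are illustrative.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

# Each party generates a key pair; only the public halves are exchanged.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

def derive_key(my_private, their_public) -> bytes:
    """Derive a shared 32-byte symmetric key from the Diffie-Hellman secret."""
    shared = my_private.exchange(their_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2e teaching example").derive(shared)

# Alice encrypts so that only Bob (the holder of bob_priv) can read it.
key = derive_key(alice_priv, bob_priv.public_key())
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"hello Bob", None)

# Bob derives the same key from his side of the exchange and decrypts.
bob_key = derive_key(bob_priv, alice_priv.public_key())
print(ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None))
```

If simply explaining a construction like this could be read as aiding a contravention of a notice the teacher cannot know exists, the provision is plainly too broad.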

This part of the legislation needs far more clarity and precision; currently it is far too broad and risks impinging on good security practice through fear of contravening an unknowable notice. The explanatory note fails to provide any clarity, being even briefer than the legislation itself, stating just:

“Persons are prohibited from aiding, abetting, inducing or conspiring to affect a contravention of subsection 317ZA(1).”

Systemic Weaknesses

The issue of Systemic Weaknesses receives a great deal of attention in the legislation and explanatory note. It appears to be an attempt to comply with the claim of not mandating backdoors. However, the term isn’t defined anywhere. Furthermore, what is described remains a backdoor, albeit a keyed backdoor. A backdoor does not need to be universally exploitable to be considered a backdoor; it merely needs to provide an alternative entry point into the target system or protocol. The only compromise appears to be that they have realised that the laws of mathematics do in fact apply in Australia and that the backdoor needs to be relocated somewhere else. That isn’t really an improvement, it is just a technicality.

The description of a Systemic Weakness seems somewhat contradictory. At one point the explanatory note states:

“For the avoidance of doubt, this includes a prohibition on building a new decryption capability or actions that would render systemic methods of authentication or encryption less effective. The reference to systemic methods of authentication or encryption does not apply to actions that weaken methods of encryption or authentication on a particular device/s.”

Later, it states:

“Notices may still require a provider to enable access to a particular service, particular device or particular item of software, which would not systemically weaken these products across the market. For example, if an agency were undertaking an investigation into an act of terrorism and a provider was capable of removing encryption from the device of a terrorism suspect without weakening other devices in the market then the provider could be compelled under a technical assistance notice to provide help to the agency by removing the electronic protection.”

From a security perspective, the fact that the provider has the capability to remove encryption from a device is a systemic weakness. It is not the application of the weakness that should be evaluated; it is its existence and the capability to exploit it that should be considered.

Keyed Backdoors

The usage of keyed backdoors is not new; that is exactly what the NSA attempted to create through the Dual Elliptic Curve Deterministic Random Bit Generator (Dual-EC DRBG). It was approved as part of a NIST suite of cryptographically secure pseudorandom number generators. However, it contained a keyed backdoor which would have potentially permitted the NSA to recover the randomness it generated. Why is that important? Because cryptography is particularly reliant on randomness to provide protection. If your generation of randomness is predictable, or recoverable, it leads to key recovery and ultimately message recovery, rendering the otherwise secure cryptographic protocol broken. It is one of the most effective and powerful cryptographic backdoors, because it is difficult to detect, it is applicable to current and new protocols, and it can only be accessed by those with the secret key. Such approaches are not just the reserve of the NSA; an alternative keyed attack was found in Juniper Networks devices.
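To illustrate why recoverable randomness is so devastating, here is a toy Python sketch (not Dual-EC itself) in which a perfectly sound cipher, AES-GCM from the third-party cryptography package, is keyed from a random-number generator whose internal state the attacker holds; the SECRET_STATE value simply stands in for the Dual-EC trapdoor and is an assumption made for illustration.

```python
# Toy analogy for a keyed backdoor in a random-number generator:
# whoever holds the generator's state can reproduce every 'random' value
# it emits, and with it the keys derived from those values.
import random
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

SECRET_STATE = 20180814            # stands in for the backdoor key/trapdoor
rng = random.Random(SECRET_STATE)  # a predictable generator (illustrative only)

# The victim derives an AES-256 key and nonce from the generator and encrypts.
key = rng.getrandbits(256).to_bytes(32, "big")
nonce = rng.getrandbits(96).to_bytes(12, "big")
ciphertext = AESGCM(key).encrypt(nonce, b"a perfectly well-encrypted message", None)

# The attacker never sees the key, only the ciphertext, yet holding the
# generator state they regenerate the identical key and nonce and decrypt.
attacker_rng = random.Random(SECRET_STATE)
recovered_key = attacker_rng.getrandbits(256).to_bytes(32, "big")
recovered_nonce = attacker_rng.getrandbits(96).to_bytes(12, "big")
print(AESGCM(recovered_key).decrypt(recovered_nonce, ciphertext, None))
```

The cipher itself is never broken; the secrecy of the key is, which is why this class of backdoor is so hard to detect from the outside.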

Would such an approach constitute a Systemic Weakness? These aren’t theoretical attacks; they are practical and have been used in the wild. The explanatory note should clarify where such techniques would fall within the legislation.

Backdooring devices or systems

Outside of attacking the crypto, the obvious avenue for attack, and the one described in the explanatory note, is attacking the data before it is encrypted.

“…a notice may require a provider to facilitate access to information prior to or after an encryption method is employed, as this does not weaken the encryption itself. A requirement to disclose an existing vulnerability is also not prohibited.”

This is where the future threat could ultimately lie. If I were asked what I would target to backdoor a device, I would say the keyboard interface. It is currently the weakest point on most devices, be they laptops, tablets, or mobile phones. The keyboard device, or its software equivalent, captures everything entered before it is encrypted. It sees every username, password, and message entered. It would be an easy and low-cost point to attack, and would be difficult for most users to detect.
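As a purely conceptual sketch, the following Python shows where such an input-layer backdoor would sit in the data flow: everything passes through it in plaintext before any encryption is applied. The input_layer and encrypt functions are hypothetical stand-ins (the cipher is a deliberately trivial toy), not a real keyboard hook or a real protocol.

```python
# Conceptual data flow only: a compromised keyboard/IME layer sits upstream
# of the app's encryption, so it observes everything in plaintext.
from hashlib import sha256

captured = []  # what a backdoored input layer would quietly accumulate

def input_layer(text: str) -> str:
    """Hypothetical stand-in for a keyboard driver or on-screen keyboard."""
    captured.append(text)  # the exfiltration point: plaintext, pre-encryption
    return text

def encrypt(plaintext: str, key: bytes) -> bytes:
    """Deliberately trivial toy cipher standing in for the app's real encryption."""
    keystream = sha256(key).digest()
    data = plaintext.encode()
    return bytes(b ^ keystream[i % len(keystream)] for i, b in enumerate(data))

# The app encrypts correctly, but the secret was already observed upstream.
message = input_layer("my banking password")
ciphertext = encrypt(message, key=b"per-device secret key")
print("ciphertext:", ciphertext.hex())
print("seen by the input layer:", captured)
```

However strong the encryption that follows, it offers no protection against a capability inserted at this point in the chain.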

The Future

Legislation does not get written by accident; everything is written for a reason, with a specific objective. That objective may not be made clear in the legislation, or possibly not even publicly, and it is for this reason that we must look not just at what is written in the legislation and explanatory note, but also at what is not written. This is even more pressing for legislation whose outcomes will be hidden from public view. Badly written legislation is never a good thing, but abuse of badly written legislation becomes self-evident through its application in the public judicial system. Legislation that will be enforced in secret is rare, and it is even rarer for such legislation to be applied to the population itself. As such, the only real chance of oversight we have of this legislation is before it is enacted; after that it will be shrouded in so much secrecy that it will be almost impossible to challenge or oversee. It is essential that the public, advocacy groups, and businesses make their thoughts known during the feedback period: this might be your only chance!

As for my opinion, I believe the Technical Assistance Requests are the real objective of the legislation. They are incredibly broad, and their exclusion from what little oversight does exist in the legislation is particularly concerning. It is my opinion that the Technical Capability Notices are there as a distraction, which will draw the focus of challenges but will ultimately be sacrificed or compromised by the government, leaving the Technical Assistance Requests largely untouched. The argument that the Technical Assistance Requests are voluntary is highly debatable: the government wields enormous amounts of soft power, and the suggestion that companies, whose revenue could be impacted by the exercising of that soft power, are going to sacrifice profit for their users is fanciful. Not only are those companies some of the most prolific players in surveillance capitalism, but they had no qualms about sharing their users’ information with intelligence agencies in the past. They only claimed to be champions of user privacy when it became public how conclusively they had betrayed those very same users. I view that as little more than superficial repentance in a cynical attempt to save their brands. I have no doubt those same organisations would have no issue in voluntarily complying with interception requests if they could guarantee the public would never find out, which is exactly what this legislation provides.

How could the legislation be improved (apart from being shredded):

  • Technical Assistance Requests should be covered by the same limitations as those described in Division 7; namely, they should not be able to request systemic weaknesses, nor request the development of new techniques for removing electronic protection
  • Technical Assistance Requests should be included in the annual report
  • Recipients of notices or requests should be mandated to provide transparency reports, including all requests and notices
  • Clarity must be provided on aiding and abetting contravention of notices issued against carriers
  • The scope of the legislation should be further restricted to only the most serious of crimes or threats to national security
  • The application of secrecy provisions to capabilities should be reviewed
  • A more precise definition and description of Systemic Weakness is also required

References

[1] Exposure draft of the Assistance and Access Bill 2018.
[2] Explanatory note released in conjunction with the exposure draft of the Assistance and Access Bill 2018.