August 2021

DABUS Down Under take 3

Following last month’s ruling in Thaler that an AI could be an inventor for the purposes of Australian patent law, the Commissioner of Patents has announced her intention to appeal the decision to the Full Court.

Pursuant to s 158(2), the Commissioner requires leave to appeal. Bearing in mind that Beach J’s decision is the first substantive consideration anywhere in the world to accept that an AI could be an inventor for the purposes of a Patents Act, however, obtaining leave should not prove too much of an obstacle in this case.

Thaler v Commissioner of Patents [2021] FCA 879


Thaler: the robots have arrived DownUnder

In what may well be a world first,[1] Beach J has upheld Thaler’s appeal from the Commissioner, ruling that an AI can be an inventor (or at least that someone who derives title to the invention from an AI can be an entitled person).

Stephen L. Thaler applied for a patent, AU 2019363177, entitled “Food container and devices and methods for attracting enhanced attention”.[2] The application named the inventor as:

DABUS, The invention was autonomously generated by an artificial intelligence

The application was made through the PCT and, as a result, reg 3.2C(2)(aa) required the applicant to provide the name of the inventor. The Commissioner relied on the identification provided to reject the application on the basis that an “inventor” must be a natural person, which an AI obviously was not.

Beach J rejected this approach. At [10], his Honour summarised his conclusions:

in my view an artificial intelligence system can be an inventor for the purposes of the Act. First, an inventor is an agent noun; an agent can be a person or thing that invents. Second, so to hold reflects the reality in terms of many otherwise patentable inventions where it cannot sensibly be said that a human is the inventor. Third, nothing in the Act dictates the contrary conclusion.

There are a number of strands to his Honour’s reasoning. Perhaps its most striking feature is the emphasis on the role of patents in encouraging technological development and the importance of not impeding progress by shutting out new developments.

DABUS

Dr Thaler was the owner of the copyright in DABUS’ source code. He was also responsible for and the operator of the computer on which DABUS operated. He did not, however, produce the claimed invention.

As the description of the “inventor” indicates, the claimed invention was an output from the operation of DABUS.

Beach J did not propose to offer a definition of “artificial intelligence”. His Honour considered that DABUS, at least as described in Dr Thaler’s evidence, was not just a “brute force computational tool”. Instead, it was appropriate to describe DABUS as semi-autonomous.[3]

DABUS itself consisted of multiple neural networks. At first, the connections between these networks were trained with human assistance by presentation of fundamental concepts. As the system accumulated knowledge, the networks connected themselves into increasingly longer chains. DABUS then became capable of generating notions itself – the unsupervised (by humans) generative learning phase. Once in this phase, DABUS was capable of random generation of concepts and of identifying those which were novel. In addition, it was trained or programmed to identify significant concepts, which continued operation further reinforced. At [41], Beach J summarised:

The upshot of all of this, which I accept for present purposes, is that DABUS could be described as self-organising as a cumulative result of algorithms collaboratively generating complexity. DABUS generates novel patterns of information rather than simply associating patterns. Further, it is capable of adapting to new scenarios without additional human input. Further, the artificial intelligence’s software is self-assembling. So, it is not just a human generated software program that then generates a spectrum of possible solutions to a problem combined with a filtering algorithm to optimise the outcome.
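By way of contrast, the simpler kind of program Beach J distinguishes DABUS from – a human-written generator of candidate solutions paired with a fixed filtering algorithm – can be sketched in a few lines. This is purely illustrative: the feature names and scoring rule below are invented for the example and have nothing to do with how DABUS actually works.

```python
import random

# A toy "generate and filter" program of the kind Beach J says DABUS
# is NOT: the spectrum of possible solutions and the optimisation
# criterion are both fixed in advance by the human programmer, and
# nothing in the system self-organises or adapts.

FEATURES = ["fractal surface", "flexible wall", "interlocking pits", "heat sink"]

def generate(rng: random.Random) -> frozenset:
    """Propose a random combination of pre-defined features."""
    return frozenset(f for f in FEATURES if rng.random() < 0.5)

def score(candidate: frozenset) -> int:
    """Human-written filter: here, simply prefer larger combinations
    (a stand-in for any fixed optimisation criterion)."""
    return len(candidate)

def search(seed: int = 0, trials: int = 100) -> frozenset:
    rng = random.Random(seed)
    seen, best = set(), frozenset()
    for _ in range(trials):
        candidate = generate(rng)
        if candidate not in seen:      # "novelty" here is mere deduplication
            seen.add(candidate)
            if score(candidate) > score(best):
                best = candidate
    return best

print(sorted(search()))
```

Everything creative in that sketch – the candidate space, the filter, the stopping rule – is supplied by the programmer, which is precisely why, on the evidence accepted by Beach J, DABUS was said to be something more.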

Beach J accepted at [42] Dr Thaler’s evidence that:

DABUS, and its underlying neural paradigm, represents a paradigm shift in machine learning since it is based upon the transient chaining topologies formed among associative memories, rather than activation patterns of individual neurons appearing within static architectures ….

Beach J’s reasons

At [118], Beach J pointed out that no specific provision in the Act precluded an artificial intelligence from being an “inventor”.

Secondly, Beach J considered that patent law is different from copyright law, which specifically requires human authors and recognises moral rights.

Thirdly, as it was not defined in the Act, the term “inventor” should be given its ordinary meaning. Dictionary definitions did not help with this as more was required “than mere resort to old millennium usages of that word.”[4] Instead, at [120]:

the word “inventor” is an agent noun. In agent nouns, the suffix “or” or “er” indicates that the noun describes the agent that does the act referred to by the verb to which the suffix is attached. “Computer”, “controller”, “regulator”, “distributor”, “collector”, “lawnmower” and “dishwasher” are all agent nouns. As each example demonstrates, the agent can be a person or a thing. Accordingly, if an artificial intelligence system is the agent which invents, it can be described as an “inventor”.

Importantly, Beach J took into account the purpose of the patents system. At [121], his Honour explained:

in considering the scheme of the Act, it has been said that a widening conception of “manner of manufacture” is a necessary feature of the development of patent law in the twentieth and twenty-first centuries as scientific discoveries inspire new technologies.[5]

Accordingly, it made little sense to apply a flexible conception of subject matter – “manner of (new) manufacture” under s 18(1)(a) – but a restrictive interpretation of “inventor”. This would mean an otherwise patentable invention could not be patented because there was no “inventor”.

Beach J’s purposive approach was reinforced by reference to the newly enacted objects clause:

The object of this Act is to provide a patent system in Australia that promotes economic wellbeing through technological innovation and the transfer and dissemination of technology. In doing so, the patent system balances over time the interests of producers, owners and users of technology and the public.

It was not necessary to identify an ambiguity before resorting to the objects clause. The objects clause should always be considered when construing legislation.

It was consistent with the object of the Act to interpret “inventor” in a way which promoted technological innovation. Allowing “computer inventorship” would incentivise computer scientists to develop creative machines. Moreover, the object of the Act would not be advanced if the owners of creative computers resorted to trade secret protection instead of the patent system.

At least, with respect, his Honour’s approach shows that s 2A can operate in favour of technological development by encouraging patenting rather than, as many feared, as a cat’s paw justifying resort to an ex post analysis[6] in favour of ‘user rights’.

In a further bold development, Beach J considered ([135] – [145]) that the Act is “really concerned” with inventive step. Under s 7(2), the issue was whether or not the patent claimed a sufficient technological advance over what had gone before to warrant the grant of a monopoly. How that advance was made was not relevant.

Descending to the detail of s 15, Beach J first pointed out at [157] that s 15 is directed to who may be granted a patent and does not define who is an inventor. Beach J accepted that DABUS was not a person and so, under s 15, could not be a person entitled to the grant of a patent. Even though s 15(1)(a) identified “a person who is the inventor” as the first person entitled to the patent, this was not determinative. It was directed to a different issue from the definition of “inventor”.

Although DABUS could not be an entitled person, Beach J found that Dr Thaler qualified as an entitled person at least on the basis of s 15(1)(b). At [167], his Honour explained:

Dr Thaler is the owner, programmer and operator of DABUS, the artificial intelligence system that made the invention; in that sense the invention was made for him. On established principles of property law, he is the owner of the invention. In that respect, the ownership of the work of the artificial intelligence system is analogous to ownership of the progeny of animals or the treatment of fruit or crops produced by the labour and expense of the occupier of the land (fructus industrialis), which are treated as chattels with separate existence to the land.

By this means, his Honour neatly side-stepped philosophical questions, some of which he had adverted to earlier. For example, at [131] Beach J asked:

If the output of an artificial intelligence system is said to be the invention, who is the inventor? And if a human is required, who? The programmer? The owner? The operator? The trainer? The person who provided input data? All of the above? None of the above? ….

Beach J returned to this aspect of the problem at [194]:

more generally there are various possibilities for patent ownership of the output of an artificial intelligence system. First, one might have the software programmer or developer of the artificial intelligence system, who no doubt may directly or via an employer own copyright in the program in any event. Second, one might have the person who selected and provided the input data or training data for and trained the artificial intelligence system. Indeed, the person who provided the input data may be different from the trainer. Third, one might have the owner of the artificial intelligence system who invested, and potentially may have lost, their capital to produce the output. Fourth, one might have the operator of the artificial intelligence system. But in the present case it would seem that Dr Thaler is the owner.

As Dr Thaler combined in the one person the roles of owner, programmer and operator, he was entitled to the fruits of its operation. Would different problems arise if, instead of being embodied in the one person, each of the functions identified lay in a different person?

Returning to [131], Beach J continued immediately following the extract quoted above, saying:

…. In my view, in some cases it may be none of the above. In some cases, the better analysis, which is consistent with the s 2A object, is to say that the system itself is the inventor. That would reflect the reality. And you would avoid otherwise uncertainty. And indeed that may be the case if the unit embodying the artificial intelligence has its own autonomy. What if it is free to trawl the internet to obtain its own input or training data? What about a robot operating independently in a public space, having its own senses, learning from the environment, and making its own decisions? ….

If one can start with the AI as the inventor, that arguably simplifies the analysis in terms of entitlement: the person claiming to be the entitled person will need to show some claim over the results of the operation of the machine.

One final point. It is not clear from the reasons the extent to which Beach J was referred to the controversies around the world arising from Dr Thaler’s applications. It didn’t matter.

At [220], his Honour pointed out that they were irrelevant to the task before him: the interpretation of the words of the Australian Act.

I am not at all sure that “the world” has reached a settled position on the question whether AIs can be inventors. One would hope, however, that Australian law does not head down yet another path where “we” grant patents for things which are not patentable in their ‘home’ countries. It will, for example, be another 12 years or so before we are finally at something like parity on inventive step with the “rest” of the world and have escaped the constraints of the pre-Raising the Bar tests of inventive step.

Two questions

This short summary cannot do justice to the detailed arguments developed over 228 paragraphs.

As discussed above, Beach J accepted that DABUS could not be granted a patent as it was not a person and s 15 specifies that the grantee of a patent must be a person. The Commissioner had proceeded on the basis that the terms of s 15 were predicated on the “old millennium” understanding that title to an invention flowed from the person who was the inventor, just as subsistence and ownership of copyright flow from the person who is the author of the work. Beach J sidestepped that on the basis that s 15 is concerned only with entitlement, not with defining who is an inventor. Nonetheless, one might think s 15 was drafted in this way on the basis that entitlement flowed from the inventor. It does also seem somewhat curious that an AI can be an inventor but not entitled to a patent because it is not a person.

Secondly, much of the controversy overseas has been about whether Dr Thaler’s creation acts autonomously or semi-autonomously or is just an exercise in brute computing: see, for example, Rose Hughes’ report on IPKat. Beach J appears to have had some evidence from Dr Thaler about how DABUS was designed and worked. Thus at [43], his Honour accepted Dr Thaler’s “assertion” that:

DABUS, and its underlying neural paradigm, represents a paradigm shift in machine learning since it is based upon the transient chaining topologies formed among associative memories, rather than activation patterns of individual neurons appearing within static architectures. From an engineering perspective, the use of network resonances to drive the formation of chaining topologies, spares programmers the ordeal of matching the output nodes of one [artificial neural network] with the input nodes of others, as in deep learning schemes. In effect, complex neural architectures autonomously wire themselves together using only scalar resonances.

Reinforcement or weakening of such chains takes place when they appropriate special hot button nets containing memories of salient consequences. Therefore, instead of following error gradients, as in traditional artificial neural net training, conceptual chains are reinforced in proportion to the numbers and significances of advantages offered. Classification is not in terms of human defined categories, but via the consequence chains branching organically from any given concept, effectively providing functional definitions of it. Ideas form as islands of neural modules aggregate through simple learning rules, the semantic portions thereof, being human readable as pidgin language.

Later his Honour asked at [127] – [128]:

Who sets the goal for the system? The human programmer or operator? Or does the system set and define its own goal? Let the latter be assumed. Further, even if the human programmer or operator sets the goal, does the system have free choice in choosing between various options and pathways in order to achieve the goal? Let that freedom also be assumed. Further, who provides or selects the input data? Let it be assumed that the system can trawl for and select its own data. Further, the larger the choice for the system in terms of the algorithms and iterations developed for the artificial neural networks and their interaction, the more autonomous the system. Let it be assumed that one is dealing with a choice of the type that DABUS has in the sense that I have previously described.

Making all of these assumptions, can it seriously be said that the system is just a brute force computational tool? Can it seriously be said that the system just manifests automation rather than autonomy? ….

If by “assumptions” his Honour is referring to the evidence from Dr Thaler at [42] and [43] which his Honour accepted, that is one thing. It may be quite another thing if they were in fact assumptions.

Where to now?

At the time of writing, the Commissioner is understood to be considering whether or not to appeal.

Thaler v Commissioner of Patents [2021] FCA 879


  1. The EPO refused the corresponding patent application; the oral hearing of the appeal is to be heard on 21 December 2021. Apparently, the corresponding patent has been granted in South Africa which, I am given to understand, effectively does not operate a substantive examination system.
  2. For an interesting discussion, see “The first AI inventor – IPKat searches for the facts behind the hype” and the later report on the USPTO’s consultations, “Is it time to move on from the AI inventor debate?”
  3. At [19] – [29], Beach J provides an overview of how his Honour understands artificial neural networks to work.
  4. At [15]. For the lawyers, [148] – [154] set out the technical arguments for the limitations of dictionaries in some detail.
  5. Diplomatically (and consistently with precedent) citing D’Arcy v Myriad Genetics Inc (2015) 258 CLR 334 at [18], perhaps the single biggest retreat from the teleological approach declared in NRDC.
  6. For example, Mark A Lemley, ‘Ex Ante versus Ex Post Justifications for Intellectual Property’.
