artificial intelligence

Zarya of the Dawn – copyright and an AI

Those of you who heard Shira Perlmutter, the US Register of Copyrights, on her Australian tour last year will recall that the US Copyright Office had withdrawn and was reconsidering the copyright registration for Zarya of the Dawn.[1] On 21 February 2023, the US Copyright Office announced the outcome of that review. While the Copyright Office allowed registration of some aspects, it rejected the claim to copyright in the images created by Midjourney, an AI.

Image of a young feminine-looking person with golden skin, dark brown eyes and dark brown hair in cornrows, against a teal background
Zarya of the Dawn – Cover Page

The work(s)

Zarya of the Dawn[2] is a comic consisting of images and text depicting Zarya’s adventure to different worlds to collect mental health tools for handling their emotions and thoughts as a non-binary person.

A page with 4 comic images; the first of which is a postcard of some otherworldly place. The young, feminine-looking person reads the card addressed to 'My Dearest Zarya'. They wonder if they are Zarya and why they cannot remember their name
Page 2 of the Zarya of the Dawn comic book

The Copyright Office accepted that the applicant, Ms Kristina Kashtanova, was the author of both the text and the selection and arrangement of the text and images. However, the Copyright Office refused registration for the images themselves on the grounds that they were generated by Midjourney and did not have a human author.

How Midjourney generated the images

As described by the Copyright Office, Midjourney generates an image in response to instructions (called “prompts”) input by the user. The Copyright Office illustrated this process by the prompt:

/imagine cute baby dinosaur shakespeare writing play purple

which generated the images below:

4 images generated by Midjourney of purple-coloured baby dinosaurs, each holding a pen and working over a manuscript
Baby dinosaur writing a play

The user could click on the blue “recycle” button to generate four new images. The user could also refine the generated images by providing URLs of images to be used as models or by providing more detailed instructions.
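
For readers who want a concrete picture of the process described above (and set out more fully in footnote 4 below), a minimal sketch in Python follows: start with a field of random “noise”, break the prompt into tokens, and let those tokens influence, but not dictate, each refinement step. The function names and the toy refinement rule are hypothetical illustrations only; this is not Midjourney’s actual code or API.

    import random

    def tokenize(prompt):
        # Hypothetical stand-in for Midjourney's tokenisation: the prompt is
        # broken into smaller pieces that can be compared with training data.
        return prompt.lower().split()

    def generate_image(prompt, steps=50, size=8, seed=None):
        rng = random.Random(seed)
        tokens = tokenize(prompt)
        # Begin with a field of visual "noise", like television static.
        image = [[rng.random() for _ in range(size)] for _ in range(size)]
        for step in range(1, steps + 1):
            strength = step / steps
            for y, row in enumerate(image):
                for x, pixel in enumerate(row):
                    # Each token exerts some pull on the pixel, but fresh noise
                    # enters at every step, so the prompt influences, rather
                    # than dictates, the final result.
                    pull = sum(hash((t, x, y)) % 256 for t in tokens) / (256 * len(tokens))
                    row[x] = (1 - strength) * pixel + strength * (0.5 * pull + 0.5 * rng.random())
        return image

    # As in the Copyright Office's example, one prompt yields several candidates.
    grids = [generate_image("cute baby dinosaur shakespeare writing play purple")
             for _ in range(4)]

The point of the sketch is the comment in the inner loop: because randomness enters at every refinement step, running the same prompt twice will generally produce different images, which is the feature of the process the Copyright Office relied on in its analysis.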

This was not authorship for copyright purposes

For copyright to subsist in original works such as text (literary works) or images (artistic works), US law, like Australian law (see further below), requires the work to be original. That requirement in turn means the work must have been made by a human author. And, according to the Copyright Office, an author is the person “who has actually formed the picture,” the one who acts as “the inventive or master mind.”

At least in theory, if someone gave a draftsperson sufficiently detailed instructions about what a drawing should depict, they rather than the draftsperson may be the author.[3]

The Copyright Office, however, found that the instructions Ms Kashtanova gave to Midjourney did not make her the author of the resulting images. This was because it was not possible to predict the outcome resulting from her prompts:

A person who provides text prompts to Midjourney does not “actually form” the generated images and is not the “master mind” behind them. Instead, as explained above,[4] Midjourney begins the image generation process with a field of visual “noise,” which is refined based on tokens created from user prompts that relate to Midjourney’s training database. The information in the prompt may “influence” generated image, but prompt text does not dictate a specific result. See Prompts, MIDJOURNEY, https://docs.midjourney.com/docs/prompts (explaining that short text prompts cause “each word [to have] a more powerful influence” and that images included in a prompt may “influence the style and content of the finished result”). Because of the significant distance between what a user may direct Midjourney to create and the visual material Midjourney actually produces, Midjourney users lack sufficient control over generated images to be treated as the “master mind” behind them.

The Copyright Office recognised that additional prompts could be applied to initial images to influence subsequent images, however, the process was not controlled by the user as it was “not possible to predict what Midjourney will create ahead of time.”

The Copyright Office contrasted the way Midjourney works with the way an artist might use Photoshop or other tools:

The fact that Midjourney’s specific output cannot be predicted by users makes Midjourney different for copyright purposes than other tools used by artists. See Kashtanova Letter at 11 (arguing that the process of using Midjourney is similar to using other “computer-based tools” such as Adobe Photoshop). Like the photographer in Burrow-Giles, when artists use editing or other assistive tools, they select what visual material to modify, choose which tools to use and what changes to make, and take specific steps to control the final image such that it amounts to the artist’s “own original mental conception, to which [they] gave visible form.” Burrow-Giles, 111 U.S. at 60 (explaining that the photographer’s creative choices made the photograph “the product of [his] intellectual invention”). Users of Midjourney do not have comparable control over the initial image generated, or any final image. (emphasis supplied) (footnotes omitted)

Ms Kashtanova also contended that the modifications she made to some images in Photoshop were an authorial contribution supporting her claim to copyright. From the description in the Copyright Office’s decision, some of that work seems more like touching up or editing than authorship. As the material before the Copyright Office did not include the “before” and “after” images, however, the Copyright Office was not inclined to accept those claims either.

An Australian perspective

Australian courts have also ruled that an author must be a human. Applying the IceTV case, the Full Federal Court has ruled that directories produced by processing telephone subscriber name, address and phone number details through a computerised database did not qualify as original copyright works because there was no human author. In the first Telstra v PDC case, Perram J explained at [118] – [119]:

The Act does not presently deal explicitly with the impact of software on authorship (although this is not so in the United Kingdom: s 9(3) Copyright, Designs and Patents Act 1988 (UK)). But a computer program is a tool and it is natural to think that the author of a work generated by a computer program will ordinarily be the person in control of that program. However, care must be taken to ensure that the efforts of that person can be seen as being directed to the reduction of a work into a material form. Software comes in a variety of forms and the tasks performed by it range from the trivial to the substantial. So long as the person controlling the program can be seen as directing or fashioning the material form of the work there is no particular danger in viewing that person as the work’s author. But there will be cases where the person operating a program is not controlling the nature of the material form produced by it and in those cases that person will not contribute sufficient independent intellectual effort or sufficient effort of a literary nature to the creation of that form to constitute that person as its author: a plane with its autopilot engaged is flying itself. In such cases, the performance by a computer of functions ordinarily performed by human authors will mean that copyright does not subsist in the work thus created. Those observations are important to this case because they deny the possibility that Mr Vormwald or Mr Cooper were the authors of the directories. They did not guide the creation of the material form of the directories using the programs and their efforts were not, therefore, sufficient for the purposes of originality.

The consequence of those conclusions is that the directories were not copied from elsewhere but neither were they created by a human author or authors. Although humans were certainly involved in the Collection Phase that process antedated the reduction of the collected information into material form and was not relevant to the question of authorship (other than to show that the works were not copied). Whilst humans were ultimately in control of the software which did reduce the information to a material form, their control was over a process of automation and they did not shape or direct the material form themselves (that process being performed by the software). The directories did not, therefore, have an author and copyright cannot subsist in them. (emphasis supplied)

See also Yates J at [169].

This appears to be consistent with the approach taken by the US Copyright Office although both Perram J and Yates J recognised that whether some particular claimed work falls on the “copyright” or “not copyright” side of the line is a question of judgment and degree.

Zarya of the Dawn (Registration # VAu001480196)


  1. No, as I am sure you know, you do not have to register your claim to own copyright in Australia. Registration of copyright is just one of the ways Americans are different to most of the rest of us. In Australia copyright comes into existence automatically by the act of creating the material (and not slavishly copying it from some pre-existing material). There is no need to register it. Who the owner of the copyright is will depend on a number of factors such as the type of material – a literary or artistic work or an audio-visual work such as a film, a sound recording or a broadcast; whether or not the work was made in the course of employment; and whether there has been a written assignment or other contractual arrangement. (That is the sort of thing that requires advice based on the specific individual circumstances.)
  2. This is a link to the donationware download, but the Copyright Office’s decision includes images of the cover and page 2.
  3. While the creation of an artistic work raises rather more challenges, an obvious illustration of this theory is the case of someone who dictates a letter or a book to an amanuensis.
  4. Earlier the Copyright Office had explained: ‘… Midjourney “does not understand grammar, sentence structure, or words like humans,” it instead converts words and phrases “into smaller pieces, called tokens, that can be compared to its training data and then used to generate an image.” … Generation involves Midjourney starting with “a field of visual noise, like television static, [used] as a starting point to generate the initial image grids” and then using an algorithm to refine that static into human-recognizable images.’

Thaler (DABUS) is donged Down Under

Last Friday, the clock finally ran out on Dr Thaler’s attempt to register a patent in Australia on the basis that the artificial intelligence, DABUS, was the inventor: the High Court refused special leave to appeal from the Full Federal Court’s ruling that an inventor must be a human being.

Perhaps surprisingly, the High Court did not reject the application for special leave on the grounds that an inventor for the purposes of the Patents Act must be a human being. Rather, it dismissed the application on the ground that it was not an appropriate vehicle for the determination of that issue.

You will recall that s 15(1) of the Patents Act 1990 defines who is entitled to be granted a patent:

Subject to this Act, a patent for an invention may only be granted to a person who:

(a) is the inventor; or

(b) would, on the grant of a patent for the invention, be entitled to have the patent assigned to the person; or

(c) derives title to the invention from the inventor or a person mentioned in paragraph (b); or

(d) is the legal representative of a deceased person mentioned in paragraph (a), (b) or (c).

The Commissioner had rejected Dr Thaler’s application at the formalities stage, under reg 3.2C(2)(aa), on the basis that an inventor must be a human being: the application identified DABUS as the inventor and DABUS was only an artificial intelligence.

It was an agreed fact before the Courts that DABUS was the “inventor”:

MR SHAVIN: …. [Dr Thaler] programmed the computer but he said that the way in which the computer was programmed is it acted independently in its selection of subject matter and in its generation of the invention. So, he says that he truly was not the inventor, but DABUS, the artificial intelligence, he says was the proper inventor.

Two or perhaps three matters seemed to be exercising the panel determining the special leave application.

First, there were questions directed to whether the case was simply one in which either DABUS qualified as an inventor or there was no inventor at all for the purposes of the Act. One problem with that was that, as it was an agreed fact between the Commissioner and Dr Thaler, there was no contradictor to the proposition. Notwithstanding the agreement between the parties, the panel appeared to consider that Dr Thaler himself might have been the inventor:

EDELMAN J: Mr Shavin, your submission would have a great deal of force if it were possible to exclude, immediately, without any possibility of argument, the possibility that the applicant was not the inventor, because then, once that possibility is excluded, one is left with either a presumption of the section that every invention must have an inventor – on your submission – that is wrong. Or, alternatively, an approach an inventor does not need to be a natural person, which meets some of the difficulties that the Full Court has identified. But the difficulty for this Court is that without having any submissions about the starting point, which is whether a natural person here could be the inventor, we are groping in the dark.

The idea being suggested here appears to be similar to questions of authorship in copyright law, where there may be questions of degree: the computer program may be merely a tool, like, say, Microsoft Word, which an author uses to record his or her words, as compared with the computerised system used to generate telephone directories in the Phone Directories case where, the system having been designed and implemented, the Court found there was no human intervention.[1]

Secondly, if the Act did set up that dichotomy and an inventor had to be a human being, concerns were expressed that this would mean there was a “gap” in the legislation – there could be “inventions” that could not be protected because there was no inventor. Thus:

EDELMAN J: If that factual and legal position is correct, and Dr Thaler is not the inventor, then there is a significant hole in the operation of section 15 because it means that you can have an invention but no inventor.

Thirdly, the panel was plainly aware that the status of DABUS as an inventor was an issue being litigated around the world and, in particular, that the UK Supreme Court had listed for hearing on 27 February 2023 a challenge to the procedural approach taken to reject Dr Thaler’s application.

Where does that leave matters?

Plainly, some sort of question mark hangs over the Full Federal Court’s approach.

So far, the Commissioner has not announced any change to practice about disallowing applications which identify an artificial intelligence as an inventor.

There may be a question whether someone who does not have Dr Thaler’s agenda will nominate an artificial intelligence as an inventor. The panel refusing the special leave application appeared to envisage that the person who owned, controlled or programmed the computer might legally be able to claim inventorship. For example:

EDELMAN J: There is an easy way the question could have been raised, which could have been if the applicant had listed himself as the inventor and the Commissioner had rejected that on the basis that he was not the inventor but the artificial intelligence was the inventor, which would then have given rise to the prospect that nobody, for the purposes of section 15, was the inventor.

There are, however, with respect, any number of difficulties with this.

For example, as Mr Shavin KC pointed out, that might require the applicant to identify someone as the inventor, which the applicant did not believe to be true.

Secondly, with the benefit of the special leave panel’s (non-binding) observations, does one nominate the owner, the controller or the programmer, or some combination of all three, as the inventor? If one nominated the wrong person, that might provide a ground for revoking any subsequent patent for lack of entitlement or, more likely, fraud, false suggestion or misrepresentation.[2]

Thirdly, how would anyone ever know? In most (if not all) cases, the Commissioner is not going to be in a position to dispute the nomination of a person as an inventor. It might possibly come up in the context of an opposition or infringement / revocation proceedings but that would likely depend on something like the time-honoured tradition of a disgruntled ex-employee blowing the whistle.

If nothing else, if such things are to be protected as patents, it seems what we really need is some form of international agreement on whether they should be patentable and, if so, rules or guidelines for determining who is entitled to be the applicant. There has of course been no rush of international adoption of the extension of copyright to computer-generated works. That problem, however, is becoming increasingly important as schoolkids (and millions of others) are happily playing with online AIs to generate their own art works, poems and other materials.

Thaler v Commissioner of Patents [2022] HCATrans 199 (11 November 2022)


  1. Telstra Corporation Limited v Phone Directories Company Pty Ltd [2010] FCAFC 149 at e.g. [118] – [120]. The trial judge, whose decision was upheld in that appeal, was also a member of the panel which refused special leave in Thaler.
  2. Patents Act s 138(3)(a) and (e) – although, in the case of entitlement issues, s 138(4) and s 22A may very well excuse inadvertent errors.

An AI is not an inventor after all (or yet)

A strong Full Bench of the Federal Court of Australia has ruled that DABUS, an artificial intelligence, is not an inventor for the purposes of patent law. So, Dr Thaler’s application for DABUS’ patent has been rejected.[1] No doubt the robot will be back again[2] and we can expect that an application for special leave will be pending soon.

A dalek on display
By Moritz B. – Self-photographed, CC BY 2.5

Dr Thaler had applied for a patent, No. 2019363177 entitled “Food container and devices and methods for attracting enhanced attention”, naming DABUS – an acronym for ‘device for the autonomous bootstrapping of unified sentience’ – as the inventor.

The Commissioner had rejected the application under reg. 3.2C for failure to identify the inventor. That rejection was overturned by Beach J on appeal from the Commissioner. And this was the decision on the Commissioner’s appeal.

Essentially, the Full Court ruled that an inventor for the purposes of patent law must be a natural person, not an artificial intelligence.

The Full Court held that identification of the “inventor” was central to the scheme of the Act. This is because, under s 15, only the inventor or someone claiming through the inventor is entitled to a patent.

Under the legislation before the 1990 Act, their Honours considered that an ‘actual inventor’ could be only a person with legal personality. At [98], their Honours summarised:

In each of these provisions, the ability of a person to make an application for a patent was predicated upon the existence of an “actual inventor” from whom the entitlement to the patent was directly or indirectly derived. Paragraphs (a), (c) and (e) describe the actual inventor as, respectively, a person, one that is deceased and has a legal representative (which must be a person), and one that is not resident in Australia. Paragraphs (b), (d), (f) and (fa) all contemplate an assignment happening between the patent applicant and the actual inventor. It is clear from these provisions that only a person with a legal personality could be the “actual inventor” under this legislative scheme.

This scheme, and its consequences, did not materially change under the 1990 Act.

Acknowledging that none of the case law had considered whether an AI could be an inventor, the Full Court noted that the ‘entitlement’ cases proceeded on the basis that ‘inventor’ meant the ‘actual inventor’. Their Honours considered the cases interpreting this expression were all premised on the ‘actual inventor’ – the person whose mind devised the claimed invention – being a natural person. At [105] and [106], their Honours explained:

None of the cases cited in the preceding five paragraphs confronted the question that arose before the primary judge of whether or not the “inventor” could include an artificial intelligence machine. We do not take the references in those cases to “person” to mean, definitively, that an inventor under the Patents Act and Regulations must be a human. However, it is plain from these cases that the law relating to the entitlement of a person to the grant of a patent is premised upon an invention for the purposes of the Patents Act arising from the mind of a natural person or persons. Those who contribute to, or supply, the inventive concept are entitled to the grant. The grant of a patent for an invention rewards their ingenuity.

Where s 15(1)(a) provides that a patent for an invention may only be granted to “a person who is an inventor”, the reference to “a person” emphasises, in context, that this is a natural person. …. (emphasis supplied)

Given that conclusion, and the structure of s 15, Dr Thaler’s argument that he was entitled on the basis of ownership of the output of DABUS’ efforts was to no avail. At [113]:

… having regard to the view that we have taken to the construction of s 15(1) and reg 3.2C(2)(aa) [i]t is not to the point that Dr Thaler may have rights to the output of DABUS. Only a natural person can be an inventor for the purposes of the Patents Act and Regulations. Such an inventor must be identified for any person to be entitled to a grant of a patent under ss 15(1)(b)-(d). (emphasis supplied)

The Full Court then drew support from the High Court’s reasoning in D’Arcy v Myriad esp. at [6] in which the majority emphasised that patentable subject matter had to be the product of “human action”.

Although not put in this way, it is apparent that policy considerations played a significant role in their Honours’ conclusion. At [119] to [120], their Honours pointed out:

in filing the application, Dr Thaler no doubt intended to provoke debate as to the role that artificial intelligence may take within the scheme of the Patents Act and Regulations. Such debate is important and worthwhile. However, in the present case it clouded consideration of the prosaic question before the primary judge, which concerned the proper construction of s 15 and reg 3.2C(2)(aa). In our view, there are many propositions that arise for consideration in the context of artificial intelligence and inventions. They include whether, as a matter of policy, a person who is an inventor should be redefined to include an artificial intelligence. If so, to whom should a patent be granted in respect of its output? The options include one or more of: the owner of the machine upon which the artificial intelligence software runs, the developer of the artificial intelligence software, the owner of the copyright in its source code, the person who inputs the data used by the artificial intelligence to develop its output, and no doubt others. If an artificial intelligence is capable of being recognised as an inventor, should the standard of inventive step be recalibrated such that it is no longer judged by reference to the knowledge and thought processes of the hypothetical uninventive skilled worker in the field? If so, how? What continuing role might the ground of revocation for false suggestion or misrepresentation have, in circumstances where the inventor is a machine?

Those questions and many more require consideration. Having regard to the agreed facts in the present case, it would appear that this should be attended to with some urgency. However, the Court must be cautious about approaching the task of statutory construction by reference to what it might regard as desirable policy, imputing that policy to the legislation, and then characterising that as the purpose of the legislation …. (emphasis supplied)

Finally, in this quick reaction, it can be noted that the Full Court recognised that their Honours’ decision was consistent with the English Court of Appeal’s decision on the counterpart application. Their Honours considered, however, that there were sufficient differences in the legislative schemes that a wholly autochthonous solution should be essayed.

Commissioner of Patents v Thaler [2022] FCAFC 62 (Allsop CJ, Nicholas, Yates, Moshinsky and Burley JJ)


  1. Patent application No. 2019363177 entitled “Food container and devices and methods for attracting enhanced attention”
  2. With apologies to you know who.

DABUS “over there”

Judge Brinkema, sitting as a District Court Judge in the Eastern District of Virginia, has upheld the USPTO’s rejection of Thaler’s DABUS applications on the basis that DABUS cannot be an inventor under the US Act.

In the United States, Dr Thaler has two patent applications – US Application Serial Nos 16/524,350 and 16/534,532. In both, DABUS was the nominated inventor and Dr Thaler claims entitlement on the basis of assignment.

As you will no doubt recall, DABUS is a “creativity machine” or artificial intelligence.

To highlight the ludicrous, indeed fictional, nature of the universe we are operating in, Dr Thaler, as the owner of DABUS, executed the assignment to himself:

In view of the fact that the sole inventor is a Creativity Machine, with no legal personality or capability to execute said agreement, and in view of the fact that the assignee is the owner of said Creativity Machine, this Assignment is considered enforceable without an explicit execution by the inventor. Rather, the owner of DABUS, the Creativity Machine, is signing this Assignment on its behalf.

When the America Invents Act was passed, amongst other things it inserted a definition of “inventor” into the Act so that 35 USC §100(f) provides:

(f) The term “inventor” means the individual or, if a joint invention, the individuals collectively who invented or discovered the subject matter of the invention.

Perhaps (with respect) unsurprisingly, Judge Brinkema ruled that “individual” meant a natural person.

In doing so, her Honour was fortified by the natural or ordinary meaning of the word. Contextually, there were also other references in the Act where Congress had used the term “individual” in reference to the inventor. (For example, §115(a)(1) and (b)(2).)

In addition, the Supreme Court had construed the term “individual” in the Torture Victim Protection Act as referring to a “natural person”. And several Federal Circuit decisions had declared that “inventors must be natural persons” albeit not in the context of the meaning of §100(f).

Judge Brinkema then explained that Dr Thaler, “having neither facts nor law to support his argument”, contended that policy considerations and the general purpose of the Constitution’s Patent Clause required the statute to be interpreted to permit AIs to be inventors:

Allowing patents for AI-Generated Inventions will result in more innovation. It will incentivize the development of AI capable of producing patentable output by making that output more valuable …. Patents also incentivize commercialization and disclosure of information, and this incentive applies with equal force to a human and an AI-Generated Invention. By contrast, denying patent protection for AI-Generated Inventions threatens to undermine the patent system by failing to encourage the production of socially valuable inventions.

Patent law also protects the moral rights of human inventors and listing an AI as an inventor where appropriate would protect these human rights …. [I]t will discourage individuals from listing themselves as inventors without having contributed to an invention’s conception merely because their name is needed to obtain a patent. Allowing a person to be listed as an inventor for an AI-Generated Invention would not be unfair to an AI, which has no interest in being acknowledged, but allowing people to take credit for work they have not done would devalue human inventorship.

Judge Brinkema considered that binding rulings of the Supreme Court and the Federal Circuit repeatedly held that policy arguments could not override a statute’s plain language. Her Honour also pointed out that, when Congress passed the America Invents Act, AIs were in existence and it was aware of them. Moreover, the USPTO’s own consultations had not exposed any strong support for AIs to be inventors.

Ruling against Thaler, Judge Brinkema concluded:

As technology evolves, there may come a time when artificial intelligence reaches a level of sophistication such that it might satisfy accepted meanings of inventorship. But that time has not yet arrived, and, if it does, it will be up to Congress to decide how, if at all, it wants to expand the scope of patent law.

What does this mean for Australia?

Plainly, the American context is not directly applicable to Australia since, as Beach J pointed out at [118], our Act does not have a definition of “inventor”. So, there is much greater scope for policy arguments to operate.

In that connection, the USPTO report cited by Judge Brinkema can be found here.

Ordinarily, I would be on the side of progress: the NRDC view of the world rather than D’Arcy v Myriad. Our courts, of course, must fit within the D’Arcy v Myriad world view unless Parliament were to bestir itself.

Apart from South Africa (which I understand does not undertake substantive examination of patent applications), Dr Thaler’s applications have been rejected by the UKIPO and the EPO, as well as in the USA, on the ground that an AI is not an inventor. Government policy, which appears to have aligned with the Productivity Commission’s argument that Australia, as an intellectual property importing nation, should not be out of step with the international environment, would suggest that an AI should not qualify as an inventor. Can we really afford to keep repeating the mistake made in the 3M case? However, an appeal is pending in the EPO. Maybe there will be an appeal in the USA too, but the Federal Circuit’s prior indications do not augur well for its success.

It is also difficult to comprehend why, if, as our Courts have ruled, authors for copyright purposes must be humans, the same does not apply to inventors. Of course, our law now explicitly recognises moral rights as part of an author’s rights and there is no corresponding provision under Australian patent law. But both types of rights are justified by the same rationales – natural law or the Lockean theory of property and even the so-called utilitarian theory.

I guess we shall see.

Thaler v Hirshfeld ED VA, 2 September 2021 1:20-cv-903 (LMB/TCB)

Lid dip, Prof. Dennis Crouch at Patently-O.

DABUS Down Under take 3

Following last month’s ruling in Thaler that an AI could be an inventor for the purposes of Australian patent law, the Commissioner of Patents has announced her intention to appeal the decision to the Full Court.

Pursuant to s 158(2), the Commissioner requires leave to appeal. Bearing in mind that Beach J’s decision was the first substantive decision anywhere in the world to accept that an AI could be an inventor for the purposes of the Patents Act, however, that should not prove too much of an obstacle in this case.

Thaler v Commissioner of Patents [2021] FCA 879

Artificial intelligences and inventions Down Under

The Commissioner of Patents has rejected the DABUS application Down Under.

Stephen L. Thaler applied for a patent, AU 2019363177, entitled “Food container and devices and methods for attracting enhanced attention”. The application named the inventor as:

DABUS, The invention was autonomously generated by an artificial intelligence

As the application was made under the PCT, it was subject to a formalities check which, under reg. 3.2C(2)(aa), requires the Commissioner to check whether the named inventor has been identified.[1]

When the Delegate objected that an inventor had not been identified, Thaler explained why he considered DABUS was the inventor (in part):

The sole contributor to the invention is DABUS, an artificial intelligence machine that includes artificial intelligence programs written by the applicant. DABUS is capable of devising inventions without the involvement of a natural person who traditionally qualifies as an inventor. For the present invention, the machine only received training in general knowledge and proceeded to independently conceive of the invention and to identify it as novel and salient. How DABUS functions is described in detail in US Patent 10,423,875 and other patent specifications.

The Delegate understood from this response that DABUS is not a person as understood in law – an individual, a corporation or a body politic.[2]

There is no definition of “inventor” in the Act. Thus, Wilcox and Lindgren JJ had declared that the word bears its ordinary English meaning.[3]

At [12], the Delegate said:

…. Any standard dictionary shows that the traditional meaning of inventor is a person who invents. At the time that the Act came into operation (in 1991) there would have been no doubt that inventors were natural persons, and machines were tools that could be used by inventors. However, it is now well known that machines can do far more than this, and it is reasonable to argue that artificial intelligence machines might be capable of being inventors. I have no evidence whether the ordinary meaning of “inventor”, assessed at the present day, can include a machine. But if this were the ordinary meaning, would this be consistent with the other provisions of the Act?

So far as the other provisions and context provided any (limited) assistance, the Delegate considered at [20] that it was clear a patentee must be a person. This implied that an inventor also needed to be a person and, in any event, an inventor who was not a person could not be a patentee.

Although it was not part of the decision, it may also be noted that an author for the purposes of copyright must be a natural person. A computer-generated work is not an original work for the purposes of copyright as there is no author.[4] Of course, a patent can protect ideas or function while copyright protects the “expression” of ideas. At least to the extent that the rationale for granting protection in either system is the natural rights of a person to the fruits of their mental labour,[4] one would think the same considerations should apply.

Thaler has enlisted the services of Allens pro bono and appealed: No. VID 108/2021.

Stephen L. Thaler [2021] APO 5


  1. Correct identification of the inventor(s) is important as a patent can be revoked if it is not granted to an “entitled person” (or all the “entitled persons”: see s 138(3)(a)) and a person can be an “entitled person” only if they can trace their chain of title back to the inventor(s): s 15(1)(a).
  2. Citing Acts Interpretation Act 1901 (Cth) s 2C.
  3. Atlantis Corporation Pty Ltd v Schindler [1997] FCA 1105; 39 IPR 29 at 54.
  4. Telstra Corporation Limited v Phone Directories Company Pty Ltd [2010] FCAFC 149 at [90] (Keane CJ) and [117] – [119] (Perram J).