By Helen Li*
On March 15, DeepMind’s AlphaGo, a computer powered by a self-learning artificial intelligence program, defeated Go grandmaster Lee Sedol. As the AI community celebrates this major milestone in making machines smart, the debate over “man vs. machine” is heating up. Over the past 25 years, and especially the last five, the AI community has transformed theoretical machine learning constructs into techniques that solve useful problems. AI techniques such as self-learning, reinforcement learning, and deep neural networks have been developed to recognize traffic signs and classify images.
The recent rapid progress in AI has been powered by a dramatic increase in financial investment in AI. According to data compiled for Bloomberg News, AI startups have raised $967 million in funding since 2010, with a nearly sevenfold increase from $45 million in 2010 to $310 million in 2015. The past 25 years have also witnessed a proliferation of patents in AI technologies, from core AI technologies to a wide variety of application domains, with almost as many U.S. patents and applications mentioning AI in the past six years as in the previous 20 years.
By law, all of these patents mentioning AI were awarded to human inventors, not to AI computers. But if a computer can defeat humans at a game that requires highly sophisticated, strategic and creative problem solving, what does this mean for the nature of creativity? Could a computer qualify as the inventor of an invention when it solves a problem identified by a human? And what if a creative computer with AI itself identifies a problem and creates the solution, e.g., dynamically optimizing a quasi-optimized system design based on machine learning? The current law on its face does not appear to bestow inventorship status on a computer.
An inventor under the present statutes is an “individual” who contributes to the “conception” of an invention. Section 100(f) of the Patent Act defines “inventor” to mean “the individual … who invented or discovered the subject matter of the invention.” The term “individual” excludes even legal entities such as corporations because “people conceive, not companies,” as held by the court in New Idea Farm Equipment Corporation v. Sperry Corporation and New Holland Inc., 916 F.2d 1561 (Fed. Cir. 1990). The court in Townsend v. Smith, 36 F.2d 292, 295, 4 USPQ 269, 271 (CCPA 1930), defined “conception” as “the complete performance of the mental part of the inventive act” and “the formation in the mind of the inventor of a definite and permanent idea of the complete and operative invention as it is thereafter to be applied in practice.”
The requirement of contribution to the conception of an invention appears to be a barrier to a computer qualifying as an inventor. The rationale is that humans uniquely engage in the act of creative conception. This is likely true for inventions based on earlier AI techniques, where humans handcrafted instructions and provided those structured instructions to a computer to solve a problem well defined by humans. Arguably, the computer merely functions as an aid or tool to a human problem solver and thus does not contribute to the conception of an invention. It is the human problem solver who uses the computer to generate such a contribution. Thus, under the present statutes, it seems highly unlikely that a computer, even one as smart as AlphaGo, would qualify as the inventor of an invention when the computer solves a problem identified by a human.
If the contribution requirement for inventorship is met when a single entity independently engages in “the complete performance of the mental part of the inventive act,” can a computer qualify as an inventor when the computer itself identifies a problem and creates the solution to that problem? Arguably, what matters here is not where the conception occurs (“the mind”) but what is formed (“a definite and permanent idea”). In that case, the rationale for denying a computer inventorship may be challenged in the era of true AI, where computers can conduct self-learning and deep learning in solving a problem and create a definite and permanent representation of the solution. Taking AlphaGo as an example, AlphaGo “has the ability to look ‘globally’ across a board and find solutions that humans either have been trained not to play or would not consider,” says DeepMind co-founder Demis Hassabis. If an AI computer engages in such solution finding, arguably, the AI computer has met the inventorship requirement and should be named the inventor, not its human handler.
Even assuming that some “mind” is a requirement of inventorship (e.g., forming “a definite and permanent idea of the complete and operative invention”), what kind of mind is necessary? The courts do not appear to have provided clear standards delineating between a mind that qualifies for inventorship and one that does not. Congress’s abolition of the “flash of genius” test for patentability in 1952 indicated that what matters is the advancement of science or the useful arts achieved by the invention, not the inventor’s mental process. The rationale was that “patentability ought to be determined objectively by the nature of the contribution to the advancement of that art, and not subjectively by the nature of the mental process by which the invention may have been achieved.” In the words of Section 103 of the Patent Act, “patentability is not to be negatived by the manner in which the invention was made.” Applied here, a mental-process requirement for inventorship should not deny a computer inventorship of an invention when the computer independently contributed a solution to a problem and the solution is evaluated “objectively by the nature of the contribution to the advancement of that art.”
Another potential path to awarding a computer inventorship is through joint inventorship. If the inventive act is a collaborative process between human and computer, can the human and the computer be joint inventors? Section 116(a) of the Patent Act describes joint inventors as the “two or more persons” who conceived the invention. Two possible barriers to a computer being named a joint inventor are the collaboration requirement and the requirement of contribution to the invention. Joint inventorship requires “some form of collaboration,” where joint inventors must be “working toward the same end,” as announced by the court in Kimberly-Clark Corp., 973 F.2d 911 (Fed. Cir. 1992). Conceivably, the collaboration requirement can be met when a human develops a computer program that a computer executes to create a concrete result. As discussed above, the more formidable barrier to a computer qualifying as a joint inventor is the requirement of contribution by the computer to the invention. Thus, it seems less likely, at least under the current law, that a computer would qualify as a joint inventor.
As AlphaGo-like computers continue to help humans predict the unpredictable and make rapid breakthroughs, they raise important questions about inventorship and challenge our present patent system. A well-functioning patent system in the digital age may require a rethinking of inventorship by our courts and legislature.
*Helen Li is an associate in the Patent Group of Fenwick & West.
*The perspectives expressed in the Bilski Blog, as well as in various sources cited therein from time to time, are those of the respective authors and do not necessarily represent the views of Fenwick & West LLP or its clients.