At the Exeter Blockchain and Law Conference 2023, Nigel Hanley, head of the examining group at the U.K. Intellectual Property Office (UKIPO)—the organization tasked with reviewing, granting, or denying patents in the country—outlined some of the challenges artificial intelligence (AI) is throwing up, including whether an AI can be an inventor.

“We aren’t here to turn down patents, we’re here to encourage innovation,” said Hanley. “But we don’t have a lot of case law on AI, it’s very new.”

To this end, Hanley explained that due to the lack of case law (precedent set by previous patent-related court cases), there is currently a lack of guidance from the courts over how AI should be treated. This goes for the technology’s application in many fields, not least patent claims.

The UKIPO does, however, have a working definition of AI:

“Technologies with the ability to perform tasks that would otherwise require human intelligence, such as visual perception, speech recognition, and language translation.”

So, what are the current guidelines that the UKIPO follows when it assesses patent claims that include or revolve around such a technology?

“Do not depart from established positions on computer programs, same rules apply,” said Hanley.

In the U.K., certain things are considered not to be inventions, known as “exclusions,” for the purposes of obtaining a patent and would thus be denied, namely:

  • A discovery, scientific theory, or mathematical method;
  • A literary, dramatic, musical, or artistic work (which are generally covered by copyright law);
  • And a scheme, rule, or method for doing business, performing a mental act, or playing a game, or, crucially, a “program for a computer.”

The invention must make a “technical contribution” to get around this last category. This, explained Hanley, means it must solve a problem lying outside the computer, solve a technical problem within the computer, and/or constitute a new operation of a computer in a “technical sense.”

In short, when it comes to an AI invention, it must be able to answer the question, “Is it more than just a computer program?”

Hanley gave various scenarios to contextualize this point of principle and help explain the general rules, but was keen to emphasize that they were “example scenarios,” not official guidelines.

One was an AI trading assistant, which would not be given a patent because it amounts to a “method of doing business”: perhaps a more efficient method, but one for a business that already exists. Another example was an AI that asks users questions, then analyses the answers and produces a financial plan. This, too, is concerned with a method of doing business, i.e., offering financial advice, and as such would be denied a patent.

Both examples also fail to qualify via the “technical contribution” route to a patent, as they don’t solve a problem outside or inside the computer or constitute a new operation of a computer in a technical sense.

Essentially, Hanley’s message was that getting an AI-related patent approved is not straightforward, but it is not hopeless either, provided the exclusions and the technical contribution requirement are kept in mind. He demonstrated this when he spoke about AI datasets.

“It’s been suggested that the most valuable bit of AI is the dataset used to train it,” said Hanley. This, he continued, means businesses and developers are often keen to try and protect them legally through patent applications, of which the UKIPO has apparently received plenty.

Unfortunately for those applicants, “a dataset on its own is unlikely to be patentable,” warned Hanley. “It’s just information.”

The dataset would still need to fall outside of the exclusion list and demonstrate technical contribution. For example, Hanley suggested that an AI that identified and manipulated current data as part of an improved training method would actually be granted a patent because it demonstrates a technical contribution—a new way of technically operating a computer.

This broadly covers Hanley’s talk on what type of AI invention would be granted a patent. However, after he had finished his prepared remarks, he took questions from the audience, a couple of which raised a more contentious issue around AI and patents.

Can an AI be an inventor?

When asked, “Have you received patent applications drafted by AI?” Hanley answered, “We wouldn’t know.”

In other words, if a well-trained generative AI filled in an application form, it might be difficult to know whether a human had written it. And if it was filled in appropriately, it wouldn’t matter anyway, as the core invention, the subject of the application, was still made by a human.

However, another audience member followed up with a more difficult question to answer: “Can you patent an invention made by AI, or does it have to be a human?”

Hanley, clearly being careful not to step outside the limits of what he’s allowed to say, responded that “there is a case before the Supreme Court. At the moment, it’s up to the Supreme Court to decide if AI can be an inventor.”

The case in question is that of entrepreneur Dr. Stephen Thaler, who is seeking to patent inventions that he claims were derived from an AI program called ‘DABUS.’ Thaler believes the owner of AI systems should be the default owner of patents for inventions derived from those systems and that it should be possible to name those AI systems as inventors on patent applications.

The UKIPO disagreed, and its findings were upheld by both the High Court and Court of Appeal in London.

Thaler has now taken it to the U.K. Supreme Court, which will have to determine three key points of law that, depending on the decision, could affect how the U.K. Patents Act 1977 is applied in relation to AI going forward.

Specifically: 1) whether a patent requires a human or a company to be named as the inventor in all cases; 2) whether a patent can be granted without a named human inventor; and 3) whether, when an invention is made by an AI, the owner, creator, and user of that AI machine is entitled to the grant of a patent for that invention.

Hanley pointed out that the U.S. and Australia have both refused such an AI patent, and the only country in the world that has thus far allowed one is South Africa.

In terms of his own opinion on whether an AI can be an inventor, as a government employee, Hanley was unable to give an official stance, simply saying, “I’ll reserve judgment.”

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Artificial intelligence needs blockchain
