OECD's AI Standards Lack Force, But Could Help Navigate Development Risk

Fenwick intellectual property partner Stuart Meyer talked to Legaltech News about a new set of intergovernmental standards for the use of artificial intelligence, laid out by the Organization for Economic Cooperation and Development (OECD).

Discussing these new expectations for AI, Meyer said, “I think you will see people giving very close attention to these things, whether they are the force of law or not. We have de jure standards and de facto standards all the time, and some de facto standards just become sort of absolutely required. People expect them.”

AI regulation has become a greater focus in the United States in recent years. Meyer compared the increased attention on AI to the attention on privacy, noting that interest in privacy has exploded over the past 10 to 20 years and that, like privacy regulations, these new AI standards will likely have different impacts across global jurisdictions. “I think that you’ll see as those get implemented in different places, they’re going to reflect the norms in those societies,” Meyer said.

In addition, these global standards may positively impact private industry. Meyer believes that having a set of criteria to work towards could help with the design and implementation of AI products.

It is more efficient for companies to build to the new standards while developing products than to retrofit compliance later. “If you create a jet airplane and then realize it only has one engine and redundancy really requires two engines, you can’t just tape another engine onto the prototype. You have to start all over,” Meyer said.

The full article is available on Legaltech News.

Meyer also co-authored a new ITechLaw book, which covers guidelines for the responsible development, deployment and use of artificial intelligence.
