What Businesses Need to Know about the U.S. Executive Order on AI

Recently, U.S. President Joe Biden issued an executive order designed to install guardrails on generative artificial intelligence (AI), and it carries many implications for businesses. Human Resources leaders must be mindful of how these new regulations apply to the workforce.

The comprehensive order provides guidance on the development and use of a technology that has been unregulated up to this point. Biden aims to protect privacy, promote equity and fairness, and shield workers from generative AI, which has shown promise in helping people become more productive while also threatening their livelihoods and potentially endangering national security and individual safety.

Ultimately, with this executive order, the President is trying to ensure that the United States, the leader in generative AI development, uses the technology for good rather than evil. Many experts have suggested that this is an attempt to learn from the mistakes leaders made in the early days of social media. The executive order covers many aspects of generative AI and its potential dangers, but here are the most important details for business and the workforce:

Risk Management Is a Must

Human Resources leaders should pay close attention to how they are using artificial intelligence. Recruiting, cited by 43% of respondents, topped the list of HR areas most likely to be affected by generative AI, according to the State of HR survey. But one of the biggest concerns people have is the bias inherent in these platforms.

Those using AI for recruiting or talent decisions must be aware of the consequences if they do not put a human check on that potential bias. After all, under the executive order, the Department of Justice and federal civil rights offices will coordinate on best practices for investigating and prosecuting civil rights violations related to AI. Working closely with employment lawyers, checking one's moral compass, and having a secondary or even tertiary check on practices are must-dos when using generative AI.

Behave Yourself

“Develop principles and best practices to mitigate the harms and maximize the benefits of AI for workers by addressing job displacement; labor standards; workplace equity, health, and safety; and data collection. These principles and best practices will benefit workers by providing guidance to prevent employers from undercompensating workers, evaluating job applications unfairly, or impinging on workers’ ability to organize,” according to the White House’s fact sheet on the executive order.

This part of the executive order is probably the most significant for business. However, the wording is vague, and it is unclear exactly how leaders should respond. The point, though, is that employers must respond. AI experts have warned that leaders must act now to set guidelines and determine how to use this powerful tool sensibly, in a way that does not harm people. If employers wait, the guidelines will be set by others who do not necessarily have the best intentions, or by the machines themselves. This executive action puts the onus squarely on employers and provides a bit of a push.

“We have a moral, ethical, and societal duty to make sure that AI is adopted and advanced in a way that protects the public from potential harm,” said Vice President Kamala Harris at the White House, as reported by The New York Times. “We intend that the actions we are taking domestically will serve as a model for international action.”

It’s important to note that Europe is already moving forward with its own regulations on AI.

Expect More Information

Human Resources professionals must continue to monitor the government's response to AI. The government will produce a report with suggestions for “strengthening federal support for workers facing labor disruptions, including from AI,” according to the fact sheet. Employers should expect more direction once this report is completed and released.

In the meantime, the number of workers using generative AI will grow, and companies will consider how they can best leverage the technology.

Get More Support for Global Hiring

Developing and using generative AI, at least at this point, requires skilled workers. The government recognizes the challenges employers face in global hiring, and it addresses the problem in the executive order:

“Use existing authorities to expand the ability of highly skilled immigrants and nonimmigrants with expertise in critical areas to study, stay, and work in the United States by modernizing and streamlining visa criteria, interviews, and reviews.”

Visas have been tricky for employers for some time now, and this provision may make it easier to gain access to skilled workers from abroad. On the flip side, the order also looks beyond U.S. borders: the government expresses interest in collaborating with global partners and leading efforts to develop and manage AI.

Embrace Regulation

Usually, big businesses want the government to stay out of their work as much as possible. But regulating generative AI garners a different reaction, according to the Times:

“While businesses often chafe at new federal regulation, executives at companies like Microsoft, Google, OpenAI, and Meta have all said that they fully expect the United States to regulate the technology — and some executives, surprisingly, have seemed a bit relieved. Companies say they are worried about corporate liability if the more powerful systems they use are abused. And they are hoping that putting a government imprimatur on some of their A.I.-based products may alleviate concerns among consumers.”

Some are worried that these kinds of regulations could stifle creativity and innovation in the development and advancement of AI. Tech companies that are developing AI must “red team,” or seek out problems with, the technology, and many of them are already doing this voluntarily. However, Biden’s executive order requires the government to set new standards and tests for this red teaming. In addition, companies must inform the government of any potential problems found before releasing these systems, according to NPR. This will be of interest at companies where HR is charged with compliance and regulatory duties.

This executive order on AI is only a first step, and it is quite limited in scope. Biden has suggested that it is not the ideal solution; he would prefer that Congress devise actual legislation around the development and use of AI. In the meantime, the executive order is on the books for at least the duration of Biden’s presidency. The problem with executive orders has always been that a new president can undo them, which is why a law is more permanent and preferred.

By Francesca Di Meglio

Originally posted on HR Exchange Network

