The emergence of Generative AI presents corporate boards of directors with an immediate challenge. Will Generative AI disrupt companies and entire industries? Some estimates indicate that Generative AI will automate over 40 percent of business tasks and create more than $400 billion in business value. ChatGPT, the public-facing application of Generative AI, attracted over 100 million users within weeks of its release. The potential impact extends to job displacement. What if a large majority of white-collar tasks can be performed more effectively with AI?
I recently attended the Wall Street Journal Tech Live event, and wrote about Artificial General Intelligence (AGI) And The Coming Wave. At the WSJ event, venture investor Vinod Khosla forecast that “AI will be able to replace 80% of 80% of all jobs within 10-20 years”. Author and AI pioneer Mustafa Suleyman noted, “Within the next few years, AI will become as ubiquitous as the Internet”, and asked, “Will AI unlock secrets of the universe or create systems beyond our control?”
What is the responsibility of corporate boards when it comes to Generative AI? Are corporate board members sufficiently equipped to consider the opportunities as well as risks, and guide corporations through their shareholder and stakeholder responsibilities? While Generative AI has the potential to revolutionize the way we do business, there is equal potential for good or harm, at scale. These are the risk and reward factors that corporate board members must consider.
Generative AI and the responsibilities of corporate boards of directors was the topic of discussion at a November 1 meeting of the New York chapter of the National Association of Corporate Directors (NACD). The discussion was hosted and moderated by Ash Gupta, the former and longtime Global President of Risk and Information Management for American Express. I was a guest panelist along with Heidi Lanford, the former Global Chief Data Officer for Fitch Group, which comprises Fitch Ratings and Fitch Ventures and is wholly owned by the Hearst Corporation. The NACD discussion focused on the steps and actions that corporate boards must undertake to safely embrace Generative AI. These include:
1. Strategic implications of bringing AI into the corporation
2. The role of boards as enablers
3. Legal, fairness, and transparency considerations
4. Monitoring, learning, and accelerating progress
Potential risk for any company will depend upon the nature of the business problem that Generative AI is being used to solve. Examples include creating operating efficiencies, enhancing customer cross-sell, improving risk management, or driving product and servicing innovation. The recommended course of action will depend upon factors including industry regulation, the skill sets of the organization, and whether safeguards have been put in place to mitigate potential risks. Heidi Lanford notes, “Monitoring and governance is needed. However, for use cases on the ‘offense’ side of AI, I prefer to set up guardrails as opposed to heavy-handed governance”.
How prepared are corporate board members for Generative AI? Author Tom Davenport, in a recent Forbes article, Are Boards Kidding Themselves About Generative AI?, raises a warning flag. Davenport notes that 67% of board members interviewed for a recent industry survey characterized their knowledge of Generative AI as “expert” (28%) or “advanced” (39%). Davenport expresses his skepticism, noting, “This level of expertise among board members seems rather unlikely. I doubt that 29% of even formally trained computer scientists fully understand the underlying transformer models that make generative AI work. I have been studying them for a couple of years now, and I wouldn’t put myself in the expert category”. Board directors may want to take heed.
The limitations of board understanding may not be unique to the current experience of boards with Generative AI. A recent Wall Street Journal article headlined Boards Still Lack Cybersecurity Expertise noted that “Just 12% of S&P 500 companies have board directors with relevant cyber credentials”, referencing a November 2022 WSJ research study showing that just 86 of 4,621 board directors at S&P 500 companies had relevant experience in cybersecurity. Given the newness of Generative AI, the level of relevant experience there is likely even lower.
One solution may be to recruit new corporate board members who possess skills in this area. Inderpal Bhandari, who previously served as Chief Data and Analytics Officer for IBM, recently joined the board of directors of Walgreens Boots Alliance. Bhandari notes, “Cybersecurity threats and technology-driven reinvention of business models and products, are the needs of the day. Today’s board must possess not just tech-savvy but perhaps even a technical instinct to provide effective governance.” He adds, “While well-established education opportunities for cybersecurity are readily available for board directors, that is not the case for strategic technologies such as data and AI. There is an urgent need to boost board literacy in that direction.”
Ash Gupta suggests, “It will be the responsibility of corporate board members to understand each company’s readiness to leverage generative AI in ways that create competitive excellence, as well as mitigate business and stakeholder risk”. He notes that while over 95 percent of board members believe in the need for AI, just 28% of companies have made realistic progress. He continues, “Boards require personal commitment to developing a deep understanding of how GenAI works, how it can revolutionize the company, and perhaps most important what it cannot do”. Gupta adds, “This commitment must be both a one-time formal education and ongoing learning.”
To this end, Gupta outlines a series of steps that companies can undertake to prepare corporate boards for a Generative AI future. These steps include:
1. Create critical training for the board and corporate leadership so they have an educated understanding of what Generative AI makes possible and what its limitations are.
2. Create a test-and-learn culture that recognizes that many ideas that initially look promising may not prove to be the best fit for the organization.
3. Think through how best to extend the knowledge of company teams through external collaborations. These might include sources of data relevant to your industry, control mechanisms, talent, and tools.
4. Make it a priority to discuss progress and updates no less frequently than at every other board meeting.
5. Keep track of how industry-leading companies are employing Generative AI.
Gupta and Lanford agree that corporations and their board members must remain vigilant. Lanford cautions, “AI must be a team sport. Boards should see evidence of broad participation. Expect that AI ideas are being solicited from across the workforce, and not just the technical experts”. And while there may be broad agreement on the need for regulation of Generative AI, it has been noted that while technology evolves week by week, legislation often takes years. Gupta adds, “As a CEO and a Board Member, delegate to your CDO, CAO or CIO, but do not abdicate your authority”.
Lanford concludes, “Boards can ensure that there is a greater chance of success with Generative AI if there is a culture of experimentation and failure, which balances how use cases are moved into production”. Gupta echoes this sentiment, commenting, “Catastrophic mishaps can happen if your people and processes are not adequately trained. Effective implementation will require both technical and leadership understanding”. He adds, “Most likely, early ideas will not produce the desired outcomes. A deep commitment to creating a test-and-learn culture will!”