AI Sprawl – A Disaster Waiting to Happen

“To spread or develop irregularly or without restraint.”

Sprawl is probably most familiar as a description of the rapid and often poorly controlled spread of housing and commercial construction throughout much of suburbia.  We have found that it also aptly describes how artificial intelligence (AI) is being developed and used in many organizations.

Does this sound familiar?  Applications spring up from a variety of initiatives and stakeholders without consistent planning or control: tools and technologies vary from project to project, data is difficult to share, and incompatibilities eat time and resources.  If so, AI sprawl is likely occurring in your organization right now.  Sprawl is inefficient, but it is also risky: irregular and unconstrained growth creates an environment where disasters are waiting to happen.

In many organizations, leaders don’t know where all the AI applications are deployed or what risks their ungoverned, unchecked use creates.  A leader must first accept that there is likely a problem with how AI is developed and used within the organization.  Proper AI oversight and governance are required to protect the organization from these risks.

To make matters more challenging, the nature of AI makes these applications difficult to understand, even for the people who build them.  The term “black box” aptly describes most people’s understanding of what happens inside an AI application.  Fortunately, managers don’t need to become data scientists to manage the AI sprawl in their organization.  Rather, leaders need tools and processes that provide proper oversight of AI.

Taming Your AI Sprawl

To start getting a handle on AI sprawl, the first step is to identify all AI used within the organization, including third-party AI solutions offered by vendors.  From human resources to sales, nearly every department may be using an AI-enabled tool that belongs in the organization’s AI inventory.  Don’t be surprised if the use of AI is more extensive than initially assumed.
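To make the idea of an inventory concrete, here is a minimal sketch of what a single inventory record might capture.  The fields and example entries are hypothetical, not a prescribed schema; adapt them to whatever your organization actually needs to track.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AIInventoryItem:
    """One hypothetical record in an organization's AI inventory."""
    name: str              # what the system is called internally
    owner: str             # accountable department or person
    vendor: Optional[str]  # None if built in-house
    purpose: str           # the decision or task the system supports
    data_sources: List[str] = field(default_factory=list)
    deployed: bool = False

# Illustrative entries spanning departments, per the paragraph above.
inventory = [
    AIInventoryItem(
        name="Resume screener",
        owner="Human Resources",
        vendor="ExampleVendor Inc.",  # a hypothetical third-party solution
        purpose="Rank incoming job applications",
        data_sources=["applicant resumes"],
        deployed=True,
    ),
    AIInventoryItem(
        name="Lead scoring model",
        owner="Sales",
        vendor=None,                  # built in-house
        purpose="Prioritize sales outreach",
        data_sources=["CRM records"],
        deployed=True,
    ),
]
```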

The next step is to adopt an ethical framework for assessing the risk associated with the items in your inventory.  A multitude of frameworks are currently being proposed and refined.  If you are in a regulated industry, check what your regulators are doing.  Otherwise, it matters less exactly which framework you adopt than that you adopt one.  A set of standards is required to start measuring what you don’t know and to determine what information you will need.
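Once a framework is in place, each inventory item can be screened against its criteria.  The sketch below is purely illustrative: the criteria, weights, and tier thresholds are invented for this example, and a real assessment would use the dimensions defined by whichever framework you adopt.

```python
# Hypothetical risk screening: score each inventory item against a few
# framework-style criteria.  Criteria, weights, and thresholds here are
# invented for illustration only.

CRITERIA_WEIGHTS = {
    "affects_individuals": 3,  # decisions about people carry more risk
    "uses_personal_data": 2,
    "third_party_opaque": 1,   # vendor model you cannot inspect
    "no_human_review": 2,
}

def risk_score(answers: dict) -> int:
    """Sum the weights of every criterion answered 'yes'."""
    return sum(w for c, w in CRITERIA_WEIGHTS.items() if answers.get(c))

def risk_tier(score: int) -> str:
    """Bucket a score into a coarse tier a leader can act on."""
    if score >= 6:
        return "high"
    return "medium" if score >= 3 else "low"

# Example: screening the hypothetical vendor-supplied resume screener.
answers = {
    "affects_individuals": True,
    "uses_personal_data": True,
    "third_party_opaque": True,
    "no_human_review": False,
}
score = risk_score(answers)
print(score, risk_tier(score))  # -> 6 high
```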

Once you have established a framework and started collecting information, you can begin to understand how much sprawl your organization has and the magnitude of the associated risks.  If your organization’s AI sprawl is low, that is great; the best way to deal with AI sprawl is to avoid it from the outset, but the hard work of governance and oversight is still needed.  If AI sprawl is high, it is never too late to start developing processes and controls.  The good news is that a growing number of tools can help with collecting data, implementing controlled processes, and providing the required oversight of AI applications in every phase, from development or procurement through operations.  Organizations need well-defined, standardized processes and tools that enable each group – data scientists, compliance professionals, organizational leaders – to oversee the AI portfolio.
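Tying the two earlier sketches together, a simple roll-up like the one below gives leaders a portfolio-level view without requiring them to understand any individual model.  It assumes the hypothetical inventory and risk_tier helper shown above.

```python
from collections import Counter

def portfolio_summary(assessed):
    """Count assessed systems by risk tier.

    assessed: list of (AIInventoryItem, score) pairs produced by
    screening each inventory item with risk_score() above.
    """
    return dict(Counter(risk_tier(score) for _, score in assessed))

# Example usage (answers_for is a hypothetical helper that gathers the
# screening answers for one inventory item):
# assessed = [(item, risk_score(answers_for(item))) for item in inventory]
# portfolio_summary(assessed)  # -> e.g., {"high": 2, "medium": 5, "low": 11}
```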

An organization’s AI is a reflection of its leaders.  Will your AI be a hot mess of sprawl?  Leaders can only manage the risk from AI sprawl if they first admit to a lack of AI oversight.  Only then can they implement processes and adopt tools to manage their AI’s risk and better serve the organization, its clients, and its customers.

