Presented by GAI | NVIDIA
Artificial intelligence can help agencies better serve their constituents in a variety of ways, from increasing citizen safety to supporting first responders during environmental disasters.
While AI and machine learning have significantly impacted businesses in the private sector, the federal government has yet to adopt AI wholesale: only 25% of agencies report using AI solutions, according to a recent study published by SAS. As AI drives innovation and revenue in the private sector, one may wonder why so many federal agencies are dragging their feet on adoption.
While the public sector certainly has its own set of challenges when it comes to AI adoption, according to Tony Paikeday, senior director of artificial intelligence systems at NVIDIA, some of the problems the federal government grapples with are similar to ones that private companies have previously addressed, including:
- Legacy hardware infrastructures, which are not optimized to handle AI computational demands.
- The complexity of integrating an IT platform that includes AI computing, high-performance storage and networking.
- Lack of skills in taking AI from prototype to production, and the organizational gap that separates data scientists and IT.
Combined, these create an environment that is by and large “not optimized for the unique demands of AI development,” says Paikeday. “AI requires massive computational power that can process the equally massive parallel architecture of today’s AI and deep neural networks ... AI development requires high-performance computing, storage and networking interconnects with ultra-low latency.”
These challenges ultimately delay an agency’s adoption of AI-based solutions. Rather than attempting to address them alone, agencies should look for a partner like GAI, which has been NVIDIA’s partner of the year, offering end-to-end IT solutions and deployments of AI infrastructure. Still, amid economic turbulence, a national health crisis and other pressing priorities, AI may not seem to belong at the top of an agency’s list.
In reality, AI solutions can help deliver many of the mission-critical services agencies need to tackle these priorities.
“We see AI increasing drug safety through the application of deep learning on labeling,” says Paikeday, expounding on the types of solutions agencies might make use of to grapple with today’s largest challenges. “In clinical research, we see AI supporting trials, and in the field, we see AI enabling predictive maintenance ensuring maximized uptime for assets and fleets.”
Laying the Groundwork for AI Solutions
Before agencies make use of AI, Paikeday recommends leadership take the time to create a unified AI implementation strategy across the entire organization. NVIDIA and GAI have worked with various federal agencies to stand up centers of excellence that guide this unified adoption strategy.
“When you centralize AI development, you’ll immediately see benefits across people, process and platform. You’ll consolidate expertise and [be able to] share best practices and solve your AI talent problem by growing expertise from within,” says Paikeday. “It will also eliminate stranded infrastructure silos and the ‘shadow AI’ problem that occurs when developers spawn their own platforms outside of IT, wasting CapEx and OpEx.”
Agencies looking to start the centralization process should begin with proven infrastructure that is purpose-built for the unique demands of AI development rather than relying on the status quo. The right platform strikes the optimal balance of compute, storage and networking; includes tools that streamline the AI workflow across data science and IT teams; and is supported by partners who are AI-fluent. When leadership starts with a platform that brings experts, technology and IT together, the agency benefits as a whole: developers across the agency subscribe to a single unified platform instead of numerous siloed AI platforms that compete for budget and support, yielding efficiencies and savings across the organization.
Getting Technical: Infrastructure Challenges and Solutions
Naturally, when agencies begin to delve into the technical requirements of AI infrastructure, the conversation may drift toward the cloud as the de facto approach for supporting most IT workloads. And although many organizations consider themselves “cloud-first” or “cloud-only,” Paikeday calls on agencies to approach AI from a hybrid perspective.
“It’s important to recognize that both cloud and on-premises infrastructure have a useful place in the AI development journey,” Paikeday notes. “Cloud is not the hammer for every AI nail. It’s a great way to engage in early productive experimentation ... but eventually through ongoing iteration, we’ve found that AI models start to grow more and more complex in pursuit of delivering higher predictive accuracy, while datasets fueling AI development begin to exponentially increase in size.”
If agencies are not careful, this growth can lead to a common pitfall Paikeday calls “data gravity”: the tendency of large datasets to attract applications and resources toward them, because the data becomes increasingly time-consuming and costly to move and store anywhere other than where it was created. As an agency spends more time and money moving AI datasets to the cloud, it is essentially fighting the pull of data gravity, incurring skyrocketing costs for hosting, replicating and accessing the information on an ongoing, cyclical basis.
“If you try to fight data gravity you’re going to lose, and by losing, I mean you’re just going to spend more time and money pushing very large data sets from where they’re generated to where your computing is,” Paikeday explains.
To safeguard against sprawling data gravity costs and the speedbumps they impose on the AI development cycle, agencies should implement hybrid cloud plans before implementing any AI measures. Hybrid infrastructures that combine on-prem and cloud resources are the leading approach to AI computing. Compared with a purely public or private cloud, a hybrid approach lets agencies take advantage of the fixed cost of on-prem infrastructure sized to meet the organization’s ongoing, steady-state demands, while harnessing the agility of the modern cloud for temporary or seasonal requirements that exceed that capacity, or for workloads associated with early experimentation. This hybridized approach follows the useful mantra “own the base, rent the spike.”
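The “own the base, rent the spike” idea can be sketched as a simple cost model. Everything below is a hypothetical illustration: the function name, unit costs and demand figures are assumptions for the sake of the example, not vendor pricing.

```python
# Illustrative sketch of "own the base, rent the spike."
# All figures are hypothetical assumptions, not actual pricing.

def monthly_cost(base_units: int, demand_units: int,
                 onprem_unit_cost: float = 1.0,
                 cloud_unit_cost: float = 2.5) -> float:
    """Fixed cost for owned on-prem capacity ('the base') plus
    pay-as-you-go cloud cost for demand above it ('the spike')."""
    spike = max(0, demand_units - base_units)
    return base_units * onprem_unit_cost + spike * cloud_unit_cost

# Steady-state demand of 10 units, with an occasional spike to 14:
steady    = monthly_cost(base_units=10, demand_units=10)  # 10.0 (base only)
spiked    = monthly_cost(base_units=10, demand_units=14)  # 20.0 (base + 4 rented)
all_cloud = 14 * 2.5                                      # 35.0 if everything is rented
```

Under these assumed unit costs, renting only the spike is cheaper than renting everything, which is the intuition behind sizing on-prem infrastructure to the steady state and bursting to cloud only when demand exceeds it.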
When agencies can leverage on-prem infrastructure, like solutions designed by GAI featuring NVIDIA’s DGX system (eight NVIDIA GPUs and two 2nd Gen AMD EPYC™ processors), data scientists can quickly and efficiently train models without fear of overrunning their budgets, says Paikeday. In turn, these data science “artisans” can build better, more creative applications with the highest predictive accuracy in the shortest timeframe possible.
“There’s a benefit in a fixed-cost infrastructure that supports rapid iteration at the lowest cost per training run,” says Paikeday. “For this reason, I think many agencies and organizations that might have been cloud-first are now realizing the importance of hybrid infrastructure for AI.”
Learn more about solutions designed by GAI featuring the NVIDIA DGX A100, powered by eight NVIDIA GPUs and two 2nd Gen AMD EPYC processors, and how to help your agency supercharge its efforts to deliver mission-critical services.
This content is made possible by our sponsor(s) GAI and NVIDIA; It is not written by and does not necessarily reflect the views of GovExec's editorial staff.