recommended reading


How the Digital Transformation Will Upend the Pentagon

By Nick Michaelides // April 29, 2016


Nick Michaelides is a U.S. federal leader at Cisco.

Take a look around and you’ll see it affecting everything. What is it? Digitization.

Drones flying overhead, providing new perspectives and creating new video data. Smart vehicles parking themselves and autonomous cars cruising the highways without drivers. Wearables recording exercise and calories to improve activity level and health. Messaging and video conferencing tools connecting individuals and boardrooms instantly despite being oceans apart. Students in classrooms, coffee shops and living rooms sharing notes and working on assignments in a collaborative, online environment.

All of these are examples of the power of digitization: bringing people, processes, data and things together to create fully connected environments to improve and simplify information sharing, collaboration, operations and decision-making.

Generating these outcomes is important for meeting the needs of any business or government organization, but there are few places where it is more critical than in the battle space.

The Defense Department is one of the most complex and widespread organizations in the world, but in today’s battle space, embracing digitization helps pull together all aspects of defense. Through the latest technologies and solutions, digitization enables joint operations on the battlefield and facilitates real-time...

When it Comes to Engagement With Citizens, the Government Is Finally Paying Attention

By Teresa A. Weipert // April 22, 2016


Teresa A. Weipert is senior vice president at Sutherland Government Solutions.

There is an old saying in retail marketing that “the customer is always right.”

Unfortunately, over the past few decades it has been hard for the public sector to follow that adage. The acceleration of technological changes in how the private sector delivers goods and services has raised expectations among citizens that government agencies can do the same – or even find ways to do better.

In order to meet these expectations, it has become clear that government agencies must adapt to a cultural shift.

They must adopt a new citizen engagement strategy involving technology, policy, programs, best practices, intra/interagency collaboration, customer-friendly interactions and mechanisms for feedback on service delivery.

When it comes to improving customer experience, the federal government is finally paying attention.

For instance, there is a provision included in the fiscal 2016 spending bill directing the Office of Management and Budget to report on agencies’ progress in developing customer service standards and incorporating them into performance plans.

This new provision states more needs to be done to improve the services the government provides, whether it is citizens trying to use, taxpayers calling the IRS...

Developers: 6 Steps to Revamp Your Apps

By Chris Steel // April 21, 2016


Chris Steel is chief solutions architect for Software AG Government Solutions.

As applications age and gather more users, common issues with speed and reliability start to creep in. Users lose patience when slow response times become pervasive because requested application data must travel all the way from a back-end database or Hadoop cluster.

And when too many application users request data all at once, the back-end database can get bogged down, causing timeouts and leading to unwelcome reliability issues.

So, what’s an IT department to do?

Many are eagerly embracing in-memory computing for its low-latency access to terabytes of data.

Although these benefits are appealing, an application’s in-memory data can easily become inconsistent and unpredictable if not architected properly. When moving from a disk-based to a memory-based application architecture, consider these six areas:

Predictable, Extremely Low Latency

Working with data in memory is orders of magnitude faster than moving it over a network or getting it from a disk. This speed advantage is critical for real-time data processing at the scale of big data.
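The speed gap described above is easy to demonstrate with a read-through cache: the first request for a key pays the back-end cost, and every later request is served from memory. The sketch below is a minimal illustration in Python; the `fetch_from_backend` function and its simulated latency are invented stand-ins for a real database or Hadoop read, not part of any product discussed in the article.

```python
import time

def fetch_from_backend(key):
    """Hypothetical slow lookup standing in for a database or Hadoop read."""
    time.sleep(0.05)  # simulated network/disk latency
    return f"record-for-{key}"

class ReadThroughCache:
    """Minimal in-memory read-through cache (illustrative only)."""
    def __init__(self, loader):
        self._loader = loader
        self._store = {}

    def get(self, key):
        if key not in self._store:           # miss: go to the back end once
            self._store[key] = self._loader(key)
        return self._store[key]              # hit: served straight from RAM

cache = ReadThroughCache(fetch_from_backend)

start = time.perf_counter()
first = cache.get("user:42")    # cold read, pays back-end latency
cold = time.perf_counter() - start

start = time.perf_counter()
second = cache.get("user:42")   # warm read, memory only
warm = time.perf_counter() - start

assert first == second          # same data either way
assert warm < cold              # but the in-memory read is far faster
```

A production-grade in-memory store must also handle the consistency and garbage-collection concerns the article raises; this sketch deliberately ignores eviction, expiry and concurrent writers.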

However, Java garbage collection is an Achilles’ heel when it comes to using large amounts of...

Why the Nation Needs to Use Technology to Up the Ante on Immigration Screening

By Ron Collins // April 19, 2016

Immigrants from El Salvador and Guatemala who entered the country illegally board a bus after they were released from a family detention center in San Antonio. // Eric Gay/AP File Photo

Ron Collins is CFO and COO at Exiger, a global regulatory and financial crime, risk and compliance firm.

One of the great, ironic twists of 21st-century democracy is that U.S. banks and other financial institutions now have far more sophisticated screening technology in place to vet their customers than the U.S. government has to screen immigrants moving into the country. It doesn’t need to be this way.

Thanks to the last decade of financial reforms and intense government scrutiny of fraud, money laundering and terror financing in the financial system, U.S. banking institutions have developed smart technology for identifying red flags in their relationships with partners, vendors and customers around the globe. 

The push to comply with legal requirements like the Foreign Corrupt Practices Act and Know Your Customer requirements has driven a corresponding leap forward in new screening technology development. With today’s artificial intelligence tools, it is now relatively inexpensive and efficient to cross-reference disparate data sets to identify complex relationships and affiliations that could signal a bad risk.
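At its simplest, cross-referencing disparate data sets means normalizing identifiers from each source and looking for overlaps. The sketch below is a deliberately simple Python illustration of that idea, far cruder than the AI tools the article describes; all names, records and the "sanctions-list" label are invented for the example.

```python
# Hypothetical source data; contents are invented for illustration.
customers = [
    {"name": "Acme Trading Ltd.", "country": "UK"},
    {"name": "Blue River Logistics", "country": "US"},
]
watchlist = [
    {"name": "ACME TRADING LTD", "source": "sanctions-list"},
]

def normalize(name):
    """Strip punctuation and case so superficially different spellings match."""
    return "".join(ch for ch in name.upper() if ch.isalnum() or ch.isspace()).strip()

# Index one data set by normalized name, then probe it with the other.
watch_index = {normalize(w["name"]): w for w in watchlist}

flags = [
    (c["name"], watch_index[normalize(c["name"])]["source"])
    for c in customers
    if normalize(c["name"]) in watch_index
]
# flags now pairs each matched customer with the list that flagged it
```

Real screening systems add fuzzy matching, transliteration and relationship graphs on top of this kind of join, which is where the artificial intelligence tools mentioned above come in.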

Despite catalyzing this growth through an onslaught of regulation, however, the U.S. government has not used any of this technology to its own advantage...

Turning Damage Control into Digital Modernization

By Jeffrey Neal and Peter Wilson // April 1, 2016


Jeff Neal is a senior vice president for ICF International and founder of the blog. Before coming to ICF, Neal was the chief human capital officer at the Homeland Security Department and the chief human resources officer at the Defense Logistics Agency.

Peter Wilson has more than 20 years of consulting experience with the U.S. federal government, as well as with healthcare, Fortune 500 and nonprofit organizations. He provides public- and private-sector thought leadership in technology, program and project management for ICF International. Pete was named a 2014 "rising star" by Federal Computer Week magazine.

Each year, the federal government spends approximately $37 billion to maintain its existing IT portfolio, and each year the costs of maintaining those systems and defending them against cyberthreats continue to rise.

President Obama’s recently published 2017 budget puts forward a $3.1 billion IT Modernization Fund to help “retire, replace, or modernize the federal government’s most at-risk legacy IT systems.”

The purpose of the fund is to help stimulate modernization of systems that are both high priority and high risk… and federal IT systems are at risk. Federal News Radio’s Jason Miller reported the draft policy was circulating among civilian agencies...