
ARCHIVES

Developers: 6 Steps to Revamp Your Apps

By Chris Steel // April 21, 2016


Chris Steel is chief solutions architect for Software AG Government Solutions.

As applications age and gather more users, common issues with speed and reliability start to creep in. User frustration with slow application response times can become pervasive when the requested data has to travel all the way from a back-end database or Hadoop cluster.

And when too many application users request data all at once, the back-end database can get bogged down, causing timeouts and leading to unwelcome reliability issues.

So, what’s an IT department to do?

Many are eagerly embracing in-memory computing for its low-latency access to terabytes of data.

Although these features are appealing, an application’s in-memory data can easily become inconsistent and unpredictable if it is not architected properly. When moving from a disk-based to a memory-based application architecture, consider these six areas:

Predictable, Extremely Low Latency

Working with data in memory is orders of magnitude faster than moving it over a network or getting it from a disk. This speed advantage is critical for real-time data processing at the scale of big data.

However, Java garbage collection is an Achilles’ heel when it comes to using large amounts of...
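For context on why the heap becomes the bottleneck: data kept on the normal Java heap is walked by the garbage collector, so very large caches can trigger long pauses. Below is a minimal, illustrative sketch, not the author's prescribed approach, of one common workaround: storing fixed-size records off-heap in a direct ByteBuffer so the collector never has to traverse them. The class and record layout are invented for illustration; production in-memory data grids manage off-heap memory for you.

    import java.nio.ByteBuffer;

    // Minimal sketch: fixed-size records stored outside the Java heap.
    // Direct buffers live in native memory, so their contents are not
    // scanned or copied during garbage collection.
    public class OffHeapStoreSketch {
        private static final int RECORD_SIZE = Long.BYTES + Double.BYTES; // id + value
        private final ByteBuffer buffer;

        public OffHeapStoreSketch(int maxRecords) {
            // allocateDirect reserves memory outside the normal Java heap
            this.buffer = ByteBuffer.allocateDirect(maxRecords * RECORD_SIZE);
        }

        public void put(int index, long id, double value) {
            int offset = index * RECORD_SIZE;
            buffer.putLong(offset, id);
            buffer.putDouble(offset + Long.BYTES, value);
        }

        public double getValue(int index) {
            return buffer.getDouble(index * RECORD_SIZE + Long.BYTES);
        }

        public static void main(String[] args) {
            OffHeapStoreSketch store = new OffHeapStoreSketch(1_000_000);
            store.put(0, 42L, 3.14);
            System.out.println(store.getValue(0)); // prints 3.14
        }
    }

The trade-off is that off-heap data must be serialized into and out of raw bytes, which is why this technique is usually left to an in-memory data grid or cache library rather than hand-rolled.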

Why the Nation Needs to Use Technology to Up the Ante on Immigration Screening

By Ron Collins // April 19, 2016

Immigrants from El Salvador and Guatemala who entered the country illegally board a bus after they were released from a family detention center in San Antonio. // Eric Gay/AP File Photo

Ron Collins is CFO and COO at Exiger, a global regulatory and financial crime, risk and compliance firm.

One of the great, ironic twists of 21st-century democracy is that U.S. banks and other financial institutions now have far more sophisticated screening technology in place to vet their customers than the U.S. government has to screen immigrants moving into the country. It doesn’t need to be this way.

Thanks to the last decade of financial reforms and intense government scrutiny of fraud, money laundering and terror financing in the financial system, U.S. banking institutions have developed smart technology for identifying red flags in their relationships with partners, vendors and customers around the globe. 

The push to comply with legal requirements like the Foreign Corrupt Practices Act and Know Your Customer requirements has driven a corresponding leap forward in new screening technology development. With today’s artificial intelligence tools, it is now relatively inexpensive and efficient to cross-reference disparate data sets to identify complex relationships and affiliations that could signal a bad risk.
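To make the cross-referencing idea concrete, here is a deliberately simplified, hypothetical sketch: flag applicants whose declared associates also appear on a separate watch list. The names, the "declared associates" field, and the exact-match logic are invented for illustration only; they do not describe any real screening product, and the AI tools the author refers to perform far fuzzier, larger-scale matching than this set intersection.

    import java.util.*;

    // Hypothetical sketch of cross-referencing two data sets during screening.
    public class CrossReferenceSketch {

        // Returns applicant -> matching watch-list entries, for applicants with at least one hit.
        static Map<String, Set<String>> flagMatches(Map<String, Set<String>> applicantAssociates,
                                                    Set<String> watchList) {
            Map<String, Set<String>> flagged = new LinkedHashMap<>();
            for (Map.Entry<String, Set<String>> entry : applicantAssociates.entrySet()) {
                Set<String> hits = new TreeSet<>(entry.getValue());
                hits.retainAll(watchList); // set intersection = shared affiliations
                if (!hits.isEmpty()) {
                    flagged.put(entry.getKey(), hits);
                }
            }
            return flagged;
        }

        public static void main(String[] args) {
            Map<String, Set<String>> applicants = new LinkedHashMap<>();
            applicants.put("Applicant A", new HashSet<>(Arrays.asList("Shell Co X", "Vendor Y")));
            applicants.put("Applicant B", new HashSet<>(Arrays.asList("Vendor Z")));
            Set<String> watchList = new HashSet<>(Collections.singletonList("Shell Co X"));
            System.out.println(flagMatches(applicants, watchList)); // {Applicant A=[Shell Co X]}
        }
    }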

Despite catalyzing this growth through an onslaught of regulation, the U.S. government has not used any of this technology to its own advantage...

Turning Damage Control into Digital Modernization

By Jeffrey Neal and Peter Wilson // April 1, 2016


Jeff Neal is a senior vice president for ICF International and founder of the blog, ChiefHRO.com. Before coming to ICF, Neal was the chief human capital officer at the Homeland Security Department and the chief human resources officer at the Defense Logistics Agency.

Peter Wilson has more than 20 years of consulting experience with the U.S. federal government, as well as with healthcare organizations, Fortune 500 companies, and nonprofits. He provides public- and private-sector thought leadership in technology, program and project management for ICF International. Pete was named a 2014 "rising star" by Federal Computer Week magazine.

Each year, the federal government spends approximately $37 billion to maintain its existing IT portfolio, and each year the costs of maintaining those systems and defending them against cyberthreats continue to increase.

President Obama’s recently published 2017 budget puts forward a $3.1 billion IT Modernization Fund to help “retire, replace, or modernize the federal government’s most at-risk legacy IT systems.”

The purpose of the fund is to help stimulate modernization of systems that are both high priority and high risk… and federal IT systems are at risk. Federal News Radio’s Jason Miller reported the draft policy was circulating among civilian agencies...

5 Tips to Avoid Getting Scammed This Tax Season

By Darren Guccione // March 29, 2016

The Internal Revenue Service headquarters building in Washington, DC. // J. David Ake/AP File Photo

Darren Guccione is CEO and co-founder of Keeper Security.

For many Americans, tax season can be quite stressful as the mid-April deadline quickly approaches. Adding to that stress is the fact that the Internal Revenue Service has had its own troubles this past year with keeping its data and systems secure. 

For me, personally, tax season is particularly intense because I am in the unique position of being both a cybersecurity technologist and a CPA. I know firsthand how important it is for tax information to be locked down and secure as we continue to see IRS scammers improving their game.

According to recent reports, tax scamming will cost the U.S. government $21 billion this year alone through fraudulent refunds and IRS agent impersonators. The IRS reports the number of IRS impersonation scams is up by 400 percent.

In addition, this comes at a time when the Obama administration has a microscope on federal cybersecurity in the wake of major federal breaches, including the one at the Office of Personnel Management. In February, the administration announced a $19 billion investment in cybersecurity and a $3.1 billion revolving fund to help replace the aging government systems most vulnerable to cyberattacks.

On top...

New OMB Data Center Effort Can Help Break Agency Silos

By Randy Boggess // March 22, 2016


Randy Boggess is head of cloud solutions marketing, global portfolio team at Unify.

The new White House policy for optimizing data centers, as covered by Nextgov earlier this month, has for the most part been well received by government and industry. The Office of Management and Budget Data Center Optimization Initiative supersedes the Federal Data Center Consolidation Initiative launched by the OMB in 2010, and is designed to advance efforts beyond the physical closing of data centers to IT optimization.

The OMB draft policy also seeks to provide a framework for achieving data center consolidation and optimization by bringing data center guidance in line with the Federal IT Acquisition Reform Act, better defining the organizational reporting structure between bureau and agency level chief information officers, and providing metrics for data center optimization. 

Among other things, the policy would require agencies to develop annual data center consolidation plans and emphasize a cloud-first and shared services approach. More specifically, the OMB policy lines up favorably with ongoing agency data center consolidation efforts by providing a formal structure for reporting improvements in:

Cost Savings: Data center consolidation (fewer pieces of real estate); maximum server utilization (better use of resources); and improved energy efficiency...