News Archives

Collaborate or Perish: What the Internet of Everything Teaches Us About Public-Private Partnerships

By David Bray // March 9, 2015

David A. Bray, chief information officer at the Federal Communications Commission, will soon return from a five-week Eisenhower Fellowship overseas, where he is traveling in a personal capacity to meet with industry and government leaders from Taiwan and Australia regarding cyber strategies for the Internet of Everything. The views expressed here are strictly his own, offered in his personal Eisenhower Fellowship capacity.

The pace of global technology change is accelerating, and with it all of us will face opportunities and challenges that span sectors at a similarly accelerating rate.

The future of the U.S. and of the world will require collaborations across sectors. For democratic nations, such collaborations will require adapting how we do public service. Public service includes us all: individual members of the public at the top of a triangle, with the private and public sectors as its base. If we choose to, we can pursue new collaborations across sectors to produce a future with more beneficial choices, options and freedoms for everyone.

It is solely as a member of the public that I’m writing this post, as I have been fortunate to have an opportunity to step outside of my day-to-day professional role and spend five weeks ...

Former WH Deputy CIO Lands at Fortune 500 Medical Device Firm

By Nextgov Staff // March 2, 2015

The former deputy chief information officer of the White House executive office is joining medical device giant Stryker as the company’s chief information security officer.

Alissa Johnson had served as deputy CIO for the Office of Administration in the White House’s Executive Office of the President since March 2012.

Stryker Corp. is a Kalamazoo, Michigan-based Fortune 500 medical technology company with $9 billion in revenue last year. In her new role, Johnson will be responsible for overseeing the company’s information security efforts.

Johnson is a National Security Agency-certified cryptologic engineer who previously served in positions at defense and intelligence agencies.

“It will be great to be able to transfer those skills to health care -- an industry that has its own set of cyber challenges,” she said in an email to Nextgov.

Johnson’s former boss at the White House, Karen Britton, left her post in January, joining eManagement, a small IT firm based in Silver Spring, Maryland.  

At the time, Johnson said she too was planning on “transitioning out” as the executive office’s No. 2 IT official.

Unclassified networks at the White House fell victim to a breach last fall. Administration officials later revealed the ...

Supercomputing: Key to Discovery and Competitiveness

By Jane Snowdon // February 24, 2015

The Dutch petascale national supercomputer // Flickr user Dennis van Zuijlekom

Dr. Jane L. Snowdon is chief innovation officer at IBM U.S. Federal Government.

Countless industries have benefited from advancements in computing technology – from manufacturing, financial services, biotechnology, education and entertainment, to the government and the military. This ongoing push for innovation has led to significant increases in both industrial productivity and efficiency.

In fact, innovations stemming from supercomputing can be found in nearly every facet of our lives. From oil discovery and energy efficiency, to climate modeling, to the introduction of new drugs, to the design of cars and planes – supercomputers have helped displace costlier, less-efficient models – thanks in large part to funding from the federal government.

According to the U.S. Council on Competitiveness’ October 2014 report, “The Exascale Effect: The Benefits of Supercomputing Investment for U.S. Industry,” supercomputing is viewed as a cost-effective tool for accelerating the R&D process, and two-thirds of all U.S.-based companies that use supercomputing say “increasing performance of computational models is a matter of competitive survival.”

While engineers and scientists in the commercial computing space struggled with massive data files in the 1990s, supercomputers were being designed to work on files ten thousand times larger. Since then, many of the ...

Moving Beyond the PDF Era: 3 Reasons to Embrace Financial Transparency

By Ari Hoffnung // February 20, 2015

Ari Hoffnung is former NYC deputy comptroller and senior adviser to Socrata.

President Obama’s recently released fiscal year 2016 budget made history, not because of its size – $4 trillion – and not because of the spending priorities or tax proposals it outlined, but because it strongly reflected the values of data-driven government as our nation’s first easily accessible federal budget with “machine-readable” information.

This approach helps bridge the gap between documents, which are typically static and frozen in their format, and data, which may be dynamic and can be open to further processing.
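
To make that concrete, here is a minimal, illustrative sketch of what “open to further processing” can look like in practice: a few lines of Java that roll up outlays by agency from a machine-readable budget file. The file name (budget_2016.csv) and its columns (agency, account, outlays_usd) are hypothetical stand-ins for this example, not the actual format of the federal budget data.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

// Illustrative only: assumes a hypothetical machine-readable export named
// "budget_2016.csv" with columns agency,account,outlays_usd.
public class BudgetRollup {
    public static void main(String[] args) throws IOException {
        Map<String, Long> outlaysByAgency = new HashMap<>();
        for (String line : Files.readAllLines(Paths.get("budget_2016.csv"))) {
            if (line.startsWith("agency,")) continue;   // skip the header row
            String[] cols = line.split(",");
            // Accumulate outlays per agency.
            outlaysByAgency.merge(cols[0], Long.parseLong(cols[2].trim()), Long::sum);
        }
        outlaysByAgency.forEach((agency, total) ->
                System.out.printf("%s: $%,d%n", agency, total));
    }
}

The point is not the particular program, but that a static PDF cannot be queried or recombined this way, while machine-readable data can.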

Indeed, the White House’s commitment to open data and financial transparency allows taxpayers to see exactly how their money is being spent, where it’s going, why it’s being allocated and the extent to which the president’s fiscal policies, plans and initiatives will impact Americans.

This form of digital democracy is now spreading across the nation, thanks to governments of all levels and sizes – small and large cities, dense and sparse counties, and the most and least populous states. In Democratic and Republican jurisdictions alike, there is strong bipartisan support for financial transparency, whether it’s in Stutsman County, North Dakota, which has a $23 million ...

In-Memory Computing: What Happens When the Power Goes Out?

By Chris Steel // February 18, 2015

Chris Steel is chief solutions architect for Software AG Government Solutions, Inc.

While agencies are eagerly looking for new opportunities for performance improvements and cost reductions, the question of using in-memory computing has definitely shifted from “if” to “when.”

But when faced with big data-sized, mission-critical applications, new considerations must be addressed when leveraging in-memory computing. One key component stands out: persistence.

Persistence, also known as a “fast restartability store,” is quickly becoming essential for the 24x7 requirements of mission-critical applications and big data projects.

In-memory computing is a relatively new trend, though the concept is well understood and is as old as the dawn of computers: Accessing data from memory is a lot faster than accessing it from disk or over a network.

In-memory computing uses large amounts of RAM to store as much of an application’s data as possible in memory, thereby increasing application performance and cutting cost by reducing the need to scale horizontally.

In the traditional database off-loading use case, static queries from the database are cached in memory on the application server. Subsequent requests for these queries can be returned very quickly, as they are already in memory at the ...
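
As a rough, vendor-neutral illustration of that off-loading pattern (not any particular product’s API), the sketch below caches the results of static queries in a map on the application server: the first request for a query invokes a hypothetical database loader, and every subsequent request for the same query is served straight from memory.

import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.function.Function;

// Illustrative read-through cache: a simplified sketch, not any vendor's product API.
public class QueryCache {
    private final ConcurrentMap<String, List<String>> cache = new ConcurrentHashMap<>();
    private final Function<String, List<String>> databaseLoader; // e.g. wraps a JDBC call

    public QueryCache(Function<String, List<String>> databaseLoader) {
        this.databaseLoader = databaseLoader;
    }

    // Cached results are returned immediately; a miss loads from the database once.
    public List<String> results(String query) {
        return cache.computeIfAbsent(query, databaseLoader);
    }

    public static void main(String[] args) {
        QueryCache cache = new QueryCache(q -> {
            System.out.println("Hitting the database for: " + q); // slow path, runs once
            return List.of("row1", "row2");                       // placeholder rows
        });
        cache.results("SELECT name FROM agencies"); // miss: invokes the loader
        cache.results("SELECT name FROM agencies"); // hit: served from memory
    }
}

Production-grade in-memory stores layer eviction policies, off-heap storage and the persistence discussed above on top of this basic idea.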