Nearly a quarter of dot-gov domains don't work, analysis finds

An unofficial analysis of the roughly 1,800 top-level federal Web domains shows nearly a quarter of them are now unreachable.

That may mean those sites have been shut down or that their content has been consolidated into larger sites in accordance with a White House plan to drastically cut the federal Web presence over the coming year, said Benjamin Balter, a new technology fellow at the Federal Communications Commission and graduate student at The George Washington University who designed the analysis tool as a personal project.

In other cases, it could mean the sites were temporarily down during the search or that the collection system made an error, Balter said. He stressed that his analysis tool, which also gathers information about sites' content management systems and whether they support up-to-date Internet protocols, has not been rigorously tested and that the results are by no means 100 percent accurate.
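One common way a scanning tool like this fingerprints a site's content management system is the page's `<meta name="generator">` tag. A minimal sketch of that idea in Python; this is a hypothetical illustration, not Balter's actual code, and real fingerprinting checks many more signals (headers, file paths, cookies):

```python
import re

def detect_cms(html):
    """Guess the CMS from a page's <meta name="generator"> tag.

    Returns the tag's content string (e.g. "Drupal 7") or None.
    Assumes the name attribute precedes content; a real tool would
    handle either order and other fingerprints as well.
    """
    m = re.search(
        r'<meta[^>]+name=["\']generator["\'][^>]+content=["\']([^"\']+)',
        html,
        re.IGNORECASE,
    )
    return m.group(1) if m else None
```

Sites with no generator tag, or a custom-built system, would simply return `None` here, which is consistent with Balter finding no detectable CMS on about 70 percent of domains.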

"This [data set] is nothing to bet the house on," he said, "but, it should give a good general picture of where things are."

Agencies have acknowledged shuttering some sites as part of the website cutting initiative, but so far there's been no central list of closed or consolidated government sites.

Balter said he developed the Web analysis tool out of "idle curiosity," and that when he saw the list of top-level dot-gov domains, it was a "no brainer" to try it out.

Though far from an official endorsement of his findings, White House New Media Director Macon Phillips tweeted Balter's results approvingly.

About 70 percent of the dot-gov domains have no detectable content management system or may be using a custom-built system, according to Balter's analysis, and more than one-tenth of the sites are unreachable without typing the sites' "www" prefix into the address bar.

For simple, run-of-the-mill websites, requiring the "www" prefix typically means the sites are extremely old or unsophisticated. For some higher traffic sites, it may mean the sites are a complex mesh of public and private information, which makes modernizing the addresses more complicated, Balter said.
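The reachability tallies described above amount to a small classification routine over probe results. The sketch below is a hypothetical illustration, not Balter's tool; the `responds` mapping stands in for the outcome of actual HTTP requests to each hostname:

```python
def summarize(domains, responds):
    """Tally reachability for a list of bare .gov domains.

    `responds` maps hostname -> bool (did an HTTP probe succeed).
    A domain counts as "www-only" when only its www-prefixed form
    answers, and "unreachable" when neither form does.
    """
    counts = {"reachable": 0, "www-only": 0, "unreachable": 0}
    for domain in domains:
        bare_ok = responds.get(domain, False)
        www_ok = responds.get("www." + domain, False)
        if bare_ok:
            counts["reachable"] += 1
        elif www_ok:
            counts["www-only"] += 1
        else:
            counts["unreachable"] += 1
    return counts
```

As the article notes, an "unreachable" result here is ambiguous: it could mean a shuttered site, a consolidated site, a temporary outage, or a probe error.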

Balter's data show only a handful of the sites are hosted in a public cloud service, which allows website owners to save money on storage and to more easily handle surges in use.

About 200 of the sites use some form of Web analytics to track which sections of the site are most popular, the data show.

Only nine of the sites appear to be fully compliant with Internet protocol version 6, or IPv6, standards for how sites transfer information, according to the data. The federal government has been working to transition to IPv6 for more than half a decade. Agencies are required to have fully transitioned public-facing content on their sites by the end of 2012 and to have moved internal content by 2014.
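A rough proxy for a site's IPv6 readiness is whether its hostname resolves to any IPv6 (AAAA) address at all. The sketch below uses only Python's standard library; it is an assumption-laden simplification, since full federal IPv6 compliance involves far more than DNS resolution:

```python
import socket

def has_ipv6(host):
    """Return True if `host` resolves to at least one IPv6 address.

    A crude readiness check: a site cannot serve native IPv6
    traffic without an AAAA record, but having one does not by
    itself make the site fully IPv6 compliant.
    """
    try:
        infos = socket.getaddrinfo(host, 80, socket.AF_INET6)
        return len(infos) > 0
    except socket.gaierror:
        return False
```

Running such a check across the full dot-gov list would give a quick, if coarse, picture of transition progress ahead of the 2012 deadline.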

The federal website cutting initiative is aimed at saving money by consolidating sites into a few uniform architectures and content management systems, and at raising the overall standards of government websites, which are often disorganized, slow and clunky. The project is modeled, in part, on an ongoing British government effort that has cut or consolidated about 75 percent of the country's 2,000 websites over five years.
