The U.S. Government's Cybersecurity Is a Total Shitshow

Today, a report from the minority staff of the Senate Homeland Security and Governmental Affairs Committee offered an overview of the federal government's current state of cybersecurity. And how is the government with which we entrust our most sensitive and private information looking? In short—bad. Very, very bad.

It's no secret that the federal government isn't exactly what you might call competent when it comes to, well, anything having to do with technology. But according to the new report, the full extent to which we really have no idea what the hell we're doing is more than a little concerning.

According to The Washington Post:

The report draws on previous work by agency inspectors general and the Government Accountability Office to paint a broader picture of chronic dysfunction, citing repeated failures by federal officials to perform the unglamorous work of information security. That includes installing security patches, updating anti-virus software, communicating on secure networks and requiring strong passwords. A common password on federal systems, the report found, is "password."

So just how bad is it? We've picked out some of the more troubling revelations here, but you can read the report in its entirety down below. Brace yourself—it ain't pretty.

1. Shitty passwords

Over at the Department of Homeland Security, FEMA's Enterprise Data Warehouse boasts "accounts protected by 'default' passwords, and improperly configured password controls."

The IRS isn't doing much better, either:

In March 2013, GAO [Government Accountability Office] reported that IRS allowed its employees to use passwords that "could be easily guessed." Examples of easily-guessed passwords are a person's username or real name, the word "password," the agency's name, or simple keyboard patterns (e.g., "qwerty"), according to the National Institute of Standards and Technology.

This isn't exactly a new revelation. The GAO has cited the IRS for allowing old, weak passwords in every one of its reports over the past six years.
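For the curious, the NIST criteria quoted above are straightforward to turn into an automated check. Here's a minimal sketch, assuming the rules described in the report (reject a username or real name, the word "password," the agency's name, and simple keyboard patterns); the pattern list and function name are illustrative, not from any standard:

```python
# Flag passwords NIST would call "easily guessed": a person's username
# or real name, the literal word "password", the agency's name, or a
# simple keyboard pattern like "qwerty". Illustrative sketch only.

KEYBOARD_PATTERNS = {"qwerty", "asdfgh", "zxcvbn", "123456", "1q2w3e"}

def is_easily_guessed(password, username="", real_name="", agency=""):
    p = password.lower()
    banned = {username.lower(), real_name.lower(), agency.lower(), "password"}
    banned.discard("")  # ignore fields that weren't supplied
    if p in banned or p in KEYBOARD_PATTERNS:
        return True
    # Also catch trivial variants like "Password1" or "qwerty123".
    stripped = p.rstrip("0123456789!")
    return stripped in banned or stripped in KEYBOARD_PATTERNS

print(is_easily_guessed("Password1"))                     # True
print(is_easily_guessed("qwerty123"))                     # True
print(is_easily_guessed("jsmith", username="jsmith"))     # True
print(is_easily_guessed("correct horse battery staple"))  # False
```

A check this simple would have caught every example the GAO cited—which is rather the point.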

2. Physically writing down those passwords on furniture

Particularly painful is the Department of Homeland Security's mishandling—to put it lightly—of sensitive information:

Independent auditors physically inspected offices and found passwords written down on desks, sensitive information left exposed, unlocked laptops, even credit card information. To take just one example, weaknesses found in the office of the Chief Information Officer for ICE included 10 passwords written down, 15 FOUO (For Official Use Only) documents left out, three keys, six unlocked laptops — even two credit cards left out.

3. Out-of-date antivirus software

Twelve of the 14 computers that controlled physical access to the Department of Homeland Security had "anti-virus definitions most recently updated in August 2011."

4. Government employees "going rogue" to avoid inept IT guys

Apparently, the Nuclear Regulatory Commission's IT department has such a "perceived ineptitude" that "NRC offices have effectively gone rogue – by buying and deploying their own computers and networks without the knowledge or involvement of the department's so-called IT experts." And these independent systems can, of course, make the problem even worse when officials don't actually know what they're running, leading to...

5. Inability to keep track of computers

Since employees are avoiding the IT department by running their own hardware, the Nuclear Regulatory Commission can't keep track of which laptops are accessing sensitive information.

6. Failure to encrypt sensitive data

Surely all that financial data of yours running through IRS computers is in safe hands, right? Apparently not! According to the report, the IRS either routinely fails to encrypt its data or does such a horrible job of it that it can be easily decrypted. Which, as we unfortunately know, hackers have no problem doing.

7. Refusing to install crucial software updates and patches

In March 2012, the IRS found that it had around 7,300 "potential vulnerabilities on its computers." In 2011, about a third of all computers at the IRS were carrying software with unpatched, critical vulnerabilities. The IRS said it would have all the patches installed in 72 hours—it actually took about 55 days.

As recently as last September, the IRS still had yet to implement "a process to ensure timely and secure installation of software patches," according to the Treasury Inspector General for Tax Administration.

8. Minimal, if any, server protection

If you're looking to access the Department of Education's computer system—which "holds and manages $948 billion in student loans made to more than 30 million borrowers"—you won't find too much standing in your way. In 2011, 2012, and 2013, auditors were able to connect a "rogue" computer to the Education Department's network without being detected, no hacking necessary. What's more, in 2013, the same test gave auditors access to sensitive data "stored in the department's networked printers."

So yeah, it's pretty grim out there, made all the worse by recent revelations of just how much of your information the government has on hand. Check below to read the full, sobering account:

Fed Cyber Report - Feb 4 2014

[The Washington Post]

Image: Shutterstock/LoloStock