Logins are Hard

by Michael Szul


The telecom data of 54 million (yes, million) people, including birth dates and Social Security numbers, was recently stolen when T-Mobile was hacked. Additionally:

In an update issued Friday, the mobile carrier reported that hackers had illegally accessed one or more associated customer names, addresses, dates of birth, phone numbers, IMEIs and IMSIs of 5.3 million current postpaid customers. T-Mobile also said it had identified an additional 667,000 accounts of former customers that were accessed, with customer names, phone numbers, addresses and dates of birth compromised.

The breach, which was first reported Sunday, is one of at least four to hit the mobile carrier since 2015. On this occasion, Vice reported that a seller on an underground forum was offering to sell the customer data for 6 Bitcoin (approximately $277,000). T-Mobile later confirmed it had been the victim of a cyberattack, and has now been investigating how many customers were affected for several days.[1]

How could this happen, you might ask? Aren't there smart people working in IT at these companies to prevent such things?

With technology advancing so profoundly over the last few decades, security has become, and continues to be, an ever more important issue for every business, project, or hobby (even home automation) trying to take its offerings to the next level. Unfortunately, for businesses relying on consulting companies to handle their technology needs, security becomes an even bigger issue: you're placing a core component of your business in the hands of a group unfamiliar with your company's core competencies. Even with the universal necessity of security, outsourcing the core of your business can open greater security holes as outside groups lose insight into the mission and strategy behind those competencies.

Bear in mind that sometimes this doesn't even require outsourcing. In businesses where internal IT is seen as a service instead of a part of the business, you can witness much of the same behavior and side effects.

Putting your trust in an external team to ensure that your technology is secure is a twofold application of trust. You're not only assuming that the group can provide you with said security, but also that it will future-proof your technology against employee-created errors down the road. Many consulting companies do the former, but very few do the latter, and when you're outsourcing one of the core needs of your business, you can't guarantee that the person placed in charge of your product's, project's, or business's security isn't the junior developer who drew the short straw.

How bad can it be? I was once part of a team building web applications for a travel company, and shortly after joining, I was enmeshed in a forensic review: the client's web application (one of their largest sources of revenue) had been hacked on three separate occasions, with the home page replaced by various versions of a hacker manifesto. On the last occasion, the home page was replaced by a giant red eye and lots of copy about the punishment of American infidels (the offenders were actually American, but masquerading as Turkish hackers).

Upon reviewing the servers, it turned out that these "hackers" had managed to insert several HTML files into every directory of the web site, taking advantage of the web server's default document setting: the list of file names the server will serve as a directory's index page. Always, always reduce that list to the one file that should actually be the index. As we investigated further, trying to determine how these files were placed there, we found no evidence of actual hacking through remote desktop, FTP, security vulnerabilities in the server, etc. Our first inclination was to assume that somebody's password had been stolen, but nobody at the client company had that kind of administrative access to the server. That access was only in the hands of the company I had just started working for.
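To make that advice concrete, here's a minimal sketch of the idea using Node and Express, which is my own choice of stack for illustration; the application in this story was classic ASP on IIS, where the equivalent fix is trimming the Default Document list down to a single known file. The directory name and port below are hypothetical.

```typescript
import express from "express";
import path from "path";

const app = express();

// Only "index.html" is eligible to be served as a directory index.
// Any other file an attacker manages to drop into a directory
// (default.htm, home.asp, and so on) will never be treated as a
// landing page implicitly.
app.use(
  express.static(path.join(__dirname, "public"), { index: "index.html" })
);

// Passing { index: false } instead would disable directory index
// files entirely, which is even safer for upload directories.

app.listen(3000);
```

The point isn't the specific framework: whatever server you run, the set of files it will silently serve as an index should be an explicit, minimal allowlist rather than the default grab bag.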

Photo from Superuser.com under CC

Further examination of each individual directory in the web server's folder structure revealed an errant ASP page. When accessed, this page sent information about the server back to the browser (and thus to the person accessing the page over the web). It had also generated the multitude of index pages that we had found earlier. We had found the culprit, and with it the security breach. As it turned out, the directory the ASP page resided in was also the upload directory that everyone in the company had access to (via a page in the administration section of the web site). I opened up a web browser, surfed over to the page in question, and discovered not only that the page could be accessed directly without logging in, but that it had a Google PageRank of 5 (this was back during the Google Toolbar days). Not only was the page being accessed directly by unwanted people, but it could also be found in a standard web search.

At first nobody knew just how the page had been indexed. Although the current developers of the application acknowledged that the page had no inherent login logic of its own, they argued that it was only supposed to be accessed from an admin section of the web site that already sat behind the login prompt. There was no way, they insisted, that Google could have indexed it.

As I continued the investigation, the reason behind the search engine indexing became clear. One of the pages on a separate web application owned by the company linked directly to the upload form instead of to the admin login page. That page's content was controlled by the company's marketing department. This struck a chord with the web developers on the team. They admitted the flaw in their login logic, but concluded that the only reason these security breaches had occurred was user error on the part of the company's employees. Had they not linked to a secure page from one of their marketing pages, this never would have happened.

Sigh.

That was their actual response.

User error isn't just the user's fault. When a company hires consultants to develop its technology, it's the job of the technology consultants not just to create a high quality user experience that prevents such behavior; how about also securing the damn application correctly to begin with? Creating a single page as a "login" that takes you to unsecured pages and relying on "thoughts and prayers" to keep Google crawlers at bay isn't exactly what anybody should expect of a consulting company pulling in a couple of million dollars worth of contracts a year. (This company is still in business, by the way.)
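The fix, of course, is to check authentication on every protected request on the server side, rather than trusting that users only ever arrive through the login page. Here's a hedged sketch of that pattern, again in Node and Express for illustration; the route paths and the requireAuth check are hypothetical, and a real application would validate a signed session cookie or token rather than the placeholder check shown here.

```typescript
import express from "express";

const app = express();

// Placeholder auth check: a real implementation would verify a
// signed session cookie or token, not just the header's presence.
function requireAuth(
  req: express.Request,
  res: express.Response,
  next: express.NextFunction
) {
  if (!req.headers.authorization) {
    // No credentials? Back to the login page, even if the visitor
    // followed a direct link (or a search engine result).
    res.redirect("/login");
    return;
  }
  next();
}

// Every route under /admin passes through the check, including the
// upload form that was left exposed in the story above. The login
// page is no longer the only gate.
app.use("/admin", requireAuth);

app.post("/admin/upload", (req, res) => {
  res.send("upload handled");
});

app.listen(3000);
```

With the check attached to the routes themselves, it no longer matters who links to the upload form, or from where: the server refuses unauthenticated requests regardless of how the visitor got there.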


This was neither the first nor the last time I heard people at a company I worked for treat clients as if they were a burden, often pointing fingers whenever there was a "user error." The reality is that there are always going to be user errors. There are always going to be ways in which software is used that run counter to the way it was intended; that's what spawned the whole usability movement to begin with. If a company were that technologically sound, it wouldn't have to hire a consulting company in the first place. Pointing fingers and placing blame shouldn't be a core skill set of any developer or company. Solving problems and designing to prevent future ones ought to be good enough.

At the time of this "breach" I was maybe six years into my career as a programmer. I still considered myself new to the trade. Maybe I was no longer a junior developer, but I certainly didn't consider myself a senior developer either. I was just some guy writing code. Everyone at this company had more "experience" than me. To me, it was inconceivable that this team of more experienced developers had accidentally created such a glaring security flaw…

Richard Campbell used to ask: do you have 12 years of experience, or do you have 12 iterations of one year of experience? Years sitting behind a keyboard don't always increase your expertise and critical thinking. Also, not all programmers received A's in school. Software might be software, but it isn't magic, and it is still created by very real people we know little about, often on unrealistic deadlines dictated by project managers or accountants.

As software continues to eat the world, security becomes paramount, and we're starting to exist in a state where everything is in someone else's cloud… and everything is already hacked.