Data Breach Incidents

Don't assume a breach has happened before an investigation has been done.

Editor's Note: While reports of university network hacking are becoming more prevalent, the reality is not always what it seems. In the past year, a number of schools have determined that hacker-related incidents they thought had happened in fact had not. What follows is one such incident, a cautionary tale in which careful investigation ultimately saved untold time and expense. The names, as they say, have been changed to protect the innocent.

WHEN THE NEWS CAME from the IT team to the university's general counsel, it was devastating. A computer worm had been identified in one of the university's financial servers, and apparently the worm had discovered more than a quarter million credit card records. Even worse, the team had determined that the records were not encrypted, despite the claims made by the software vendor.

As a major university with students from almost every state, the university understood that the potential cost of issuing notifications under 40 different data loss notification laws and providing basic remediation services could easily (and almost immediately) exceed $1 million.

But rather than panic, university officials had a plan. With wisdom gleaned from earlier incidents, they understood that fast, sure-footed action was needed and knew that a number of organizations would have to be involved.

Within hours of learning of the incident, a conference call was convened, chaired by the general counsel (who was involved in order to deal with the complicated reporting laws, as well as potential individual or class-action litigation). The call included senior management, the CFO's representative, several IT representatives, and the university's public relations team. The group then reached out to computer forensic specialists to serve as an independent source of technical assistance and, if needed, to provide support for the university's remediation efforts.

A computer forensic investigation was launched immediately to determine, based on objective evidence, exactly what had happened and to understand why unencrypted credit card data was on the server. At the same time, a task group began to plan how to carry out notification rapidly and effectively, should it be required.

The technical investigation involved working closely with the university's technical staff to gather the evidence from various machines. The key was to gather the data in a forensically sound way. That is, the various files, logs, and other items had to be collected in a way that would permit them to be entered into evidence in any future court proceedings. If forensic accuracy were lost or the chain of custody broken, the evidence needed to defend the university's interests could be rendered useless.
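
By way of illustration only (this is not the university's actual procedure), the sketch below shows one common element of forensically sound collection: computing and recording a cryptographic hash of each item as it is gathered, along with when and by whom, so that its integrity, and the chain of custody, can be demonstrated later. The file names and the "collected by" value are hypothetical.

```python
# Illustrative sketch only: record a cryptographic hash of each evidence file
# as it is collected, plus when and by whom, in a simple manifest. File names
# and the collector identity below are hypothetical.
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Hash the file in chunks so large evidence files do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()


def record_collection(evidence_paths, manifest_path, collected_by):
    """Write an evidence manifest: what was taken, its size, its hash, when, and by whom."""
    with open(manifest_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["file", "size_bytes", "sha256", "collected_utc", "collected_by"])
        for p in map(Path, evidence_paths):
            writer.writerow([
                str(p),
                p.stat().st_size,
                sha256_of(p),
                datetime.now(timezone.utc).isoformat(),
                collected_by,
            ])


if __name__ == "__main__":
    # Hypothetical example: hash two log files and record them in a manifest.
    record_collection(
        ["proxy_access.log", "financial_server_app.log"],
        "evidence_manifest.csv",
        collected_by="IT staff member / forensic examiner",
    )
```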

Some of the data collection was done by the university's IT staff. For other particularly sensitive data, the original hard drives were sent to one of our laboratories for analysis, allowing day-to-day processing to continue on a backup server.

Information collected included the contents of the file server, which housed financial data, as well as internet, proxy, and other log files. Because of the type of routers and network infrastructure in place, investigators were also able to capture what is called "NetFlow" data, a very detailed record of the network traffic moving among the university's systems.

Once the data reached the computer forensic lab, engineers started piecing it all together and time-correlating the entries in the various logs (a simple sketch of that correlation step follows the list of findings below). Within hours, the forensic engineering team saw a pattern emerge, one that revealed the following:

- The worm entered the system when someone ran a web browser on the financial server and surfed to a website that had an infected page. Accessing the page allowed the worm to download itself onto the server.

- The worm exploited a security weakness that had not been fixed, even though a security patch was, in fact, readily available.

- Once the worm activated within the server, it sought and found what it thought were credit card records.

- The software in use by the university did, in fact, encrypt the credit card data in its main files, but it also maintained an unencrypted copy, unbeknownst to users of the system. Unfortunately, instead of being purged after a limited period of time, that copy contained every credit card transaction going back for years.

- Upon finding the records, the worm called out to a compromised server at an Asia-based university and downloaded additional code, known as a "payload." In this case, the payload was a credit card parsing engine, which is software specifically designed to find and prepare credit card data for transmission to hackers.

- The worm assembled the data from a quarter of a million credit card records and was prepared to send it out to the hackers when it encountered a problem. By delving into the extreme detail provided in the logs, investigators could see the worm's attempts to connect over the internet to a series of delivery addresses. Even better, they could tell that it was unable to establish a successful connection, due both to internal errors and to the fact that some of the delivery addresses were no longer in operation. As a result, the credit card data never actually left the university's server.
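
To make the time-correlation step concrete, here is a minimal, hypothetical sketch: it merges timestamped entries from several log sources into a single chronological timeline, which is essentially how proxy logs, server logs, and NetFlow records are lined up to reconstruct a sequence of events. The file names and the log format are assumptions, not the university's actual data.

```python
# Minimal, hypothetical sketch of log time-correlation: entries from several
# sources are merged into one chronological timeline so activity on different
# systems can be read as a single sequence of events. File names and the
# timestamp format are assumptions.
from datetime import datetime


def read_events(path, source):
    """Parse lines of the form '2011-03-04T10:15:30 message...' into (time, source, message)."""
    events = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            stamp, _, message = line.partition(" ")
            events.append((datetime.fromisoformat(stamp), source, message))
    return events


def build_timeline(sources):
    """sources maps a label to a log file path; returns all events in time order."""
    events = []
    for label, path in sources.items():
        events.extend(read_events(path, label))
    return sorted(events, key=lambda e: e[0])


if __name__ == "__main__":
    timeline = build_timeline({
        "proxy": "proxy_access.log",
        "server": "financial_server_app.log",
        "netflow": "netflow_export.log",
    })
    for when, source, message in timeline:
        print(f"{when.isoformat()}  [{source:7}]  {message}")
```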

While the forensic analysis was being performed, the university and specialists in data breach remediation began to discuss how victim notification would be handled if the forensics found that a breach had actually occurred.

A primary challenge was to navigate the states' varying notification requirements. Some have specific language parameters; others require multiple notices to designated agencies. The data breach remediation team worked with the university's counsel to determine which requirements would have to be met, and within what time frames. The logistics associated with validating cardholder information, identifying deceased cardholders, and producing and mailing hundreds of thousands of letters, along with the related administrative issues, were then built into a response action plan, which, fortunately, did not have to be used in this case.

Another issue to be considered was how best to manage questions that would surface from the notification. Experience from handling hundreds of data breach cases over the years indicates that people who receive notification of a breach often want to talk with someone personally. Fielding such a range of calls and reactions is not an undertaking for unprepared staff.

Individuals whose data has been compromised typically need reassurance in addition to information about the incident and the solution the college or university has chosen. Callers must be treated professionally, sensitively, and uniformly. Institutional leaders must be certain that promises are not made that conflict with the remediation plan and that callers are not given inaccurate information.

For this reason, many, if not most, organizations victimized by data breaches choose not to direct callers to their own phones. Rather, they opt for a dedicated customer care center, where trained professionals with a successful track record are ready to calm and clarify. When a care center specialist, through sensitive yet targeted discussion, determines that a caller may actually have been victimized, the call is immediately relayed to a licensed investigator. This triage enables the team to provide efficient and effective service to all callers.

Ultimately, the computer forensic team was able to advise the university team that, while there were issues to be addressed (unencrypted data, missing security patches, and inappropriate site surfing), the darker cloud had lifted. There was no need to notify either the 250,000 credit card holders or anyone else, since no breach had occurred. The university had dodged the bullet, although not by much.

The lessons learned in all these cases lead us to make a few recommendations:

- Don't assume that it can't happen to you. In today's environment, the reality is that it can, and it may.

- Put a plan in place so that when the crisis hits you won't be trying to figure out what to do. Part of this involves identifying the resources you will want available.

- Establish a working group that will have the authority to permit necessary and required work. From a technical viewpoint, you may want to have your pre-selected network forensic resources spend a little time with your technology staff to determine what logs should be kept, for how long, and how they should be secured if an incident is suspected.

Even in the best of circumstances, a potential data breach is traumatic. But with proper planning, the incident can be rapidly analyzed, understood, and worked through in a systematic manner.

Alan Brill is senior managing director of Kroll Ontrack, a firm that provides technology-driven services and software to help recover, search, analyze, and produce data efficiently and cost effectively. Jim Leonard is senior client executive for Kroll.

