Welcome to Black Hat 2014. Are you ready? Ready for the sessions? Ready for the parties? Ready for the unspeakable, stifling, death-from-above heat? Ready for the late-night alcohol-fueled conversations about what’s next in our industry?
I am. I’m ready. Well, I’m almost ready. It’s been a while since I’ve seen a wave of security community predictions, so let’s get a fresh one going. And I’ll back it up with a $1,000 Best Buy gift card awarded to one randomly selected prediction.
Here’s how to enter:
Here’s my entry (regrettably I’m not eligible):
My information security technology gizmo will fix your #APT problems immediately, completely, and forever. #SecurityFuturist
That’s it. Travel safe to Vegas, be safe while you’re there, and come visit our booth to say hello, chat about the threat landscape, steal a t-shirt and maybe offer your predictions in person.
In part 1 of this series I discussed the use of digital forensics as a legitimate science in court by expert witnesses. In this installment I will introduce some research to help drive positive change.
I believe that using scientific methods, of the kind the Supreme Court of the United States (SCOTUS) held to be minimally required in its Daubert decision, can lead to fairer, statistically grounded expert answers rather than the typical three conclusions of "Yes," "No," or "That's just how computers work." Introducing the scientific method and statistically based expert answers leaves room for a weight anywhere between a probability of 0 and 1. A judge or lay jury can use the weight from that statistical outcome to reach a more informed decision when comparing it against the weights of other tests in the case. Other forensic sciences, such as fingerprinting, drug testing, forensic pathology, and DNA testing, already use probabilities, so it would not be a great leap to apply them to our newer science. Furthermore, in a number of instances those methodologies even report a rate of error, which is discussed in the Daubert decision. I have yet to personally see anyone introduce numerical statistics on the record for digital forensics, let alone a rate of error. That needs to change.
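To make the idea of a statistically weighted answer concrete, here is a minimal sketch of how a tool-validation study could be reported as a rate of error with a confidence interval rather than a bare yes/no. The tool, the sample counts, and the study itself are all hypothetical; the interval calculation (a Wilson score interval) is a standard statistical technique, not something prescribed by Daubert:

```python
import math

def wilson_interval(errors, trials, z=1.96):
    """95% Wilson score interval for an observed error rate.

    More reliable than the naive normal approximation when the
    number of observed errors is small.
    """
    if trials == 0:
        raise ValueError("trials must be > 0")
    p = errors / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    margin = (z / denom) * math.sqrt(
        p * (1 - p) / trials + z**2 / (4 * trials**2)
    )
    return max(0.0, center - margin), min(1.0, center + margin)

# Hypothetical validation study: the tool misclassified 3 of 200 known samples.
low, high = wilson_interval(errors=3, trials=200)
print(f"Observed error rate: {3/200:.1%}")
print(f"95% confidence interval: {low:.1%} to {high:.1%}")
```

An answer phrased this way ("the tool's observed error rate was 1.5%, with a 95% confidence interval of roughly 0.5% to 4.3%") gives the trier of fact a weight to compare against other evidence, rather than a flat yes or no.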
This example reflects a very common question clients ask me on nearly any project, especially in data loss prevention scenarios, and it desperately needs to be researched using scientific methods. I wanted to know whether I could detect that computer media had been wiped and, if possible, which program was used to wipe it. Imagine a situation where an internal employee "went bad," copied information to a USB memory stick, and walked out with the company's crown jewels. The data could be worth billions of dollars, and the victim company would most likely litigate. If that same USB memory stick were turned over for examination, would it be possible to detect data wiping? You cannot see what programs were installed on the computer(s) the memory stick was plugged into (because the computer(s) were not produced); therefore you do not know whether a wiping program was present. There are many wiping tools available on the internet, and it would be nearly impossible to download every one and write a signature for each. That approach is akin to writing static signatures for every piece of malware, which is a nightmare.
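Though the research itself remains to be done, a first heuristic can be sketched: instead of signatures for every wiping tool, classify fixed-size blocks of the media image by their byte patterns. Large runs of all-zero or single-repeated-byte blocks are consistent with wiping, and the fill pattern chosen (0x00, 0xFF, a constant byte) can sometimes hint at a tool family. This is a hypothetical illustration to frame the research question, not a validated forensic method:

```python
from collections import Counter

BLOCK_SIZE = 4096

def classify_block(block: bytes) -> str:
    """Crude pattern classifier for one block of a media image."""
    distinct = set(block)
    if distinct == {0x00}:
        return "zero-fill"
    if distinct == {0xFF}:
        return "ones-fill"
    if len(distinct) == 1:
        return "constant-fill"
    return "data"

def wipe_profile(image: bytes) -> Counter:
    """Tally block classifications across an entire image."""
    tally = Counter()
    for offset in range(0, len(image), BLOCK_SIZE):
        tally[classify_block(image[offset:offset + BLOCK_SIZE])] += 1
    return tally

# Synthetic "memory stick" image: some ordinary data, then a wiped region.
image = b"file contents here..." * 400 + b"\x00" * (BLOCK_SIZE * 10)
print(wipe_profile(image))
```

A real study would need to run known wiping tools against known media, measure how often this kind of classifier flags wiped versus unwiped regions, and report that rate of error on the record.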
Posted by ThreatGeek at 10:00 AM in Computer Forensics, cybercrime, intelligent network forensics, Keith Jones, network analysis and visibility, network forensics, network forensics tools | Permalink | Comments (0)
Tags: computer forensics, expert witnesses, network forensics, scientific methodology
It’s 4 AM and the alarm reminds me that I’ve been awake for an hour or more after grabbing a couple hours of shuteye sometime after midnight. The realization of a 45-minute drive, with a stop off at the office to pick up equipment, breaks my intense contemplation of what it was like to have regular hours. I meet my fellow investigator at the office and we head to the train station. Some eight hours prior, we learned that a company had been hit with a very nasty malware attack sometime in the previous few days. The story was familiar, and my compadre and I could both relate the basic details without even hearing them.
A number of employees received a phishing email designed to look like benign financial communications. The emails contained attachments purporting to be financial documents and were designed to look as if they arrived from an innocuous sender. The emails were sophisticated and convincing enough to pass a quick skeptical glance, and several employees opened the attachments. As it turned out, "executed" was probably a better description of what was done with the attachments. One of the employees was using a workstation vulnerable to the malware attachments, and the malware did its job, rendering a large number of important documents inaccessible. The client decided to bring in outside consultants, which brought me to rolling out of the rack at 0400.
At 6:30 AM, the sun is finally up as the train rolls out of the station. My partner and I luck out and get a decent workspace at the front of a car. We decide to make the most of the ride by going over what we know about the incident. Luckily, my MiFi works relatively well on the train. The client’s description of the incident provides plenty of fodder for research. As we discuss the details, I have to keep the investigator in me at bay. My instinct is to drop into the client site and deep dive into technical details in an attempt to fully understand the malicious activity and associated malware. Instead, the more practical approach is to determine what questions to ask and prepare to listen to what the client describes, and perhaps more importantly, listen for the outcome the client desires. This preparation time is crucial as it is nigh on impossible for my team to anticipate every type of incident and have a checklist or process guide to follow in every case. This particular incident involved crimeware.
Posted by ThreatGeek at 06:30 AM in content inspection, cybercrime, data breach prevention, data breaches, data leakage prevention, data loss prevention, data theft prevention, David Gilbert, incident response, intelligent network forensics, intrusion detection, intrusion detection and prevention systems, intrusion prevention appliance, network forensics, network forensics tools, network monitoring, threat intelligence | Permalink | Comments (0)
Tags: dataheldhostage, incident response, intrusion investigation, network forensics
I have a friend who is an executive at a small-to-medium-sized company in the technology sector with a great deal of intellectual property invested in its software. They are very good at what they do and have an international presence. We have spoken at length about their IT security posture and the precautions they have taken to safeguard their network and software. The company is now in the process of being acquired. While most of the work through this acquisition is likely to focus on employee retention and payouts, this is a perilous time for both companies. As soon as they issue the inevitable press release, threat actors will start probing their networks (if they are not already there) looking for a way in.
We have seen time and again that during mergers and acquisitions, little to no thought is given to network security. Connectivity, email servers, file shares, and the like are all given priority over installing the proper safeguards. Given the great number of unknowns when bringing two companies together, it is imperative to slow down and ask some basic questions:
This is not a simple "get our IT guys together and figure it out" exercise. There are likely two different sets of policies and two organizational cultures coming together. It will take time to sort out these differences and unify the organizations, and that makes this the perfect time for both IT teams to identify security gaps in their networks.
Posted by ThreatGeek at 10:00 AM in advanced threats, content inspection, content monitoring, content protection, cyber threat security, data breach prevention, data leakage prevention, data loss prevention, data theft prevention, incident response, information protection, John Laycock, network defense, network monitoring, situational awareness | Permalink | Comments (0)
Tags: incident response, M&A, mergers and acquisitions, network defense, network perimeter, network security, network segmentation, patching, sql injection
In part 1 of this series I focused on the use of the scientific method in computer forensic expert testimony. Now I would like to show a real-life example of it in action. I have successfully used an example similar to this one in court as a digital forensics expert witness (the details have been changed to protect the parties involved). Theory is great, but seeing real results come alive before your eyes is far more useful and persuasive, so I hope it helps convince you that this is a topic we, the practitioners, voices, and leaders of this industry, must embrace and use with confidence.
In choosing a controversial subject in computer forensics where this type of analysis could help, I settled on determining the true time of events in a case: it is an aspect opposing parties usually dispute vigorously, which makes it the perfect choice for this article. If your expert opinion is that a document was signed at noon on a certain day, but the opposing party argues you do not have enough information to support that opinion, it may be the right time to think about the problem scientifically, using a scientific methodology, statistics, and reproducible results, as the Daubert decision requires.
In our example, imagine that the events happened many years prior, so the key computers no longer exist, and the computers that do exist have been heavily upgraded, losing all of the "smoking gun" information. Luckily, anticipating litigation, someone saved some key logs along with the documents that were to be physically signed by a human being. Of course, a full forensic duplication was not acquired (that would make our job too easy); instead, the relevant files were moved to a different piece of media, which destroyed the important file system metadata associated with the files while keeping the internal file content intact.
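That internal file content can still carry time information. For example, Office Open XML documents (.docx) are ZIP containers whose docProps/core.xml part records created and modified timestamps that travel with the file even when a copy destroys the file system metadata. The sketch below is illustrative only: the in-memory ZIP it builds is a stand-in for a real evidence file, and the timestamps in it are invented.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

NS = {
    "dcterms": "http://purl.org/dc/terms/",
}

def internal_timestamps(docx_bytes: bytes) -> dict:
    """Pull created/modified timestamps from a .docx's core properties.

    These live inside the ZIP container, so they survive a file copy
    that destroys the surrounding file system metadata.
    """
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as zf:
        root = ET.fromstring(zf.read("docProps/core.xml"))
    return {
        "created": root.findtext("dcterms:created", namespaces=NS),
        "modified": root.findtext("dcterms:modified", namespaces=NS),
    }

# Build a stand-in evidence file: a minimal ZIP containing only core.xml.
core_xml = (
    '<cp:coreProperties '
    'xmlns:cp="http://schemas.openxmlformats.org/package/2006/'
    'metadata/core-properties" '
    'xmlns:dcterms="http://purl.org/dc/terms/">'
    '<dcterms:created>2009-03-14T12:00:00Z</dcterms:created>'
    '<dcterms:modified>2009-03-14T12:05:00Z</dcterms:modified>'
    '</cp:coreProperties>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("docProps/core.xml", core_xml)
print(internal_timestamps(buf.getvalue()))
```

Internal timestamps of this sort are one of the data points that, combined with the saved logs, let an examiner reason statistically about when a document was really created or signed, even after the original machines are gone.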