Before I dive into the meat of this post, let me offer some friendly advice. I'm sitting at the food court adjoining the south conference center here at the Mandalay Bay, and I just saw an attendee plug his device directly into one of those convenient USB sockets provided for charging your gadgets. It's probably safe, all things considered, but on principle please don't do this. Not here. Not now.
Where was I? At the food court. Speaking of food, Dan Geer's keynote Wednesday morning was chock-full of food for thought. He offered nothing less than a set of proposals intended to radically reset the balance between safety and order on the Internet, noting that its very nature prevents us from having much of both. I want you to watch his speech for yourself, but here is a quick list of the ten areas he discussed. Do watch the speech, as my gross oversimplifications will fail to do justice to Geer's ideas.
1) Enforce CDC-style mandatory reporting rules for cybersecurity failures exceeding a (yet-to-be-negotiated) severity threshold.
2) Net neutrality: Geer contended that if an ISP makes itself privy to the content, source, or destination of the traffic flowing over its network, it necessarily makes itself responsible (and liable) for what it learns about that traffic. Or ISPs can be common (i.e., neutral) carriers. But not, he claimed, both.
Welcome to Black Hat 2014. Are you ready? Ready for the sessions? Ready for the parties? Ready for the unspeakable, stifling, death-from-above heat? Ready for the late-night alcohol-fueled conversations about what’s next in our industry?
I am. I’m ready. Well, I’m almost ready. It’s been a while since I’ve seen a wave of security community predictions, so let’s get a fresh one going. And I’ll back it up with a $1,000 Best Buy gift card for one randomly selected prediction.
Here’s how to enter:
Follow @FidSecSys and tweet your predictions for the next twelve months using the #SecurityFuturist hashtag by 5:00 p.m. PT on August 4, 2014. Don’t have a Twitter account? That’s weird. I guess you could leave a comment below with your name and contact info.
Your tweet may be an authentic prediction, or it may be snark, because, well, this is the Internet. If you like, aim for a perfect exemplar of Poe’s Law. Feel free to submit as many predictions as you want, but each one has to be unique.
We’ll randomly select one of the predictions at 5:00 p.m. PT on August 4 and announce the lucky winner on Twitter at 9:00 a.m. PT on August 5.
We’ll have the prize ready at our Black Hat booth (#347) but if you’re not attending this year we can ship it out to you following the show.
Here’s my entry (regrettably I’m not eligible):
My information security technology gizmo will fix your #APT problems immediately, completely, and forever. #SecurityFuturist
That’s it. Travel safe to Vegas, be safe while you’re there, and come visit our booth to say hello, chat about the threat landscape, steal a t-shirt and maybe offer your predictions in person.
In part 1 of this series I discussed the use of digital forensics as a legitimate science in court for expert witnesses. In this installment I will introduce some research to help drive positive change.
I believe that using scientific methods, of the kind the Supreme Court of the United States (SCOTUS) held to be minimally required in its Daubert decision, can lead to fairer, statistically grounded expert answers rather than the typical three conclusions of "Yes," "No," or "That's just how computers work." Introducing the scientific method and statistically based expert answers leaves room for a weight between a probability of 0 and 1. A judge or lay jury can use that weight for a more informed decision when comparing it against the weights from other tests in the case. It is well known that other forensic sciences, such as fingerprinting, drug testing, forensic pathology, and DNA testing, use probabilities, so it would not be a great leap to apply them to our newer sciences. Furthermore, in a number of instances those methodologies even use a rate of error, which is discussed in the Daubert decision. I have yet to personally see anyone introduce numerical statistics on the record for digital forensics, let alone a rate of error. That needs to change.
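To make a "rate of error" concrete, here is a minimal sketch (in Python, my choice, since the post itself contains no code) of how an examiner might report a tool's validated error rate with a confidence interval instead of a bare yes or no. The trial counts and the tool being validated are hypothetical, purely for illustration:

```python
import math

def wilson_interval(errors: int, trials: int, z: float = 1.96):
    """Wilson score interval for an observed error rate.

    errors -- misclassifications seen during tool validation
    trials -- total validation trials
    z      -- z-score for the confidence level (1.96 is ~95%)
    """
    p = errors / trials
    denom = 1 + z ** 2 / trials
    center = (p + z ** 2 / (2 * trials)) / denom
    margin = (z / denom) * math.sqrt(
        p * (1 - p) / trials + z ** 2 / (4 * trials ** 2)
    )
    return max(0.0, center - margin), min(1.0, center + margin)

# Hypothetical validation run: 3 errors in 200 trials of some tool.
errors, trials = 3, 200
low, high = wilson_interval(errors, trials)
print(f"Observed error rate: {errors / trials:.3f}")
print(f"95% confidence interval: [{low:.3f}, {high:.3f}]")
```

An answer of the form "observed error rate 0.015, 95% confidence interval roughly 0.005 to 0.043" gives a judge or jury a weight between 0 and 1 to compare against other evidence, which is exactly the kind of testimony Daubert's rate-of-error factor invites.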
One very common question I am asked by clients on nearly any project, especially in data loss prevention scenarios, desperately needs to be researched using scientific methods: can I detect whether computer media was wiped? I would also like to know, if possible, what program was used to wipe the data. Imagine a situation where an internal employee "went bad," copied information to a USB memory stick, and walked out with the company's crown jewels. The data could be worth billions of dollars, and the victim company would most likely litigate. If that USB memory stick were turned over for examination, would it be possible to detect data wiping? You cannot see what programs were installed on the computer(s) the memory stick was plugged into (because the computers were not produced); therefore you do not know whether a wiping program was present. There are many wiping tools available on the Internet, and it would be nearly impossible to download every one and write a signature for each. That is similar to writing static signatures for every piece of malware, which is a nightmare.
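As a sketch of where that research could start, here is a toy heuristic, my own illustration rather than any established tool's method, that scans a raw image of the stick for the long constant-byte runs many wiping utilities leave behind. The image filename and sector size are assumptions for the example:

```python
from collections import Counter

SECTOR = 512  # assumed sector size for the example

def sector_fill_profile(image_path: str):
    """Count sectors consisting of a single repeated byte.

    A high fraction of constant-fill sectors (0x00, 0xFF, or some
    other pattern byte) is consistent with wiping, but never proof
    of it: factory-fresh or freshly formatted media can look much
    the same.
    """
    fills = Counter()
    total = 0
    with open(image_path, "rb") as img:
        while True:
            sector = img.read(SECTOR)
            if len(sector) < SECTOR:
                break
            total += 1
            if sector.count(sector[0]) == SECTOR:  # every byte identical
                fills[sector[0]] += 1
    return total, fills

# Hypothetical raw image acquired from the suspect USB stick.
total, fills = sector_fill_profile("usb_stick.dd")
for byte, count in fills.most_common(3):
    print(f"fill byte 0x{byte:02x}: {count}/{total} sectors ({count / total:.1%})")
```

A heuristic like this only flags media consistent with wiping; attributing a specific tool is harder, since you would need to catalog the patterns individual wipers write (some use recognizable multi-pass fills such as 0x55 and 0xAA) and then validate that catalog's error rate, which is precisely the statistical research this series is calling for.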