Cyber Risk Intelligence: What You Don’t Know is Most Definitely Hurting You

Posted on June 20, 2014 by Jason Polancich in Security

Growing up, one of my father’s favorite sayings was “luck favors the prepared.”

I must have heard it a thousand times over the years. It was almost always spoken just after some sad scenario in which I had failed to stay alert, informed and aware, and had ended up at a loss as a result. Sometimes a big loss. It was his belief that, if you’re always broadly observant of the things that affect your life, good things have a better chance of happening to you. He has always been right.

Nowadays, I find myself applying this lesson to cybersecurity and cyberdefense.

I’m learning over and over again that, more than nifty tools and solutions, robust IT budgets, threat intelligence firehoses and rigid security policies, it’s practical, habitual, day-in/day-out awareness that is invaluable: it helps you avoid becoming a victim of cybercrime, and it lessens the impact when cybercrime inevitably happens to you and your organization.

Cybercrime is all around us.

One day, staying constantly informed about the cyber risks facing us and our businesses may become second nature. We’re certainly not there yet. But sooner rather than later, we may all need to get used to the idea of continuously consuming data about our risks and vulnerabilities in order to act more safely. Really accomplishing that kind of awareness, though, takes the right level of information. Not just data. In fact, we’re all awash in data. But more on that later.

What we need is high-quality cybercrime information that’s comprehensive, yet also focused and simple to digest. Information that’s current, consistent, intuitive and continuous and, most importantly, that makes it easy to draw conclusions with meaning specific to you, your business and the decisions you face. It’s what I call “complete context.”

And there’s more.

To truly benefit from this sort of information takes more than just the info itself. Just as my father also told me, it takes focus, effort and commitment. Every day. Something he just called “hard work.”

Current Data + Contextually-Relevant Info + Continuous Awareness + Hard Work = Practical Solutions

Of course, the familiar modern-day version of my father’s favorite is “Chance favors a prepared mind,” coined by Louis Pasteur, the French microbiologist, father of pasteurization and father of the germ theory of disease. For Pasteur, the saying meant that, by staying diligently informed of all things surrounding your problem space, you’ll see solutions to tough problems more quickly.

For years and years he labored at the microscope: observing, collecting data and analyzing. But it was his devotion to basic research on more than just the problem itself, and the quick delivery of practical applications based on what he learned, that led him to his biggest breakthroughs against unseen and deadly illnesses. Eventually, thanks to Pasteur’s way of working, we developed critical medicines such as vaccines and, later, antibiotics.

Studying a problem from every angle and every level always leads to more practical solutions and quicker (re)action.

Although Pasteur labored in the medical and biological fields, his work was in many ways analogous to modern cybersecurity. Today, scientists and researchers battle similarly unseen forces, all around us, that make us sick in various ways. Our networks, computers and mobile devices are constantly exposed to harmful pathogens and viruses. And, with the Target breach and incidents like Heartbleed, real people now know these things are fatal in their own way.

But in today’s world, we seem to have gone off track a bit in trying to cure our cyber ills.

In much the same way as in Pasteur’s day, many smart people today labor to observe, collect data and draw conclusions. However, most of them, unlike Pasteur, are not able to arrive at real, practical breakthroughs that change the world.

Why is this the case?

For me, it’s mostly a simple answer:

We focus so much on looking down the barrels of individual microscopes that we get lost in all the low-level noise, noise that’s far too focused on only a few dimensions of the problem.

Let me use Pasteur again to explain more simply.

Had Pasteur only observed the smallest bits floating around under his glass, he would likely not be remembered by history. Instead, Pasteur gathered data about sick people: who they were, where they lived, how old they were, what gender they were, what symptoms they had, what prior illnesses they had suffered, what their jobs were and what they had in common.

He observed animals: how they behaved, how long it took them to become sick when they did, what they ate, where they lived and more. He even observed how rotting meat behaved, how it decomposed, how it compared to other plant and animal matter, and on and on. He focused on all sides of the issue: the causes, the victims and, of course, their symptoms. Pasteur observed every facet of his problem set, from high level to low, and turned basic data collection, performed across many dimensions at once and from all angles, into information he could use to draw practical conclusions.

Put simply, Pasteur gained complete context by performing “intelligence gathering.” And, by focusing on more than just the threat itself, Pasteur became one of the first practitioners of risk analysis, or risk intelligence. It’s something we’ve only just begun to really apply to cyberdefense.

Continuous awareness of our own cyber risks compared to what’s possible and what’s happening around us right now is one of the missing pieces in current cyberdefense practices.

Today, we spend most of our cybersecurity effort and dollars gathering massive amounts of data from millions of “microscoped” sources, but we rarely change perspectives or levels. We want to know what’s threatening us, but can’t seem to understand that the picture is much bigger. Too rarely do we push back from the lenses trained on data sets inside our own organizations, pick our heads up and look around.

I like to call it “cyber navel gazing.”

You see, outside the microscope there’s so much other useful data, most of it never stored or analyzed, that can be turned into helpful information and then into practical solutions.

Yet we continuously employ tens of thousands of tools, solutions and applications that comb through huge bins of raw packet data, endless streams of netflow, long-term signature repositories, terabytes of log files, interface dumps and more.

In fact, it’s as if all we do is peer through the scopes at our own micro worlds and draw conclusions that themselves lead to other tools begetting other massive piles of micro data.

Are these things all bad? Of course not. They’re all part of fighting the fight against cyber disease. But in all of this, we miss the bigger picture. Rarely do we store data, day in and day out, on what we’re getting hit with, how threats are occurring and what’s happening as a result. Nor are we matching that up with our specific, individual symptoms: who we are as targets, where we’re from, what types of companies we are, who our customers are, what technologies we’re using and on and on.
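To make that concrete, here is a minimal sketch, in Python, of the kind of record such day-in/day-out collection could produce. Every field name here is illustrative, my own invention rather than anything prescribed in this column:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class TargetContext:
        """Who we are as a target: the dimensions rarely recorded today."""
        industry: str
        region: str
        customer_base: str
        technologies: list[str] = field(default_factory=list)

    @dataclass
    class ThreatObservation:
        """What we got hit with, how it occurred, what happened as a result."""
        observed_at: datetime
        threat_type: str        # e.g., "phishing", "ddos", "credential theft"
        vector: str             # how the threat occurred
        impact: str             # what happened as a result
        context: TargetContext  # matched to who we are as a target

Stored continuously and compared against what’s happening elsewhere, records like these are the raw material for the “complete context” described above.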

What would Pasteur say to us now if he were brought in to consult on our cyber sickness?

He’d probably just say, “Luck favors the prepared.” Then he’d tell us to start over. From the top this time.

Jason Polancich is founder and Chief Architect at SurfWatch Labs. He is a serial entrepreneur focused on solving complex internet security and cyber-defense problems. Prior to founding SurfWatch Labs, Mr. Polancich co-founded Novii Design, which was sold to Six3 Systems in 2010. In addition to completing numerous professional engineering and certification programs through the National Cryptologic School, Polancich is a graduate of the University of Alabama, with degrees in English, Political Science and Russian. He is a distinguished graduate of the Defense Language Institute (Arabic) and has completed foreign study programs through Boston University in St. Petersburg, Russia.


Don’t Forget DNS Server Security

Posted on March 17, 2014 by Brian Prince in Security

Late last August, some visitors to the New York Times website received an unexpected surprise – the website was down.

The source of the interruption was not a power outage or even a denial-of-service attack. Instead, it was a battle against a DNS hijacking attempt believed to be connected to hacktivists with the Syrian Electronic Army.

The attack was one of several in 2013 that focused on DNS (domain name system) infrastructure, and security experts don’t expect this year to be all that different – meaning organizations need to stay aware of DNS security threats. 

Just last month, domain registrar and hosting provider Namecheap was hit with a distributed denial-of-service (DDoS) attack targeting its DNS platform that impacted roughly 300 sites. Beyond DDoS, attackers can also compromise a name server and redirect DNS queries to a name server under their control.

“DNS providers are often targets of attack because they are a central point for disrupting all services, web, mail, chat, etc. for an organization,” said Michael Hamelin, lead X-Force security architect at IBM. “The DNS server is the roadmap for the Internet, and once disrupted, services that are the lifeblood of the organization such as web, mail, and chat become inaccessible. If a DNS provider goes down, it could mean that thousands of customers have their digital presence temporarily erased.”

In the case of the New York Times, the attack that affected their users occurred when someone accessed a reseller account on Melbourne IT’s systems and changed the DNS records for nytimes.com as well as other domain names such as twitter.co.uk. This kind of password theft can have far-reaching implications, said Hamelin, who recommended DNS providers use two-factor authentication and “enable a restricted IP block requiring all edits to be made internally on the network.”
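As a complement to those provider-side controls, an organization can also watch its own records for unexpected changes from the outside. Below is a minimal sketch of that idea using only Python’s standard library; the domain and the known-good address are placeholders, not values from this article:

    import socket

    # Known-good A records for domains we control (placeholder values).
    EXPECTED = {
        "example.com": {"93.184.216.34"},
    }

    def check_records(domain: str, expected: set[str]) -> bool:
        """Alert if a domain no longer resolves to its known-good addresses."""
        _, _, addresses = socket.gethostbyname_ex(domain)
        current = set(addresses)
        if current != expected:
            print(f"ALERT: {domain} resolves to {current}, expected {expected}")
            return False
        return True

    for domain, addresses in EXPECTED.items():
        check_records(domain, addresses)

Run on a schedule from a vantage point outside your own network, a check like this flags a hijack as soon as resolvers begin returning an attacker’s addresses.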

“Organizations need to understand that just because they have outsourced their hosting and DNS, it doesn’t mean that they’re guaranteed that the vendor has taken adequate security precautions to provide a highly available and secure service,” he said. “The organization needs to anticipate their DNS may become a target of an attack, and implement countermeasures such as using two different DNS systems and/or hosting providers.”

By its very nature, DNS is one of the weaker links in many infrastructures, said Vann Abernethy, senior product manager at NSFOCUS, adding that the company had seen an increase in both DDoS attacks on DNS infrastructure last year as well as the use of DNS to amplify traffic. Juxtaposed with the critical nature of its operation, its status as a weak link makes it an enticing target for attacks, he said.

“There are quite a few variants of DDoS attacks that can be executed against DNS servers, such as DNS Query Flood – a resource consumption attack aimed at a single infrastructure,” Abernethy said. “And there are new ones cropping up as well.”
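A common first response to this class of resource-consumption attack is per-source rate limiting, which comes up again in the recommendations below. As a simple illustration of the underlying mechanism only (not any particular DNS server’s implementation; the rate and burst values are assumptions), a per-client token bucket looks like this:

    import time
    from collections import defaultdict

    RATE = 20.0   # sustained queries per second allowed per client (assumed)
    BURST = 40.0  # short burst allowance before throttling (assumed)

    buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

    def allow_query(client_ip: str) -> bool:
        """Token bucket: return True if this client's query should be answered."""
        bucket = buckets[client_ip]
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at the burst size.
        bucket["tokens"] = min(BURST, bucket["tokens"] + (now - bucket["last"]) * RATE)
        bucket["last"] = now
        if bucket["tokens"] >= 1.0:
            bucket["tokens"] -= 1.0
            return True
        return False

A flooding source quickly drains its bucket and gets dropped, while legitimate clients, who query far below the limit, never notice.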

Among those is a technique similar to a DNS amplification attack that relies on the attacker sending a query with fake subdomains that the victim DNS server cannot resolve, flooding the DNS authoritative servers it must contact, he said.
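One way to spot that pattern is to track how many distinct, previously unseen subdomain labels are being queried against each zone within a short window; a surge of unique names that all fail to resolve is a strong signal. A minimal sketch of the idea follows, with the window size and threshold as illustrative assumptions:

    import time
    from collections import defaultdict

    WINDOW_SECONDS = 60
    UNIQUE_NAME_THRESHOLD = 500  # assumed; tune to your normal traffic

    # zone -> {subdomain label -> last-seen timestamp}
    seen = defaultdict(dict)

    def note_query(zone: str, label: str) -> bool:
        """Record a queried label; return True if the zone looks flooded."""
        now = time.time()
        labels = seen[zone]
        labels[label] = now
        # Expire labels that have fallen out of the window.
        for old_label, stamp in list(labels.items()):
            if now - stamp > WINDOW_SECONDS:
                del labels[old_label]
        return len(labels) > UNIQUE_NAME_THRESHOLD

In live traffic you would feed note_query() from the resolver’s query log and pair it with the NXDOMAIN response rate, since randomly generated labels overwhelmingly produce name-not-found answers.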

Fortunately, there are a number of actions organizations can take to improve DNS security. For starters, don’t run open resolvers, advised Mark Beckett, vice president of marketing for DNS security vendor Secure64.

“Open resolvers allow anyone on the internet to query a DNS resolver, and are widely used by botnets to inflict damage,” he said. “[Also] don’t allow spoofed IP addresses to exit your network. Organizations should set egress filters so that only packets with IP addresses within their network address space are allowed to exit their network. This eliminates the ability of the attack to spoof any IP address that it wishes from an infected machine.”
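The anti-spoofing rule Beckett describes is easy to state in code. This sketch uses Python’s ipaddress module to make the decision an egress filter enforces: a packet may leave only if its source address belongs to the organization’s own address space. The network ranges are placeholders:

    import ipaddress

    # The organization's own address space (placeholder documentation ranges).
    OUR_NETWORKS = [
        ipaddress.ip_network("203.0.113.0/24"),
        ipaddress.ip_network("198.51.100.0/24"),
    ]

    def may_exit(source_ip: str) -> bool:
        """Allow a packet out only if its source address is one of ours."""
        address = ipaddress.ip_address(source_ip)
        return any(address in network for network in OUR_NETWORKS)

    assert may_exit("203.0.113.7")     # legitimate internal source
    assert not may_exit("192.0.2.99")  # spoofed source address: drop it

In practice this check lives in a router ACL or firewall rule rather than application code, but the logic is exactly this membership test.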

He also suggested organizations use rate limiting capabilities within their DNS server if possible, and monitor the network to detect any sudden spikes in DNS packet rates or inbound or outbound DNS traffic volume.

“Early detection of an attack can allow an organization to take defensive measures (like blocking attack traffic upstream at the router or firewall) before the attack is severe enough to impact their users or their network,” he said.
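A basic version of that monitoring can be built from a sliding window of per-interval DNS packet counts, alerting when the current interval far exceeds the recent baseline. The window length and multiplier below are illustrative assumptions:

    from collections import deque

    class DnsSpikeDetector:
        """Flag sudden spikes in per-interval DNS packet counts."""

        def __init__(self, window: int = 60, multiplier: float = 3.0):
            self.counts = deque(maxlen=window)  # e.g., one count per second
            self.multiplier = multiplier

        def observe(self, packet_count: int) -> bool:
            """Record this interval's count; return True on a likely spike."""
            spike = False
            if len(self.counts) == self.counts.maxlen:
                baseline = sum(self.counts) / len(self.counts)
                spike = packet_count > baseline * self.multiplier
            self.counts.append(packet_count)
            return spike

Fed per-second counts from a packet capture or the DNS server’s own statistics, the moment observe() returns True is the early-warning point at which to start blocking attack traffic upstream.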

DNS-related attacks will continue to be a theme of 2014, Hamelin said, noting there aren’t a lot of steps in place to protect organizations from a hijacked DNS server or its clients.

“Attackers are focused on ROI [return on investment] and attacking a DNS server could be a great way to have a large impact with little effort,” he said. 

Brian Prince is a Contributing Writer for SecurityWeek.
