Tag: cyber-security (2 articles)

Raffael Marty on the need for more human eyes in security monitoring

Raffael Marty spoke at the 2013 ACM conference on Knowledge Discovery and Data Mining (KDD’13). It is a very enlightening talk if you want to learn about the current state of visualization in computer network security and its core challenges. Ever-growing data traffic and persistent problems such as false positives in automatic detection give network engineers and analysts headaches today, and Marty admitted more than once that he has no idea how to solve them. Since he has worked for IBM, HP/ArcSight, and Splunk, some of the most prestigious companies in this area, this is likely not for lack of expertise.

Marty also generously provided the slides for his talk.

Some key points I took away:

Algorithms can’t cope with targeted or unknown attacks – monitoring needed

Today’s attacks are rarely massive or brute force; they are targeted, sophisticated, more often nation-state sponsored, and “low and slow” (this is particularly important: it means you cannot look for the typical spikes that signal a mass event, you have to look at long-term issues).
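To make the “low and slow” point concrete, here is a minimal sketch with made-up numbers (my own illustration, not from the talk): an attacker who adds only a handful of events per hour never trips a spike detector, but a long-window aggregate makes the drift visible.

    # Minimal sketch (not from the talk, numbers made up): why spike detection
    # misses "low and slow" activity while a long-window view reveals it.

    normal_hourly = [100] * 720          # ~1 month of expected events per hour
    attack_hourly = [103] * 720          # attacker adds only 3 extra events per hour

    def spike_alerts(series, baseline=100, factor=3):
        """Classic spike detection: alert when a single hour exceeds baseline * factor."""
        return [i for i, v in enumerate(series) if v > baseline * factor]

    def long_term_excess(series, baseline=100):
        """Long-window view: total events above the expected baseline."""
        return sum(series) - baseline * len(series)

    print(spike_alerts(attack_hourly))      # [] -> no single hour looks suspicious
    print(long_term_excess(normal_hourly))  # 0
    print(long_term_excess(attack_hourly))  # 2160 extra events over the month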

Today’s automated tools find known threats and work with predefined patterns; they do not find unknown attacks (0-days), and the more “heuristic” tools produce lots of false positives (i.e. they increase the workload for analysts instead of reducing it).
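A toy illustration of that gap (my own sketch with invented signatures and requests, not from the talk): a pattern matcher only fires on attacks it already knows, so an unseen attack string passes silently, while a crude heuristic rule that flags anything “odd” also fires on harmless traffic.

    import re

    # Toy sketch (not from the talk): known-pattern matching vs. a crude heuristic.
    # The signatures and requests below are made up for illustration.
    KNOWN_SIGNATURES = [r"\.\./\.\./etc/passwd", r"<script>alert\("]

    requests = [
        "GET /index.html",                   # benign
        "GET /../../etc/passwd",             # known attack -> caught by a signature
        "GET /admin;DROP%20TABLE%20users",   # "unknown" attack -> no signature matches
        "GET /report?q=select+top+sellers",  # benign, but the heuristic flags it anyway
    ]

    def signature_hits(request):
        return [sig for sig in KNOWN_SIGNATURES if re.search(sig, request)]

    def heuristic_flag(request):
        # Crude rule: anything with SQL-ish keywords, encoding, or traversal is "suspicious".
        return bool(re.search(r"(?i)(drop|select|%20|\.\./)", request))

    for r in requests:
        print(r, "| signatures:", signature_hits(r), "| heuristic:", heuristic_flag(r))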

According to Gartner, automatic defense (prevention) systems will become entirely useless from 2020 onwards. Instead, you have to monitor and watch out for malicious behaviour (human eyes!); it won’t be solved automatically.

Some figures for current data amounts in a typical security monitoring setup:

(see the detection-technology slide in Marty’s slide deck)

So, if everything works out nicely, you still end up with 1000 (highly aggregated/abstracted) alerts that you have to investigate to find the one incident.

Some security data properties:

(see the security-data slide in Marty’s slide deck)

Challenges with data mining methods

  • Anomaly detection – but how do you define “normal”? (a small sketch follows after this list)
  • Association rules – but the data is sparse; there is little continuity in web traffic
  • Clustering – no good algorithms are available for categorical data such as user names or IP addresses
  • Classification – the data is not consistent (e.g. machine names may change over time)
  • Summarization – it disregards “low and slow” values, which are important
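To illustrate the first point, here is a minimal sketch with made-up data (my own, not from the talk) of a naive per-user baseline: “normal” is simply whatever a user did during the training window, so a legitimate change of behaviour is flagged exactly like an attack.

    # Minimal sketch (my own illustration, made-up data): a naive per-user baseline.
    # "Normal" is whatever was seen during training, which is exactly the problem:
    # legitimate changes of behaviour look just like attacks.

    training_logins = [
        ("alice", "10.0.0.5"), ("alice", "10.0.0.5"),
        ("bob", "10.0.0.7"), ("bob", "10.0.0.8"),
    ]

    baseline = {}                          # user -> set of source IPs seen during training
    for user, ip in training_logins:
        baseline.setdefault(user, set()).add(ip)

    new_logins = [
        ("alice", "10.0.0.5"),             # seen before -> "normal"
        ("alice", "203.0.113.9"),          # new IP: an attacker, or alice working from home?
        ("carol", "10.0.0.12"),            # unknown user: onboarding, or a forged account?
    ]

    for user, ip in new_logins:
        anomalous = ip not in baseline.get(user, set())
        print(user, ip, "-> anomalous" if anomalous else "-> normal")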

How can visualization help?

  1. make the algorithms at work transparent to the user (a small sketch follows below)
  2. empower human eyes for understanding, validation, and exploration, because they bring
    • supreme pattern recognition
    • memory for contexts
    • intuition!
    • predictive capabilities

This is of course a to-do list for our work!
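As a toy illustration of the first point (my own sketch with invented numbers, not Marty’s): instead of only emitting an alert, the detector’s threshold can be plotted together with the raw series, so the analyst can see and validate why it fired.

    import matplotlib.pyplot as plt

    # Minimal sketch (my own illustration, made-up numbers): show the analyst *why*
    # a detector fired by plotting the raw series together with the threshold it used.

    hours = list(range(48))
    events = [100 + (5 if h % 7 == 0 else 0) + (40 if h == 30 else 0) for h in hours]
    threshold = 130   # hypothetical detector threshold

    plt.figure(figsize=(8, 3))
    plt.plot(hours, events, label="events per hour")
    plt.axhline(threshold, color="red", linestyle="--", label="detector threshold")
    alerts = [(h, v) for h, v in zip(hours, events) if v > threshold]
    plt.scatter([h for h, _ in alerts], [v for _, v in alerts],
                color="red", zorder=3, label="alerts")
    plt.xlabel("hour")
    plt.ylabel("event count")
    plt.legend()
    plt.tight_layout()
    plt.show()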

The need for more research

What is the optimal visualization?

– it depends very much on the data at hand and your objectives. But there is also very little research on that, which I am actually missing. E.g. what is a good visualization for firewall data?
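As a small example of the kind of experiment this question calls for (my own sketch with made-up log data, not from the talk): plotting firewall events as time versus destination port, coloured by action, already lets the human eye spot a port scan as a vertical stripe of denied connections.

    import matplotlib.pyplot as plt

    # Tiny sketch (my own, made-up log data): one candidate view of firewall logs.
    # Each event is (minute, destination port, action); a port scan appears as a
    # vertical stripe of denied connections at a single point in time.
    events = [(m, 443, "allow") for m in range(0, 60, 2)] \
           + [(m, 80, "allow") for m in range(0, 60, 3)] \
           + [(30, p, "deny") for p in range(20, 1024, 50)]   # simulated port scan

    for action, color in [("allow", "tab:blue"), ("deny", "tab:red")]:
        xs = [m for m, p, a in events if a == action]
        ys = [p for m, p, a in events if a == action]
        plt.scatter(xs, ys, s=10, color=color, label=action)

    plt.xlabel("minute")
    plt.ylabel("destination port")
    plt.legend()
    plt.title("Firewall events (sketch)")
    plt.show()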

And he even shares one of our core problems, the lack of realistic test data:

That’s hard. VAST has some good data sets, or you can look for cooperation with companies.


“Potsdamer Konferenz für Nationale Cybersicherheit”

On Tuesday, 4 June, the “Potsdamer Konferenz für nationale Cybersicherheit” (Potsdam Conference on National Cyber Security) took place at the Hasso Plattner Institute in Potsdam, Germany. The main goal of the conference was to improve communication between government, industry, and the various research fields on the issue of cyber security. For us, it was interesting in two ways: finding the main actors to focus on in our research, and learning how the current security situation is assessed by the different organisations.


The conference started with a few words of welcome from the director and CEO of the Hasso Plattner Institute, Prof. Dr. Christoph Meinel. In his short keynote, which was mostly about the work and research of the HPI IT-Security Engineering Team, he also introduced the audience to the new HPI Vulnerability Database (HPI-VDB).

The HPI-VDB portal is the result of research work being conducted by IT-Security Engineering Team at Prof. Christoph Meinel’s chair “Internet Technologies and Systems” at HPI. It is a comprehensive and up-to-date repository which contains a large number of known vulnerabilities of Software. The vulnerability information being gathered from Internet is evaluated, normalized, and centralized in the high performance database. The textual descriptions about each vulnerability entry are grabbed from the public portals of other vulnerability databases, software vendors, as well as many relevant public web pages, etc. A well-structured data model is used to host all pieces of information which is related to the specific vulnerability entry. Thanks to the high quality data serialized in the high performance In-Memory database, many fancy services can be provided, including browsing, searching, self-diagnosis, Attack Graph (AG), etc. Additionally, we offer many types of API for IT developers to leverage our database for their development. (http://www.hpi.uni-potsdam.de/meinel/security_tech/hpi_vdb.html)


Many more interesting speakers had been invited to talk about cyber security from their perspective. For example, the director of the European Network and Information Security Agency (ENISA), Prof. Udo Helmbrecht, gave a keynote addressed to policy- and decision-makers such as the Minister-President of the state of Brandenburg and the Federal Minister, as well as industry representatives and others.

With regard to our research, this conference was not the best place to learn new things. But the opportunity to make new contacts and meet interesting people in general was great, and we now have a few names to work with in the future. Knowing the actors and the so-called “big players” in the business is also good.

A short film about the conference was uploaded to YouTube. The video was made by HPI TV and sums up the conference pretty well. (German only)
