Alphabet unveils new tool to analyze massive troves of data on computer networks


Most modern computer networks constantly generate massive amounts of logging data, including records of the websites and other systems accessed from within the network.

In theory, those logs can help determine exactly what happened, and when, during a security incident, but in practice they can be too massive to comb through efficiently when something goes wrong.

A new product unveiled Monday by Chronicle, Alphabet’s cybersecurity unit, at the RSA Conference in San Francisco aims to change that. The tool, called Backstory, lets organizations sift through even massive collections of log data essentially instantaneously, using Google’s cloud infrastructure.

[Image: courtesy of Chronicle]

“Take the example of an organization that wanted to run a search on over a petabyte of security data to find out if any of the 25,000 employee workstations ever communicated with a specific foreign website hosting malware,” the company said in a blog post. “The search might take 30-60 minutes with current industry solutions, but in Backstory it takes less than a second. Let’s say the organization wants to search 50 petabytes of logs, not one. The current industry solutions might now take 12 hours, but it’s still around a second in Backstory.”

That type of search isn’t just theoretical: In a blog post, Chronicle CEO and cofounder Stephen Gillett points to the indictment unsealed in July against Russians allegedly behind the Democratic National Committee hack. The indictment referred to a domain name, linuxkrnl.net, that the attackers allegedly used in the attack.

But while system administrators likely wanted to check whether their own networks communicated with that domain during the time of the attack, doing so would usually be impossible, since most companies don’t keep traffic logs around for very long. Using Backstory, Gillett writes, they’d be able to retain logs as long as they wished and efficiently search them for links to newly revealed security threats.
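To make the scenario concrete, here is a minimal, hypothetical sketch of that kind of retrospective indicator search. It is not Backstory’s API (the article doesn’t describe one); it simply assumes, for illustration, that network logs are kept as newline-delimited JSON files with "src_host" and "dest_domain" fields.

```python
# Illustrative sketch only -- NOT Backstory's API or data format.
# Assumes (hypothetically) proxy/DNS logs stored as newline-delimited JSON
# with "src_host" and "dest_domain" fields in files under a log directory.
import json
from pathlib import Path

INDICATOR = "linuxkrnl.net"  # domain cited in the indictment mentioned above


def hosts_that_contacted(log_dir: str, indicator: str) -> set:
    """Return the set of internal hosts whose log entries mention the indicator."""
    matches = set()
    for log_file in Path(log_dir).glob("*.jsonl"):
        with log_file.open() as fh:
            for line in fh:
                try:
                    entry = json.loads(line)
                except json.JSONDecodeError:
                    continue  # skip malformed lines
                if entry.get("dest_domain", "").endswith(indicator):
                    matches.add(entry.get("src_host", "unknown"))
    return matches


if __name__ == "__main__":
    hits = hosts_that_contacted("./network_logs", INDICATOR)
    print(f"{len(hits)} host(s) contacted {INDICATOR}: {sorted(hits)}")
```

A linear scan like this is exactly what stops scaling once retained logs reach the petabyte range, which is the gap Backstory claims to close with near-instant searches over data indexed in Google’s cloud.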

While the log data is stored in the cloud, it isn’t available to anyone besides the companies that upload it, nor is it scanned by any automated systems, according to Chronicle. Users of the system will be billed by employee count rather than by the amount of data uploaded, so they won’t have to pay more to store more data, the company says.


Source: Fast Company
