#MSOMS Log Analytics – security auditing and more with confidence


I originally wrote this article for the Italian market, but I quickly realized it could be of general interest. Indeed, IT administrative auditing is a practice that crosses different markets, countries, laws and compliance directives. I'm not a lawyer, but I live in this IT world and I've been involved in many IT auditing projects. I've seen many different approaches, from minimalist to "everything and more". Whatever the approach and the chosen solution, every on-premises implementation I've worked on has stumbled on the same issues:

  • The amount of data is huge; in many cases it is the largest database in town. A huge database means high ownership costs, starting with the data protection measures. After all, you cannot afford to lose your auditing data, and RPO and RTO are critically important. There are solutions that promise to reduce the data volume, but they do so either by compromising performance or by stripping properties from the collected data. Neither approach is brilliant.
  • Huge databases mean slow query performance, or expensive solutions to keep queries running at the proper speed.
  • While it is possible to guarantee data completeness, integrity and durability, it's almost impossible to guarantee data inalterability, at least without an organizational process and the appointment of an internal auditor or Data Protection Officer with exclusive access to the audit vault. Obviously this auditor must not have administrative access to the systems subject to auditing (the old story about who controls the controller). This requirement is hard to achieve in small and medium businesses, so they often skip the topic or hire expensive external auditors. Technically, to achieve data inalterability, every single event should be signed and publicly time-stamped, and at the same time the entire stream should be signed and time-stamped in (near) real time (see the sketch after this list). I knew of a solution from a Spanish company, but unfortunately it could not scale to thousands of events per second; they're now out of business.
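To make the inalterability requirement concrete, here is a minimal sketch, in Python, of the signing-and-chaining idea described above: each stored record embeds the hash of its predecessor, so altering or removing any past event breaks every hash that follows. This is my illustration of the technique, not any vendor's implementation; a real system would also sign the chain head and obtain a public timestamp for it.

```python
import hashlib
import json
import time

def chain_event(prev_hash: str, event: dict) -> dict:
    """Link an audit event to its predecessor via a SHA-256 hash chain."""
    record = {
        "ts": time.time(),   # collection timestamp
        "event": event,      # the raw audit event, kept unmodified
        "prev": prev_hash,   # hash of the previous record in the stream
    }
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    record["hash"] = hashlib.sha256(payload).hexdigest()
    return record

# Build a small chain; tampering with any stored record invalidates
# the "prev" link of every record that follows it.
head = "0" * 64  # genesis value for the first event
trail = []
for evt in ({"id": 4624, "user": "alice"}, {"id": 4672, "user": "bob"}):
    rec = chain_event(head, evt)
    head = rec["hash"]
    trail.append(rec)
```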

I honestly believe the cloud can be of great help in this scenario. The above issues can be solved by a solution that:

  • Gets the data in (near) real time
  • Stores it in an external vault, outside the control of the audited party, in a secure and mutually authenticated way.
  • Makes the collected data unmodifiable, or at least unmodifiable by the audited parties.
  • Gives guarantees about data integrity. It's technically hard not to manipulate the data stream, if only for indexing purposes; what's needed is a clear statement, and appropriate guarantees, that no property of the data is lost.
  • Keeps the response time for queries down to seconds or tens of seconds.
  • Gives you the ability to export the data, build dashboards and create alerting conditions.
  • Gives you the ability to extend the scope of the collected data, because we all know applications tend to build their own audit trails.

Microsoft Operations Management Suite – Log Analytics (LA) can be such a solution. Completely cloud-based, it lets you ingest your audit trails and much more. In fact, LA is a general-purpose log ingestion and analysis system, where "log" is almost reductive: LA can ingest any sort of electronic document with a structured or semi-structured schema (logs, text files, performance data points…).

Once the data is in the LA vault (called a workspace) it can be queried, extracted to Excel or pushed to Power BI; you can build your own dashboards, ingest your own logs, and build alerting and remediation on top of that data, as the query examples below show.
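To give a taste of the query language, here are two examples in the search syntax LA uses at the time of writing; the SecurityEvent field names depend on the data source, so treat these as illustrations rather than recipes. The first returns all failed-logon events, the second counts security events per computer:

```
Type=SecurityEvent EventID=4625
Type=SecurityEvent | Measure count() by Computer
```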

LA is based on a common engine and a growing set of solutions. Every solution defines data sources and analysis intelligence on top of the collected data. In this article I'm focusing on the Security solution. As the name implies, security is much more than simple auditing:

  • It builds on different data sources, called "Security Domains": not just audit trails from the different platforms, but also indicators from antimalware, patching, network communications and much more
  • It surfaces potentially troublesome situations as "Notable Issues", which can be extended with your own conditions and on which alerts can be defined
  • Last, but not least, it adds on top of the collected data the machine-learning intelligence of "Threat Intelligence", operated by the Microsoft security team


Figure 1 – The security solution default dashboard

Log Analytics and common audit-trail requirements

There are common requirements for audit trails that I have found in every project I've been involved in; let's say these are the common minimum denominators:

  • Completeness: no data must be lost, either during collection or during manipulation of the data itself. LA addresses this requirement by collecting data directly from the source (security event log, audit trails, syslog, O365 logs…) and implementing near-real-time get-and-send ingestion.
  • Inalterability: this is where a cloud-based solution really shines and outpaces an on-premises one. The data is sent through a mutually authenticated, encrypted channel, and once it gets to the workspace it is basically out of reach. There's no way someone can get to that data and modify it once it is stored, unless you know which storage account it is recorded on, are able to physically breach an Azure datacenter, and have the skill and computational power, once inside, to break the encryption algorithm. I would say it is indeed safe.
  • Integrity: data integrity is a guarantee the service gives you; as with every cloud-based service, if you don't trust the provider, don't use the service.
  • Durability / retention: as with integrity, this is something the service guarantees. In terms of retention you can opt for 1 month or 1 year. This can be a blocker: I have a few customers that require 5 or even 10 years of retention.

How Log Analytics works

You can find many blog posts on the subject, and obviously all the documentation you need on TechNet, but let's recap the fundamentals. Log Analytics is a cloud-operated log ingestion system hosted on Microsoft Azure. Today it gives you three different retention options:

  • 1 week for the free tier
  • 1 month for the standard tier
  • 1 year (actually 390 days) for the premium tier

Remember, this is cloud-based, so things are set to change and improve rapidly.

LA gets data from different sources:

  • Through SCOM management groups
  • Via agents installed on Windows or Linux machines
  • Through Azure diagnostics recorded in Azure storage accounts
  • Through a direct ingestion API from other cloud-operated solutions, for example Office 365 or Application Insights

Once the data sources have been connected, the data starts to flow into the workspace; here it is categorized and indexed, and all the magic happens, until your data is ready to be queried through the custom query language and, obviously, from the various solution dashboards. You can even get to the data via a REST interface or, if you prefer, encapsulated in PowerShell cmdlets; a sketch of a raw REST call follows.
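As a sketch of the REST route, the call below targets the ARM-based search endpoint current as I write; the URL shape and api-version are best checked against the documentation, and the subscription, resource group, workspace and token values are placeholders you must supply.

```python
import requests

# Hypothetical values: fill in your own subscription, resource group,
# workspace name and a valid Azure AD bearer token.
SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
WORKSPACE = "<workspace-name>"
TOKEN = "<azure-ad-bearer-token>"

url = (
    "https://management.azure.com"
    f"/subscriptions/{SUBSCRIPTION}"
    f"/resourcegroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.OperationalInsights"
    f"/workspaces/{WORKSPACE}/search?api-version=2015-03-20"
)

# Count security events per computer using the legacy search syntax.
body = {"query": "Type=SecurityEvent | Measure count() by Computer", "top": 100}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
for row in resp.json().get("value", []):
    print(row)
```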

Microsoft guarantees the durability and integrity of the data, so you don't have to implement any data protection solution; it's all in the package.

For agent-based collection the solution implements a local cache worth two hours of data, trying to reconnect every 8 minutes; the sketch below illustrates the idea.
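This behavior is easy to picture with a sketch of my own (not the agent's actual code): buffer events locally, evict anything older than the two-hour window, and keep retrying the upload every eight minutes until the service is reachable again.

```python
import time
from collections import deque

BUFFER_SECONDS = 2 * 60 * 60   # the agent keeps roughly two hours of data
RETRY_SECONDS = 8 * 60         # and retries the connection every 8 minutes

buffer = deque()  # (timestamp, event) pairs

def enqueue(event):
    """Cache an event locally, evicting anything older than the window."""
    now = time.time()
    buffer.append((now, event))
    while buffer and now - buffer[0][0] > BUFFER_SECONDS:
        buffer.popleft()

def flush(send):
    """Try to upload cached events; on failure, tell the caller how long
    to wait before retrying, keeping the cache intact."""
    while buffer:
        ts, event = buffer[0]
        if not send(event):       # send() returns False while offline
            return RETRY_SECONDS  # back off and retry later
        buffer.popleft()
    return None
```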


Figure 2 – The Log Analytics architecture (source: TechNet)

Where is my data stored?

This is another common topic: my audit data is invaluable, and I have a requirement, both internal and for compliance, to keep it in specific regions. Sounds familiar? This is another easy win for Azure: you can always know where your data is and, moreover, you can choose where to store it. The number of datacenters where Log Analytics is available is growing fast. For example, in Europe we're jealous of our data assets and we require that they stay inside the European Union (as long as it lasts; I'm writing this article just after the Brexit affair, and how many chances will be wasted before we learn how to really be a community?).

Being based on Azure, it inherits all the environmental security of the datacenters, and it is specifically certified for:


Figure 3 – Privacy in LA (source: TechNet)

What data can I ingest?

As stated previously, LA is a general-purpose solution; we use it to collect data from Windows systems up to old IBM AS/400 systems. I took some time to compile a summary table with the specifics of security-related ingestion.

| Data type | Platform | Direct agent | SCOM agent | Azure Storage | SCOM required? | Data sent via SCOM management group | Collection frequency |
|---|---|---|---|---|---|---|---|
| Windows security event logs | Windows | ✓ | ✓ | ✓ | – | – | For Azure storage: 10 min; for the agent: on arrival |
| Windows firewall logs | Windows | ✓ | ✓ | – | – | – | On arrival |
| Windows event logs | Windows | ✓ | ✓ | – | – | ✓ | For Azure storage: 1 min; for the agent: on arrival |
| Linux audit trail | Linux | ✓ | – | – | – | – | Via syslog |
| Network / syslog | Various | ✓ | – | – | – | – | From Azure storage: 10 min; from the agent: on arrival |
| Other via custom logs | Various | ✓ | ✓ | – | – | – | 15 min |
| O365 | SaaS | n.a. | n.a. | n.a. | n.a. | n.a. | On arrival, available to be queried < 5 min |
| Other via Progel Security Log Gateway | Various | ✓ | ✓ | ✓ | – | – | For Azure storage: 10 min; for the agent: on arrival |
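The "Other via custom logs" row deserves a quick illustration. One way to push an arbitrary record into a workspace is the HTTP Data Collector API; the sketch below is mine, the signing details and api-version should be verified against the documentation, and the workspace ID and key placeholders must be replaced with the values from your workspace settings page.

```python
import base64
import datetime
import hashlib
import hmac
import json

import requests

# Hypothetical workspace credentials, both shown on the workspace settings page.
WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = "<primary-key-base64>"
LOG_TYPE = "MyCustomAudit"  # custom record type, stored with a _CL suffix

def build_signature(date: str, length: int) -> str:
    """SharedKey signature: HMAC-SHA256 over the canonical request string."""
    string_to_sign = f"POST\n{length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    digest = hmac.new(
        base64.b64decode(SHARED_KEY),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"

def post_events(events: list) -> None:
    """Send a batch of JSON records to the workspace ingestion endpoint."""
    body = json.dumps(events).encode("utf-8")
    date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    resp = requests.post(
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs"
        "?api-version=2016-04-01",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Log-Type": LOG_TYPE,
            "x-ms-date": date,
            "Authorization": build_signature(date, len(body)),
        },
    )
    resp.raise_for_status()

post_events([{"Source": "AS400", "User": "QSECOFR", "Action": "SIGNON"}])
```

Records posted this way should show up under the custom type MyCustomAudit_CL and can then be queried like any other data.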

 

This point isn’t intended as advertising, just a way to show how easy is to integrate any log into LA

The company I work for (Progel spa) developed the Progel Security Log Gateway back when we used SCOM ACS to implement security auditing. We have since adapted it to LA: it gets events from different sources and translates them into "security data", so that they travel on the near-real-time channel reserved for this type of data. We currently process:

  • Oracle 10g+
  • Exchange 2007+ on premises
  • SQL Server 2005+
  • Custom event-log transformation

To sum up

Microsoft Operations Management Suite Log Analytics is a cloud-based solution that can address your auditing needs (and much more). We're seeing tremendous interest in it in Italy, and it is definitely worth a look if your current solution has grown out of control, is slow, cannot guarantee data inalterability, or you've simply had enough of managing a service that is required but is not your core business.

-Daniele
This posting is provided “AS IS” with no warranties, and confers no rights
