DLP and the Current Financial Market

Enterprise-class DLP (Data Leakage Prevention) is entering its tenth year and has matured to the point where leaders in the space such as Vontu have been acquired by Symantec and Reconnex by McAfee. One reason these vendors were absorbed into larger companies is that they offer comprehensive solutions and command a premium price for guarantees of complete protection, from your laptop through to the great ethereal cloud.

Don’t believe the hype and don’t pay for it.

If you are new to data leakage prevention, outbound content compliance, or extrusion detection, you will be confronted with all sorts of new terminology and new technology. This blog entry is a casual warning to readers not to get caught up in the propaganda of these DLP vendors' claims. In this fragile economy it is time to take a closer look at a sensible and fiscally sound approach to securing your company's most important digital asset: email.

Email is the most important electronic business communication any company has. Web servers, FTP servers, print servers, and instant messaging, while important, never cause as much stress to an organization as an email outage does. Clearly the uptime of email servers is the most critical of all corporate communication systems. Within DLP, the monitoring of email, web traffic, P2P, FTP, and instant messaging is called data in motion. Leading DLP vendors connect to a span port on a switch and watch as a duplicate stream of data is sent through their collectors.

More advanced and much more expensive solutions that attempt enforcement using the Internet Content Adaptation Protocol (ICAP) have been available, but very few companies actually use this man-in-the-middle, proxy-based technology. In some cases these vendors have even built their own mail transfer agent (MTA) to monitor and enforce email policy. To date, the great majority of Fortune 1000 companies have chosen not to trust their email architecture to any of these vendors.

The process of monitoring TCP traffic is very complex. The collection and reassembly of packets for the purpose of locating unacceptable Internet usage, compliance violations, or the actual loss of company-sensitive material is the reason these appliance and software solutions are purchased. However, these sniffer-based solutions have two major flaws. Some companies use a span port on a switch for content monitoring; however, during high-traffic periods or during the inspection of large files, packets are sometimes dropped and therefore remain unmonitored. What is worse, there is no indication to reviewers of what kind of information, or how much, was lost. In addition, protocols like FTP and AJAX libraries make packet reassembly difficult and often times impossible to reconstruct. Even if messages are not dropped altogether or ignored, large files are only partially monitored: a seven-megabyte file may have only its first five megabytes scanned. This means that while you may think you are protected, you aren't.
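To make the partial-scanning flaw concrete, here is a minimal sketch (hypothetical code, not any vendor's implementation) of a scanner that caps inspection at a fixed byte window, as the sniffer-based appliances described above effectively do. Everything past the window goes uninspected:

```python
SCAN_WINDOW = 5 * 1024 * 1024  # only the first five megabytes are inspected

def scan_for_leak(payload: bytes, pattern: bytes) -> bool:
    """Return True if the sensitive pattern is found within the inspected window."""
    return pattern in payload[:SCAN_WINDOW]

# A seven-megabyte transfer with the sensitive marker past the five-megabyte mark:
payload = b"x" * (6 * 1024 * 1024) + b"CONFIDENTIAL" + b"x" * 1024
print(scan_for_leak(payload, b"CONFIDENTIAL"))  # False -- the leak goes undetected
```

The marker is present in the traffic, yet the scanner reports it clean: exactly the false sense of security the paragraph above describes.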

Instant messaging (IM) has its own issues to consider. The number one problem is that most vendors cannot reassemble an entire discussion between two users, which means that context is entirely lost. In addition, detection methods do not speak the language of IM, and the policies used to detect data leakage are literally not written in the right language. Do u know what I mean?

One last difficulty DLP vendors have is tying a specific user to a specific IP address. Network engineers know that DHCP and NAT make the correlation to specific end users very difficult. You can't fire or prosecute an IP address.
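The DHCP half of the correlation problem can be sketched in a few lines (the record layout and names here are illustrative, not any product's schema): the same address maps to different people over time, so without an accurate event timestamp and complete lease history, attribution fails.

```python
from datetime import datetime
from typing import Optional

# (ip, user, lease_start, lease_end) -- the same address is reused over time
leases = [
    ("10.0.0.42", "alice", datetime(2009, 3, 1, 9, 0),  datetime(2009, 3, 1, 12, 0)),
    ("10.0.0.42", "bob",   datetime(2009, 3, 1, 12, 0), datetime(2009, 3, 1, 18, 0)),
]

def user_for(ip: str, when: datetime) -> Optional[str]:
    """Map a detected event's IP address back to a user via the lease records."""
    for lease_ip, user, start, end in leases:
        if lease_ip == ip and start <= when < end:
            return user
    return None  # no lease record covers that moment: the event is unattributable

print(user_for("10.0.0.42", datetime(2009, 3, 1, 10, 30)))  # alice
print(user_for("10.0.0.42", datetime(2009, 3, 1, 14, 0)))   # bob
```

NAT makes this worse still: many users share one visible address simultaneously, so even a perfect timestamp cannot disambiguate them.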

All of these flaws in the inspection of content should make security and messaging teams consider where they should spend their tight budget dollars.

Consider the fact that email is already branded: it carries the intended recipients, and the sender is always known. Clearly the email domain is the most recognized electronic corporate asset. Using sniffer products to protect email is inadequate, which means that a single mail message, carrying your corporation's name and sensitive information, can easily be leaked. Fortunately, enterprise-class message processing of data in motion is not susceptible to packet loss or malformation, and because mail delivery can be temporarily delayed, large messages do not have to be delivered before a full content scan is completed.

Data-in-motion SMTP traffic is the easiest to monitor and control. Policies written to detect and control email messages are applied to the complete message, which is never dropped on the floor. Message processing engines like Sendmail's Sentrion are absolutely the best method for protecting your most valuable business communication asset. Leading DLP vendors are charging excessive license fees and making unsupported claims of comprehensive protection, yet they remain incapable of building enterprise-class email architectures and are susceptible to known but unmeasurable amounts of data leakage caused by the most common characteristic of enterprise-scale networks: high data volume.
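The store-and-forward advantage can be sketched as follows (policy markers and function names are illustrative, not Sentrion's actual API): because an MTA holds the message in its queue, the policy engine sees the entire body, not just the first packets, before deciding whether to deliver.

```python
POLICIES = [b"CONFIDENTIAL", b"SSN:"]  # hypothetical content-policy markers

def disposition(message: bytes) -> str:
    """Scan the complete queued message, then decide its delivery disposition."""
    for marker in POLICIES:
        if marker in message:  # the entire message is available: no window cap
            return "QUARANTINE"
    return "DELIVER"

# The same seven-megabyte message that slipped past a capped packet scanner:
msg = b"Subject: q3 numbers\r\n\r\n" + b"x" * (6 * 1024 * 1024) + b"CONFIDENTIAL"
print(disposition(msg))  # QUARANTINE -- the marker past five megabytes is still caught
```

Delivery can simply wait the extra seconds a full scan takes, which a passive sniffer watching live packets can never do.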

Span-port-based DLP and DLP-centric vendors attempting to manage and control email with passive monitoring, expensive proxies, or their own half-baked MTAs are guaranteed to fail in a true enterprise-class environment. Fortune 1000 companies are well aware of these limitations, and they know it is simply not worth the risk to rely on DLP vendors to deliver or protect their email.

It doesn't make sense to pay more to get less, or to pay too much for a false sense of security. Only enterprise-class message processing can fully protect your most important communication asset.

What do you think?  Let me know what your view is on DLP by leaving a comment.

This entry was posted in Daniel K. Hedrick, Uncategorized.

2 Responses to DLP and the Current Financial Market

  1. Andrew says:

    Your statement:
    ‘In addition, protocols like FTP and AJAX libraries make packet reassembly difficult and often times impossible to reconstruct.’

    is, quite frankly, wrong. Packet reassembly is a TCP concept with well-defined algorithms for how to do it.

    FTP is not difficult to monitor and AJAX makes very little difference as the underlying requests are all still just plain old HTTP.

    Most IM clients use the SIP protocol and again this is relatively easy to monitor.

  2. Daniel K. Hedrick says:

    Thank you for your pithy response.

    For clarification: monitoring the content is not difficult; however, for monitoring applications to deliver useful information, they must be able to reassemble these detected events for comprehensive managerial review. Below is my understanding of the technical limitations these protocols bring to DLP.

    Active FTP has a control session and a data session. This means that while each can be monitored, they may not be successfully aggregated. This difficulty is not necessarily a problem with DLP but rather one of the consequences of using FTP. For example, if a network uses NAT, the NAT gateway must be stateful: it needs to rewrite the IP address embedded in the control channel to the address assigned to the client, and if it does not perform these operations correctly, FTP fails. DLP applications likewise need to maintain state and recombine the content. This frequently does not happen, because unless all parts of the data stream are captured, the reviewer is left with only partial content.
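A small sketch of why the monitor must be stateful: in active FTP the data-connection endpoint is announced inside the control channel's PORT command (the h1,h2,h3,h4,p1,p2 format from RFC 959), so a monitor or NAT gateway has to parse the control session just to know which data session belongs to it. The function name here is illustrative.

```python
def parse_port_command(line: str):
    """Return the (ip, port) endpoint announced by an FTP PORT command."""
    args = line.strip().split(" ", 1)[1]
    h1, h2, h3, h4, p1, p2 = (int(x) for x in args.split(","))
    # RFC 959 encodes the port as two bytes: high * 256 + low
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2

ip, port = parse_port_command("PORT 192,168,1,5,78,52")
print(ip, port)  # 192.168.1.5 20020
```

A monitor that drops or misses this one control-channel line has no way to tie the subsequent data transfer back to the session, which is exactly the aggregation failure described above.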

    Ajax applications running in a browser communicate with the web server asynchronously and update only portions of the page. Applications that use Ajax techniques provide a rich, browser-based user experience; however, for a DLP system to replay or recombine all of the content into a meaningful state, it would have to capture content that is not considered sensitive or at risk, making page reconstruction for reviewers of detected content difficult.

    Lastly, within IM there must be a predetermined amount of time that a session can be left open. If a significant gap occurs (whether 50 seconds or 50 minutes), the "conversation" will be broken; the context of the conversation is lost, and with it the ability to monitor based on that context. Many DLP vendors are limited to monitoring only one message at a time. One interesting solution is to collate all messages between two users over a predetermined amount of time and then send the content in its entirety to the Message Processing Engine for full-context analysis (we currently have a solution with Facetime that does exactly this). The only shortfall is that there is no ability to block or stop the communication.
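The collation approach can be sketched in a few lines (names and the timeout value are illustrative, not the actual Facetime integration): messages between a pair of users are grouped into one conversation as long as the silence between them stays under a timeout, and each complete transcript is then handed off for full-context analysis.

```python
SESSION_TIMEOUT = 300  # seconds of silence before a conversation is closed

def collate(messages):
    """messages: time-ordered list of (timestamp_seconds, sender, text) tuples.
    Returns a list of conversations, each a list of 'sender: text' lines."""
    conversations, current, last_ts = [], [], None
    for ts, sender, text in messages:
        if last_ts is not None and ts - last_ts > SESSION_TIMEOUT:
            conversations.append(current)  # gap too long: close the conversation
            current = []
        current.append(f"{sender}: {text}")
        last_ts = ts
    if current:
        conversations.append(current)
    return conversations

chat = [(0, "u1", "got the merger docs?"), (40, "u2", "sending now"),
        (1000, "u1", "thanks")]  # the 960-second gap starts a new conversation
print(collate(chat))
```

Because analysis runs only after a transcript closes, this approach is detection after the fact, which is the blocking shortfall noted above.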

    The purpose of the post was to illustrate that the difficulties encountered with these other protocols are not present in SMTP; therefore, whatever funds you are prepared to spend to protect your brand and reputation should be spent on email.
