Tuesday, April 22, 2014

AppSensor and MVC: Unexpected HTTP Commands

So, I have decided to try to implement some of the OWASP AppSensor controls inside an MVC project.  I will go into detail about the end goal of this project in another post, but for now I am going to talk about the Unexpected HTTP Commands detection point.

This is a first stab.  Currently the only response is to log the request.

In MVC, the current set of HTTP method controls is implemented as ActionMethodSelectorAttributes.  What this means is that you apply the control on a method-by-method basis.  This approach will probably be good when implementing other controls, but for this one we need something more global.

This article has a really good explanation of action filters and how to use them.  The filter itself is quite simple, since all we are doing is comparing the reported HTTP verb to a list of expected ones.

        public void OnActionExecuting(ActionExecutingContext filterContext)
        {
            if (!Enabled)
                return;

            var httpMethod = filterContext.HttpContext.Request.GetHttpMethodOverride();

            // AcceptedVerbs is assumed to be the list of expected verbs
            // configured on this attribute
            if (AcceptedVerbs.Any(
                    x => x.Equals(httpMethod, StringComparison.OrdinalIgnoreCase)))
                return;

            // The verb is untrusted input, so sanitize it before logging
            httpMethod = SanitizeHttpMethod(httpMethod);
            // LogViolation stands in for however you record the detection point
            LogViolation(httpMethod);

            filterContext.Result = new HttpNotFoundResult();
        }

A couple of notes.

1)  filterContext.HttpContext.Request.GetHttpMethodOverride() returns the HTTP method as listed in the request.  Remember that this is untrusted input and should be sanitized before writing to the logs.

2)  Verbs can actually be restricted earlier in the pipeline.  Firstly, you can use Request Filtering to limit the accepted verbs.  This method works, but doesn't allow the application to handle the request, which limits our ability to build in the response mechanisms.  Secondly, in the handler definition you can define the list of verbs to respond to.  If you try to use a verb that isn't on this list, you will get an HTTP 405 error.  Both these mechanisms are great (and log to the IIS logs) but they don't allow our application to handle and keep track of violations.  To bypass the handler restriction, you can use * in the accepted verbs list.
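For reference, the Request Filtering approach looks roughly like this in web.config (the element names come from IIS request filtering; the particular list of allowed verbs is just an example):

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- Rejects unlisted verbs before the application ever sees the request -->
      <verbs allowUnlisted="false">
        <add verb="GET" allowed="true" />
        <add verb="POST" allowed="true" />
      </verbs>
    </requestFiltering>
  </security>
</system.webServer>
```

By contrast, leaving verb="*" on the handler definition passes every verb through to the application, which is what lets the action filter observe and respond.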

The code is located on GitHub.

Tuesday, April 15, 2014

Some thoughts on Secure File Uploads (MVC / C#)

Recently, I was asked to take a look at a solution that involves receiving file uploads via an MVC application, with the files later processed by an internal system.  MVC already has built-in functionality to receive files, so that was no problem.  But what about the security aspects of doing something like this?  Before I dive into some of the things you can do to secure your file transfer, I want to touch on a couple of things.

1)  Verifying files and file types is hard!  There is a project that tries to keep track of the magic numbers for each file type.  A quick scroll through it and you will notice that, in some cases, multiple extensions use the same magic numbers.  In other cases, different versions of an application use different file types and magic numbers.  It is a mess!

2)  The verification that can be done in the application often isn't enough.  We have to remember that we cannot rely on the MVC application to solve all problems.  Even the server-side checks that can be implemented are not foolproof.  Furthermore, they only verify that a file is of the type it claims to be (or that you are looking for).  One would still have the problem of ensuring the file is actually "safe" for consumption by the target application.  At the outset, this seems more of a problem for IDS/IPS/virus scanning than for the MVC application.

3)  This type of functionality should go through some sort of security review well before it reaches the development team.  The risk of uploading files should be pointed out, as well as all the compensating controls in place to help mitigate that risk.  For example, is this page authenticated?  Is there sufficient logging?  Is there an IDS/IPS in the network?  Do the files get virus scanned before being consumed?  Can the files be "detonated" on a segregated machine before being given to end users?

As usual, OWASP has a great write-up on this.  They cover a lot of the key points you would want to look at when building your application.

First things first: you have to build your form/app to receive files.  This is easily done.  MVC 4 has the concept of HttpPostedFileBase, which is the type the uploaded file arrives as.  A couple of notes: you must make sure that the name of your input field matches the name of the parameter to your controller method, and you must set the enctype of the form to multipart/form-data.  See this post for more info.
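As a sketch, the form side would look something like this (the controller and action names here are made up for the example); note the enctype and that the input's name matches the controller parameter:

```html
@using (Html.BeginForm("Upload", "Files", FormMethod.Post,
                       new { enctype = "multipart/form-data" }))
{
    <input type="file" name="file" />
    <input type="submit" value="Upload" />
}
```

The receiving action would then be declared as public ActionResult Upload(HttpPostedFileBase file).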

One last thing: the checks below make a lot of assumptions about the types of files you want to accept.  You will most probably tailor them to your specific case.  In my case, I am focusing on docx and xlsx files.

Check 1:  Is a file actually there?
Basically this is the equivalent of checking ModelState.IsValid.  You want to make sure you are actually processing a file.

  if (file == null)
    return View("Error");

Check 2:  File Size
One of the first checks you are going to want to do is around file size.  You can use the request limits feature of request filtering to limit the maximum size you would like to handle.  A couple of notes: this setting is global, not page specific.  Further, you cannot set a "minimum" size.  You could check for file size in code (just call Length on the input stream), and this might be a valid check in your case.
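The request limit itself is set via Request Filtering in web.config; the element names are from IIS, and the 10 MB figure is just an example:

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes; 10485760 = 10 MB -->
      <requestLimits maxAllowedContentLength="10485760" />
    </requestFiltering>
  </security>
</system.webServer>
```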

Check 3:  File Name
We have to remember that the reported file name and content type are passed directly from the HTTP request.  This means they are untrusted client-side input and must be validated.  You may want to run a series of checks, but the basics would be the length of the file name and a regular expression match.  Here you will want to make some assumptions about the types of files/file names you will be getting.

  if (file.FileName.Length > 255)
    return View("Error");

  // Anchored at both ends so the whole name must match the pattern
  var regex = new Regex(@"^[\w\- ]+\.\w{1,5}$");
  if (!regex.IsMatch(file.FileName))
    return View("Error");

  var acceptedExtensions = new List<string> { ".docx", ".xlsx" };
  var fileExtension = Path.GetExtension(file.FileName);
  if (acceptedExtensions.All(
        x => !x.Equals(fileExtension, StringComparison.OrdinalIgnoreCase)))
    return View("Error");

In my case, I am looking specifically for docx and xlsx files.  So I can take those assumptions and work with the code.
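For what it's worth, the same checks can be bundled into a small helper.  This is a sketch under the docx/xlsx assumptions above; the class and method names are mine:

```csharp
using System;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;

static class FileNameChecks
{
    // Anchored pattern: word characters, hyphens, and spaces, then one extension
    static readonly Regex NamePattern = new Regex(@"^[\w\- ]+\.\w{1,5}$");

    // Assumption: we only want Open XML documents
    static readonly string[] AcceptedExtensions = { ".docx", ".xlsx" };

    public static bool IsAcceptedFileName(string fileName)
    {
        if (string.IsNullOrEmpty(fileName) || fileName.Length > 255)
            return false;
        if (!NamePattern.IsMatch(fileName))
            return false;
        var extension = Path.GetExtension(fileName);
        return AcceptedExtensions.Any(
            x => x.Equals(extension, StringComparison.OrdinalIgnoreCase));
    }
}
```

The controller action then reduces to a single call, which is also easier to unit test than inline checks.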

Check 4:  Content Type
Generally, the content type is generated from the extension present in the file name.  This behavior varies between browsers and potentially across OS platforms.  Nevertheless, it is information we can look at as a first check.  In the code below, I just compare it to a white-list.  You could take this further by keeping a mapping of extension to accepted content type.

            var acceptedContentTypes = new List<string>
            {
                "application/vnd.openxmlformats-officedocument.wordprocessingml.document", // docx
                "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"        // xlsx
            };
            if (acceptedContentTypes.All(x => !x.Equals(file.ContentType)))
                return View("Error");
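As suggested above, you could go further by keeping a mapping of extension to expected content type.  A minimal sketch; the dictionary values are the standard Open XML content types, and everything else is made up for the example:

```csharp
using System;
using System.Collections.Generic;

static class ContentTypeMap
{
    // Assumed mapping for the docx/xlsx case; extend as needed
    static readonly Dictionary<string, string> ExpectedContentTypes =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { ".docx", "application/vnd.openxmlformats-officedocument.wordprocessingml.document" },
            { ".xlsx", "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet" }
        };

    public static bool Matches(string extension, string contentType)
    {
        string expected;
        return ExpectedContentTypes.TryGetValue(extension, out expected)
            && expected.Equals(contentType, StringComparison.OrdinalIgnoreCase);
    }
}
```

This ties the reported content type to the specific extension instead of accepting any white-listed pairing.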

Check 5:  FindMimeFromData Call
Windows has a built-in call that does a static check against 26 different MIME types.  The problem with this is that you have to use interop.  This is the same check that is done in Internet Explorer, but the benefit is that, server side, you control the parameters the function is called with.

There are a few steps here to using this.  First you have to add the call definition to your code.

        [DllImport(@"urlmon.dll", CharSet = CharSet.Auto)]
        private extern static System.UInt32 FindMimeFromData(
            System.UInt32 pBC,
            [MarshalAs(UnmanagedType.LPStr)] System.String pwzUrl,
            [MarshalAs(UnmanagedType.LPArray)] byte[] pBuffer,
            System.UInt32 cbSize,
            [MarshalAs(UnmanagedType.LPStr)] System.String pwzMimeProposed,
            System.UInt32 dwMimeFlags,
            out System.IntPtr ppwzMimeOut, // a pointer, so declare it IntPtr (safe in 64-bit too)
            System.UInt32 dwReserved);

Next you have to get the header and call the function.  This is a rough idea of what you would do.

            var buf = new byte[256];
            file.InputStream.Seek(0, SeekOrigin.Begin);
            if (file.InputStream.Length >= 256)
                file.InputStream.Read(buf, 0, 256);
            else
                file.InputStream.Read(buf, 0, (int)file.InputStream.Length);

            System.IntPtr mimeTypePointer;
            var result = (int)FindMimeFromData(0, null, buf, 256, null, 0, out mimeTypePointer, 0);
            if (result != 0)
                return View("Error");
            var mimeType = Marshal.PtrToStringUni(mimeTypePointer);
            Marshal.FreeCoTaskMem(mimeTypePointer); // the string is allocated by urlmon

            if (mimeType == null || !mimeType.Equals("application/x-zip-compressed"))
                return View("Error");
In the case of the Open XML formats, the file that actually gets sent is a zip file, so you can look for that.  There are some alternatives to calling out to the interop.  You could figure out the header bytes for a zip file and code the check manually.  See mimedetector or filetypedetective.
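If you do want to skip the interop, the manual header check is only a few lines.  This is a sketch; zip local file headers start with the bytes 0x50 0x4B 0x03 0x04 ("PK\x03\x04"), which is fine for docx/xlsx since they always contain entries (a completely empty zip starts with PK\x05\x06 instead):

```csharp
using System.IO;

static class ZipSniffer
{
    // Checks for the zip local file header magic number: "PK\x03\x04"
    public static bool LooksLikeZip(Stream stream)
    {
        var header = new byte[4];
        stream.Seek(0, SeekOrigin.Begin);
        if (stream.Read(header, 0, 4) < 4)
            return false; // too short to be a zip at all
        return header[0] == 0x50 && header[1] == 0x4B &&
               header[2] == 0x03 && header[3] == 0x04;
    }
}
```

Like FindMimeFromData, this only tells you the file looks like a zip, nothing more.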

So at this point, let us recap.  We have done some basic file name checks.  We have also done some basic content type checks, but this data is built from the file name, so it can't be trusted.  We have run a server-side check and, from the header, we can determine that it is in fact a zip file.  Pretty weak!

At this point, we could now rip open the zip file and start to have a look at what is inside.

Check 6:  Zip Verification

The code would look something like this.

            var isValid = false;
            using (var archive = new ZipArchive(file.InputStream, ZipArchiveMode.Read))
            {
                foreach (var entry in archive.Entries)
                {
                    if (!entry.Name.Equals("[Content_Types].xml"))
                        continue;

                    using (var contentTypeFile = entry.Open())
                    {
                        var xElement = XElement.Load(contentTypeFile);
                        foreach (var element in xElement.Elements())
                        {
                            // docx parts live under /word/, xlsx parts under /xl/
                            if (element.FirstAttribute.Value.StartsWith("/word/") ||
                                element.FirstAttribute.Value.StartsWith("/xl/"))
                                isValid = true;
                        }
                    }
                }
            }

            if (!isValid)
                return View("Error");
In .NET 4.5, they finally introduced the ZipFile/ZipArchive functionality.  You can use it to open a stream and check what is inside.  All I am doing in the code above is looking for a specific file and checking that file for an attribute that I know needs to be there (as per the standard).  If you think about it, all I am really verifying is that I have received a zip file (that I can open) and that it has a file with some key words in it.  Once again, pretty weak stuff!


Where do you stop? It really depends on the needs of your application.  There is no good way to handle this.  We can start to decrease the risk by adding all of these extra checks, but we can't mitigate it completely.  The entire solution needs to take file upload into consideration.

Tuesday, April 8, 2014

CCSK Certified!

After dragging my feet for a long time, I finally decided to write the CCSK.

It was a pretty comprehensive exam, but if you go through the two source documents, you should be fine.  I finished the exam in about 30 minutes and had about an hour to go over the questions again.  In my initial pass I skipped about 10 questions.

The biggest problem I found was the wording of some of the questions.  I found that reading out loud helped a lot, as there are some subtle things to watch out for.

In any event, passed with 88%!  Time to take the rest of the week off!

Thursday, April 3, 2014

CCSK Study: Domain 13 - Virtualization

  • Benefits of virtualization are well known but this also brings up various security concerns
  • VM Guest Hardening
    • Guests should still have firewall, HIPS, web app protection, antivirus, FIM, and log monitoring
    • Can be delivered on a guest-by-guest basis or via hypervisor-based APIs
  • Hypervisor Security
    • Hypervisor should be hardened based on best practices
    • Also consider physical security
  • Inter-VM Attacks
    • VM Communication can occur on the backplane and thus traditional network security is blind to that communication
    • Various solutions (in-line appliances on the vswitch, for example)
    • Also need to consider VM migration and how to keep track of traffic/flow
  • Performance Concerns
    • The resource cost of putting protection mechanism on each guest is great
    • Consider options at the hypervisor level
  • Operational Complexity from VM Sprawl
    • large attack surface due to many requests for VM and poor management of VMs once created
  • Instant-On Gaps
    • A VM can be secured and then turned off.  While turned off it goes out of date (say, with security patches).
    • How do you deal with this VM when it is booted back up?
      • Could use NAC to prevent network access until patches are up-to-date
  • VM Encryption
    • images are vulnerable to theft or modification
    • images could be encrypted all the time
      • performance impact
    • Use DLP tools to prevent exfiltration of images
  • Data Commingling
    • mixed-mode deployment (vms with different security classes hosted together)
    • need to use VLANs / firewalls / etc to ensure proper isolation
  • VM Data Destruction
    • Need to zero disks after migration
  • VM Image Tampering
    • pre-configured templates may not be what you think they are
  • In-Motion VM
    • how do you audit/track vm's in motion?  What if they cross jurisdictional boundaries?
  • Recommendations
    • identify the type of virtualization in use at the CSP
    • try to implement a zoned approach
    • secure each OS via guest-tools or API based tools
    • encrypt images when not in use
    • use secure baselines / hardening practices
    • ensure security assessment tools take into account virtualization
    • employ virtual patching techniques to protect VMs on boot-up / migration

This chapter is short and sweet.  The security concerns with virtualization are vast; unfortunately, the solutions have not kept pace.  Any solution at this point would be a custom one involving multiple different technologies.  Even then, it will take some time for existing providers to update their software to reliably account for virtualization technologies.  Compound this with the sheer network speed of a backplane: capturing and analyzing all this traffic is a large feat.

As a client of a service provider, my recommendation would be to make a list of all the security tools you require.  Try, as much as possible, to push the cost of those recommendations onto the provider.  You probably won't like the cost of mandatory antivirus when you are paying by the IOP to run it.

CCSK Study: Domain 12 - Identity, Entitlement, & Access Management

  • Concepts behind IdEA require fundamental changes when moving to cloud solutions
  • Traditional deployments of servers with a standard directory service do not scale well or provide the necessary features
  • Key Terms
    • Identity
      • The means by which an Entity can consistently and comprehensively be identified as unique
    • Identifier
      • the means to cryptographically assert an identity
    • Entity
      • user, code, org, agent, etc
    • Entitlement
      • the process of mapping privileges to identities and their attributes
    • Reduced Sign-On
      • account sync tool/process to minimize the number of credentials a user must remember
    • Single Sign-On
      • The ability to pass identity and attributes using secure standards such as SAML and OAuth
    • Federation
      • The connection of one identity repository to another
    • Persona
      • identity plus related environmental attributes
      • IE:  Fred Smith as CEO of ACME Corp
    • Attributes
      • the facets of an identity
  • Introduction to Identity in a Cloud Environment
    • Problem similar to moving from a small town where "everyone knows your name" to a large city
    • Key points to consider
      • Risk calculations need to keep in mind the strength of an identity
        • difference between self-assert vs CA for example
      • Identity and attributes may now need to come from multiple sources
      • There may be instances where a transient identity is sufficient
      • There may be instances where pseudo-anonymity is desired (IE: Voting)
  • Identity Architecture for the Cloud
    • Cloud solutions should be able to consume assertions about identity and attributes from multiple sources
    • Basic flow
      • Business rules are translated into a set of entitlement rules
      • An authorization engine takes assertions from multiple sources of identity and attributes and compares them to the entitlement rules
        • Also factors in the "strength" of the assertion
      • Authorization is granted and passed to an access management system
      • Access management system governs access to key areas of the solution
        • Network Layer
          • One may not even be able to "see"(ping) a cloud service
        • System Layer
          • Entitlement rules may define the medium by which a system is accessed.  For example, RDP vs app only, etc
        • Application Layer
          • Functions of the application may be scaled based on the entitlement rules
        • Process Layer
          • Processes within the application can be scaled
          • Some processes may require additional verification as specified by the entitlement rules
        • Data Layer
          • entitlement rules may limit access to subsets of data
          • IE: Row level security
  • Identity Federation
    •  Defined as an interconnection of disparate Directory Services
    • Technology should rest on open standards such as SAML
    • Already in use at the SaaS level, but no standards exist for PaaS and IaaS
    • Special considerations for the lifecycle of identities
      • Think shared accounts, named accounts, privileged accounts
    • Existing systems for identity management need to be extended to support the cloud
  • Provisioning and Governance of Identity and Attributes
    • More than just user provisioning
      • Need an appropriate set of attributes for rich risk-based decision making
      • Examples
        • Org Identity
        • Location assertions
        • Training record / compliance
        • etc
    • Ideally, cloud service/application should not be the master source for identity
      • Some exceptions apply (Identity-as-a-Service)
    • Attributes need to be linked to identity (somewhere in the chain)
  • The Entitlement Process
    • As per Basic Flow Above
      • business requirements are used to determine the identities and attributes required to properly evaluate rules
      • consider all layers in access management
      • process does not stop with a launch of a cloud service.  Should consider the full identity lifecycle and periodic review
    • When identities and attributes are sourced from outside the business's control, need to consider on-boarding and off-boarding of those orgs
    • Considerations for enforcement points
      • central/external PEP
      • Embedded as part of the cloud app
      • using IDaaS
  • Authorization and Access Management
    • Authorization layer is likely a PDP (policy decision point)
    • Access management is the PEP layer
    • Should use open standards such as XACML to share authorization information
    • There are advantages to having the PDP implemented outside the cloud solution (and maybe within the traditional perimeter)
      • logging
      • central management
      • access to internal resources (DS, etc)
  • Architectures for Interfacing to Identity and Attribute Providers
    • Hub and Spoke
      • Good central control
      • Can be placed on-prem
      • abstracts the end-point services from the cloud provider (allowing control to dynamically add/change providers)
      • Gives the org a high degree of control
      • Hub can act as a cache to provide redundancy
    • Free-Form
      • easier to setup (from a cloud perspective)
      • makes it harder to provision a user and related attributes
      • makes for a more ad-hoc environment
      • hard to keep multiple cloud providers in sync
    • Hybrid
      • Combination of the above
      • Can lead to overly complex architectures
  • Level of Trust with Identity and Attributes
    • identities and attributes come with varied levels of trust
    • traditionally, orgs have maintained their own identities and attributes 
      • could potentially include many people who do not work for the org itself
    • during the entitlement phase it is important to understand the attributes required but also the trust level associated with that attribute
      • also a complex relationship between the strength of an attribute and the strength of the identity asserting them
  • Provisioning of Accounts on Cloud Systems
    • No widely used standard
      • SPML - Service Provisioning Markup Language
      • SCIM - Simple Cloud Identity Management
    • push model may not suffice
      • hard to maintain, non-standard, etc
    • need to consider the whole life cycle of an identity
      • disable / decommission
      • Also need to consider associated data
    • What about service accounts, etc, or identities for entities other than people
  • Identity as a Service
    • Will be covered in Domain 14
  • Compliance and Audit
    • PEP and PDP systems need to be configured to log according to regulatory requirements.
    • Audit trails generally need to be tied to an identity
  • Application Design for Identities
    • bolt-on to domain-10
    • Design goal should be to minimize the need for identity and attributes
    • Start from the principle that identification is not required
      • balance between on-the-fly account creation and a permanently identified account
    • Consider attribute derivation
      • Don't really need DOB, but need to know if entity is older than 18, for example
    • What unique identifier will be used? (external vs internal) and management thereof
  • Identity and Data Protection
    • make sure all applicable laws are being adhered to if PII or SPI are stored as part of a cloud application
    • Even if it is in the form of attributes on an identity
  • Consumerization and the Identity Challenge
    • Identity is hard with end-users and their devices
    • No easy way to have a user/device enrolled in a strong authentication system
    • Standards are fairly fragmented
  • Identity Service Providers
    • There is a fundamental level of trust issue when trusting other identity providers
    • Most established identity services only deal with user verification
      • Personal information is self-asserted
  • Recommendations
    • Read this, there is much info here
    • Federation Recommendations
      • Use open standards
      • understand the trustworthiness of the federated party and what they verify (attributes, etc)
    • Provisioning and Governance Recommendations
      • all attributes should be sourced as close to the master as possible
      • Cloud service/app should not be the master
      • attributes should always be linked to an identity
      • consider the life cycle of identities and attributes
    • Entitlement Recommendations
      • consider full life cycle
      • All parties should be identified and rules should be clearly defined
      • Entitlement rules should follow the principle of least privilege
    • Authorization and Access Recommendations
      • Use open standards (XACML, OASIS)
      • Consider logging for audit/compliance
    • Architecture Requirements
      • Cloud providers should have PDP/PEP that is configurable
      • use standard protocols
      • use OATH-compliant authentication
    • Provisioning Recommendations
      • Use open standards (SPML, SCIM)
      • do not limit process to users, should include all entities
      • maintenance should be done on both identity and attributes
    • Identity Compliance and Audit Recommendations
      • ensure that logs from PEP/PDP are available 
      • log should include inputs to decision as well as the outcome
      • logs containing PII or SPI will be subject to data protection legislation
    • Application Design Recommendations
      • minimize need for Identity and Attributes in application design
      • cloud app should not be master for data
      • support SSO such as SAML and OAuth
      • Mutual authentication is critical
        • Cloud service should authenticate user
        • User should authenticate cloud service
    • Data Protection Recommendations
      • See domain 11 for more
      • Reduce the need to store PII or SPI data
      • Develop a process to deal with "Subject Access Requests"
    • Identity Implementation Recommendations
      • favor identity reuse rather than new enrolment
      • use Risk-Based authentication techniques


This was a pretty interesting domain, one that I think generally gets glossed over.  Identity, and the attributes surrounding it, is the premier "user input" of our day and age.  Maybe it always was, but until you start thinking about the controls in place around this sort of thing, you don't really understand the level of "faith" that can be placed in it.

Take, for example, employee on-boarding.  Potential employees send out resumes.  If you think about it, they can (and often do) write whatever they want on them.  All the information on a resume should be treated as "self-asserted" because there is no confirmation.  This comes down to even name, address, etc., and scales up to things like past experience.  Now, you could take a step toward verifying the name and address by asking for valid government-issued photo ID.  This step allows you not only to validate identity (via the photo) but also some of the attributes (ie: name, age, address).  Validating identity becomes interesting in this case because all we have to rely on is a photo, and we assume that a photo is "generally unique".  I guess you could take this one further and look at the person who is validating the photo.  Is that person trained in this sort of thing?  Is the photo in color?  Is the photo recent enough?  I might put more trust (increase the strength of the identity) in a border security guard validating against a color passport photo than an office worker validating against a black and white photo on a plastic card.  There are obviously other measures that can be taken here, but how many offices take them?

This goes even further, to other important areas such as work history.  Unless you call to confirm employment, you only have the self-asserted data to go on.  I know some companies do perform checks, but most don't.  This gets even more bizarre when we get to identities on the internet.  Take, for example, LinkedIn.  Anyone can add experience to their profile.  In fact, they can even select the company they worked for from a list and have the company's logo displayed.  Does the company get a chance to review the people who claim to work for them?  I wonder....

After the employee is hired, does anyone check the attributes then?  I bet not.  Companies probably expect honesty and rely on the policy that action will be taken against people who do not tell the truth.  Now imagine being a third party that is trying to extend trust to employees of this company.  Wow, mind = blown.

Back to the course content: it will be interesting to watch the developments in this area, especially around the management of cloud identities.  This seems to be a place where traditional IT is rapidly falling down.  Further, there are a lot of "rogue" business departments provisioning their own cloud resources, independent of IT.  The lack of standards in this area is unfortunate, but hopefully we can move in the right direction.