Hadoop and big data are no longer buzzwords in large enterprises. Whether for the right reasons or not, enterprise data warehouses are moving to Hadoop, and petabytes of data are moving with them. How do you ensure that big data in Hadoop does not become a big problem, or a big target? Vendors pitch their technologies as the magical silver bullet. But did you realize that some controls depend on how many map slots are available in the production cluster? What about the structure of the data being loaded? How much overhead do decryption operations add? If you tokenize data, how do you distinguish tokenized values from original production data? In certain ways, however, Hadoop and big data represent a greenfield opportunity for security practitioners: a chance to get ahead of the curve and to test and deploy your tools, processes, patterns, and techniques before big data becomes a big problem.
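As a sketch of the tokenization question above: one common way to make tokens self-distinguishing is to issue format-preserving tokens that deliberately fail the Luhn checksum that real card numbers must pass. The Python below is a minimal, hypothetical illustration, assuming 16-digit card numbers; the luhn_valid and make_token names are ours, the token vault that maps tokens back to values is omitted, and this is not necessarily the approach covered in the session.

    import random

    def luhn_valid(number: str) -> bool:
        """Return True if the digit string passes the Luhn checksum."""
        digits = [int(d) for d in number]
        # Double every second digit from the right; subtract 9 from results > 9.
        for i in range(len(digits) - 2, -1, -2):
            digits[i] *= 2
            if digits[i] > 9:
                digits[i] -= 9
        return sum(digits) % 10 == 0

    def make_token(pan: str, rng: random.Random) -> str:
        """Generate a random, format-preserving token for a PAN that
        deliberately fails the Luhn check, so tokens and real card
        numbers can be told apart without consulting a token vault."""
        while True:
            token = "".join(str(rng.randrange(10)) for _ in range(len(pan)))
            if not luhn_valid(token):
                return token

    rng = random.Random(42)
    pan = "4539578763621486"                   # sample Luhn-valid number
    token = make_token(pan, rng)
    print(luhn_valid(pan), luhn_valid(token))  # True False

Because any digit string failing the checksum cannot be a genuine card number, a scanner can flag residual live data in a "tokenized" dataset with a single local check per value.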
Come join this session, where we walk through the control frameworks we built and share what we discovered, reinvented, polished, and developed to support data security, compliance, cryptographic protection, and effective risk management for sensitive data.