
Dataguise Presents 10 Best Practices for Securing Sensitive Data in Hadoop

Release: 2016-06-07 16:30:02

Dataguise Presents 10 Best Practices for Securing Sensitive Data in Hadoop:
  1. Start early! Determine the data privacy protection strategy during the planning phase of a deployment, preferably before moving any data into Hadoop. This will prevent the possibility of damaging compliance exposure for the company and avoid unpredictability in the rollout schedule.

  2. Identify what data elements are defined as sensitive within your organization. Consider company privacy policies, pertinent industry regulations and governmental regulations.

  3. Discover whether sensitive data is embedded in the environment, assembled or will be assembled in Hadoop.

  4. Determine the compliance exposure risk based on the information collected.

  5. Determine whether business analytic needs require access to real data or if desensitized data can be used. Then, choose the right remediation technique (masking or encryption). If in doubt, remember that masking provides the most secure remediation while encryption provides the most flexibility, should future needs evolve.

  6. Ensure the data protection solutions under consideration support both masking and encryption remediation techniques, especially if the goal is to keep both masked and unmasked versions of sensitive data in separate Hadoop directories.

  7. Ensure the data protection technology used implements consistent masking across all data files (Joe becomes Dave in all files) to preserve the accuracy of data analysis across every data aggregation dimension.

  8. Determine whether tailored protection for specific data sets is required, and consider dividing Hadoop directories into smaller groups where security can be managed as a unit.

  9. Ensure the selected encryption solution interoperates with the company’s access control technology and that both allow users with different credentials to have the appropriate, selective access to data in the Hadoop cluster.

  10. Ensure that when encryption is required, the proper technology (Java, Pig, etc.) is deployed to allow for seamless decryption and ensure expedited access to data.
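Steps 2 and 3 amount to defining what "sensitive" means for your organization and then scanning for it before data lands in Hadoop. A minimal sketch of that discovery pass is shown below; the patterns and the `discover` helper are illustrative assumptions for this post, not part of any Dataguise product, and a real policy would come from your privacy and regulatory requirements (PCI, HIPAA, and so on).

```python
import re

# Illustrative patterns only: each organization defines its own
# catalog of sensitive elements from its privacy policies and
# applicable regulations (step 2 of the checklist).
SENSITIVE_PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def discover(record: str) -> set:
    """Return the sensitive element types found in one text record
    (step 3: discover whether sensitive data is embedded in the
    environment before it is assembled in Hadoop)."""
    return {name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(record)}

# Flag records before loading them into the cluster:
print(sorted(discover("Joe, 123-45-6789, joe@example.com")))
```

Running the scan up front, rather than after ingestion, is what lets step 4's compliance-risk assessment be grounded in real inventory rather than guesswork.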
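Step 7's requirement that "Joe becomes Dave in all files" is usually met with deterministic masking: the same cleartext always maps to the same pseudonym, so joins and aggregations across files stay accurate. A minimal sketch using a keyed HMAC follows; the secret key and the token format are assumptions made for the example, and in practice the key would live in a managed key store.

```python
import hmac
import hashlib

# Assumed for the example only; a production deployment would fetch
# this from a key-management system, never hard-code it.
SECRET_KEY = b"replace-with-a-managed-secret"

def mask_value(value: str) -> str:
    """Return a stable pseudonym for `value`. Because the HMAC is
    deterministic for a fixed key, the same input yields the same
    masked token in every file, preserving cross-file analytics."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return "MASKED_" + digest.hexdigest()[:12]

# "Joe" yields one consistent token everywhere it appears:
assert mask_value("Joe") == mask_value("Joe")
assert mask_value("Joe") != mask_value("Dave")
```

Note the trade-off the checklist's step 5 alludes to: a keyed hash like this is one-way (secure, but irreversible), whereas encryption keeps the option of recovering the original values if future needs evolve.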

Wait… where’s point 11: buy Dataguise?

Original title and link: Dataguise Presents 10 Best Practices for Securing Sensitive Data in Hadoop (NoSQL database · myNoSQL)
