
Dataguise Presents 10 Best Practices for Securing

Published: 2016-06-07 16:30:02


Dataguise Presents 10 Best Practices for Securing Sensitive Data in Hadoop:
  1. Start early! Determine the data privacy protection strategy during the planning phase of a deployment, preferably before moving any data into Hadoop. This will prevent the possibility of damaging compliance exposure for the company and avoid unpredictability in the roll-out schedule.

  2. Identify what data elements are defined as sensitive within your organization. Consider company privacy policies, pertinent industry regulations and governmental regulations.

  3. Discover whether sensitive data is embedded in the environment, assembled or will be assembled in Hadoop.

  4. Determine the compliance exposure risk based on the information collected.

  5. Determine whether business analytic needs require access to real data or if desensitized data can be used. Then, choose the right remediation technique (masking or encryption). If in doubt, remember that masking provides the most secure remediation while encryption provides the most flexibility, should future needs evolve.

  6. Ensure the data protection solutions under consideration support both masking and encryption remediation techniques, especially if the goal is to keep both masked and unmasked versions of sensitive data in separate Hadoop directories.

  7. Ensure the data protection technology used implements consistent masking across all data files (Joe becomes Dave in all files) to preserve the accuracy of data analysis across all data aggregation dimensions.

  8. Determine whether a tailored protection for specific data sets is required and consider dividing Hadoop directories into smaller groups where security can be managed as a unit.

  9. Ensure the selected encryption solution interoperates with the company’s access control technology and that both allow users with different credentials to have the appropriate, selective access to data in the Hadoop cluster.

  10. Ensure that when encryption is required, the proper technology (Java, Pig, etc.) is deployed to allow for seamless decryption and ensure expedited access to data.
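The consistent-masking requirement in point 7 (the same real value always maps to the same pseudonym, in every file) can be sketched with a keyed hash. This is a minimal illustration, not Dataguise's implementation; the pseudonym pool and secret key below are placeholders, and a small pool like this would allow different names to share a pseudonym.

```python
import hmac
import hashlib

# Hypothetical pseudonym pool and key; a real deployment would use a
# much larger pool and a securely stored per-deployment secret.
PSEUDONYMS = ["Dave", "Alice", "Carol", "Frank", "Grace", "Heidi"]
SECRET_KEY = b"example-masking-key"

def mask_name(name: str, key: bytes = SECRET_KEY) -> str:
    """Deterministically map a real name to a pseudonym via keyed HMAC."""
    digest = hmac.new(key, name.encode("utf-8"), hashlib.sha256).digest()
    index = int.from_bytes(digest[:8], "big") % len(PSEUDONYMS)
    return PSEUDONYMS[index]

# Because masking is deterministic under the same key, joins and
# aggregations across separate masked files remain accurate.
file_a = [mask_name(n) for n in ["Joe", "Sue", "Joe"]]
file_b = [mask_name(n) for n in ["Joe"]]
```

The keyed HMAC matters here: without the secret key, an attacker who knows the scheme could precompute the mapping for common names and reverse the masking.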

Wait… where’s point 11, buy Dataguise?

Original title and link: Dataguise Presents 10 Best Practices for Securing Sensitive Data in Hadoop (myNoSQL)
