---
layout: doc
title: "Data Classification Tutorial"
permalink: /docs/tutorial/classification.html
---

Eagle data classification provides the ability to classify HDFS and Hive data with different levels of sensitivity. For both HDFS and Hive, a user can browse the resources and add or remove sensitivity information.

This document has two parts. The first part covers how to add/remove sensitivity marks on files/directories; the second part shows how sensitivity is used in policy definitions. HDFS is used as the example throughout.

WARNING: Sensitivity is classified per site. When there are multiple sites, select the correct site first.

Part 1: Sensitivity Edit

  • Add a sensitivity mark to files/directories.

    • Basic: Label sensitive files directly (recommended)

      (screenshots: HDFS classification)

    • Advanced: Import JSON file/content

      (screenshots: HDFS classification)

  • Remove the sensitivity mark from files/directories.

    • Basic: Remove the label directly

      (screenshots: HDFS classification)

    • Advanced: Delete in batch

      (screenshot: HDFS classification)
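For the advanced import option above, the JSON content associates each HDFS path with a sensitivity label. The exact schema depends on your Eagle version; the field names below (`tags`, `site`, `filedir`, `sensitivityType`) and the `sandbox` site are illustrative assumptions, not taken from this tutorial:

```json
[
  { "tags": { "site": "sandbox", "filedir": "/tmp/private" }, "sensitivityType": "PRIVATE" },
  { "tags": { "site": "sandbox", "filedir": "/tmp/confidential" }, "sensitivityType": "CONFIDENTIAL" }
]
```

Each entry marks one file or directory with one sensitivity type, so importing a single file can apply many marks in one batch.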

Part 2: Sensitivity Usage in Policy Definition

You can mark a particular folder/file as "PRIVATE". Once it is marked, you can reference this label when creating policies.

For example: the following policy monitors all the operations to resources with sensitivity type "PRIVATE".

(screenshot: sensitivity type policy)
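Such a policy is typically expressed in Eagle as a Siddhi-style stream query that filters events by the sensitivity label. The sketch below assumes an HDFS audit stream named `hdfsAuditLogEventStream` carrying a `sensitivityType` field; both names are assumptions for illustration, not copied from this tutorial:

```
from hdfsAuditLogEventStream[sensitivityType == 'PRIVATE']
select *
insert into outputStream;
```

Any operation on a resource marked "PRIVATE" would then match the filter and be emitted as an alert event.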