Hortonworks Data Lifecycle Manager

Protect Your Enterprise Data On-Premises & In-Cloud Through Hadoop Replication

Taking a modern approach to managing your data

Download the White Paper

Overview

Data Lifecycle Manager (DLM) is a DataPlane application that protects not only your data but also its associated security policies through replication. It empowers system administrators to replicate HDFS and Hive data from an on-premises cluster to cloud storage, including replication of a Hive database from a cluster with underlying HDFS to another cluster backed by cloud storage. DLM protects data at rest (TDE) and data in motion (TLS), and supports multiple key management services (KMS) and encryption keys.

Video: Data Lifecycle Manager

Benefits

Protects critical data assets

DLM replicates HDFS and Hive data from an on-premises cluster to cloud storage, and provides a web UI that administrators can use to create and manage replication and disaster recovery policies and jobs. It avoids unnecessarily copying renamed files and directories and protects data against accidental or intentional modification, helping meet governance and disaster recovery (DR) requirements. DLM enables system administrators to:

  • Incrementally replicate Hive data and metadata
  • Replicate data between HDP clusters using HDFS snapshots
  • Provide support for data-at-rest (TDE) and data-in-motion (TLS) encryption
  • Prevent unauthorized access to data and support job isolation
  • Configure the destination cluster to serve as the new source, if the source cluster becomes unavailable
Webinar: Global Data Management In A Multi-Cloud Hybrid World
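The incremental replication above relies on comparing successive HDFS snapshots and copying only what changed (in Hadoop itself this is done with `hadoop distcp -diff`). The following is a minimal, self-contained sketch of that idea; the dict-based "filesystem" of path-to-checksum mappings is a stand-in for illustration, not DLM's actual mechanism.

```python
# Sketch of snapshot-diff-driven incremental replication. Each snapshot is
# modeled as a dict mapping file path -> content checksum; the real system
# diffs HDFS snapshots and replicates only the delta.

def snapshot_diff(old, new):
    """Compare two snapshots and classify changes between them."""
    created = [p for p in new if p not in old]
    deleted = [p for p in old if p not in new]
    modified = [p for p in new if p in old and new[p] != old[p]]
    return created, modified, deleted

def apply_diff(source, target, created, modified, deleted):
    """Bring the replica up to date by copying only changed entries."""
    for p in created + modified:
        target[p] = source[p]
    for p in deleted:
        target.pop(p, None)

snap1 = {"/data/a.csv": "c1", "/data/b.csv": "c2"}
snap2 = {"/data/a.csv": "c1", "/data/b.csv": "c9", "/data/c.csv": "c3"}

replica = dict(snap1)  # replica is currently in sync with snap1
created, modified, deleted = snapshot_diff(snap1, snap2)
apply_diff(snap2, replica, created, modified, deleted)
print(replica == snap2)  # True: the replica caught up using only the diff
```

The point of the diff step is that a renamed or untouched file never triggers a full re-copy, which is what keeps incremental replication cheap on large datasets.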
Replicate security policies associated with data

Replicate not only data but also the metadata and security policies associated with it. DLM enhances system administrators' productivity by:

  • Exporting Apache Ranger policies for the HDFS directory from source Ranger service and replicating them to destination Ranger service
  • Replicating associated file metadata, table structures or schemas
  • Providing active/standby behavior or DR site using Ranger policies
Blog: Painless Disaster Recovery using Hortonworks Data Lifecycle Manager
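Conceptually, replicating security policies means selecting the Ranger policies that cover the replicated HDFS directory and re-binding them to the destination Ranger service. The sketch below models that selection and retargeting with plain dicts; the policy fields and service names are illustrative assumptions (the real flow uses Ranger's REST export/import, not in-memory lists).

```python
# Hedged sketch of replicating Ranger policies alongside data. Policies are
# plain dicts here; "export" is a filter over the source list, and "import"
# rewrites the service reference so each policy binds on the destination.

def policies_for_path(policies, hdfs_dir):
    """Select policies whose resource paths fall under the replicated dir."""
    prefix = hdfs_dir.rstrip("/") + "/"
    return [p for p in policies
            if any(r == hdfs_dir or r.startswith(prefix)
                   for r in p["resources"])]

def retarget(policies, dest_service):
    """Re-bind exported policies to the destination Ranger service."""
    return [{**p, "service": dest_service} for p in policies]

source_policies = [
    {"service": "hdp_src_hadoop", "resources": ["/warehouse/sales"],
     "allow": {"analyst": ["read"]}},
    {"service": "hdp_src_hadoop", "resources": ["/tmp/scratch"],
     "allow": {"etl": ["read", "write"]}},
]

exported = policies_for_path(source_policies, "/warehouse")
imported = retarget(exported, "hdp_dst_hadoop")
print(imported[0]["service"])  # hdp_dst_hadoop
```

Because only the policies scoped to the replicated directory travel with the data, the destination cluster enforces the same access rules without inheriting unrelated policies from the source.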
Implement hybrid data replication

DLM supports replication of HDFS and Hive data between on-premises HDFS and AWS S3 cloud storage. DLM provides administrators with:

  • Bidirectional data replication between cloud and on-premises environments
  • Flexibility to designate either cluster in a pair to serve as the source or as the destination in a replication policy
  • Native cloud storage replication into S3 buckets
  • Seamless integration between AWS-cloud and DLM for data and security policy replication
Blog: Data Replication in Hadoop
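At the core of a hybrid replication policy is a mapping from each on-premises HDFS path to its S3 destination. This is a minimal sketch of that mapping; the bucket name and path layout are illustrative assumptions, not DLM defaults (Hadoop addresses S3 through the `s3a://` scheme).

```python
# Sketch of the source -> destination path mapping in a hybrid replication
# policy: files under the replicated dataset root land under a prefix in an
# S3 bucket, preserving their relative layout.

def map_to_s3(hdfs_path, source_root, bucket, dest_prefix):
    """Translate an on-prem HDFS path to its S3 location under the policy."""
    if not hdfs_path.startswith(source_root):
        raise ValueError(f"{hdfs_path} is outside the replicated dataset")
    rel = hdfs_path[len(source_root):].lstrip("/")
    return f"s3a://{bucket}/{dest_prefix}/{rel}"

print(map_to_s3("/warehouse/sales/2018/part-0.orc",
                "/warehouse/sales", "dr-bucket", "sales"))
# s3a://dr-bucket/sales/2018/part-0.orc
```

Because the mapping is symmetric, swapping source and destination in a policy is enough to reverse the replication direction, which is what lets either cluster in a pair serve as the source.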
Get visibility into cluster status and automate replication tasks for enhanced productivity

Quickly identify any issues or verify the health of the clusters, policies, or jobs in DLM. View the total number of clusters enabled for DLM, the number for which all or some of the services are running, and the number of clusters for which remaining disk capacity is less than 10%. DLM provides system administrators with the flexibility to:

  • Create policies based on business rules
  • Replicate data based on data sets, day and time, frequency of job execution, and bandwidth restrictions
Blog: A Step-by-Step Guide for HDFS Replication
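A replication policy like the one described above boils down to a dataset, a start time, a cadence, and a bandwidth cap, from which the scheduler derives each job's next run. The sketch below shows that derivation; the policy fields are illustrative assumptions and do not reflect DLM's actual policy schema.

```python
# Sketch of a frequency-based replication schedule: given a policy's start
# time and cadence, compute the first scheduled run at or after "now".
from datetime import datetime, timedelta

policy = {
    "dataset": "/warehouse/sales",
    "start": datetime(2018, 6, 1, 2, 0),  # run at 02:00
    "frequency": timedelta(hours=24),     # once a day
    "max_bandwidth_mbps": 100,            # throttle each replication job
}

def next_run(policy, now):
    """First scheduled run at or after `now`, on the policy's cadence."""
    if now <= policy["start"]:
        return policy["start"]
    elapsed = now - policy["start"]
    periods = -(-elapsed // policy["frequency"])  # ceiling division
    return policy["start"] + periods * policy["frequency"]

print(next_run(policy, datetime(2018, 6, 3, 5, 30)))  # 2018-06-04 02:00:00
```

Keeping the schedule declarative in the policy (rather than in cron entries per cluster) is what lets the same policy be re-evaluated unchanged if the destination cluster is later promoted to serve as the new source.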