Computer Science, question asked by abelethio21, 1 year ago

How do we ingest streaming data into a Hadoop cluster?

Answers to the question

Answered by arinzenwodo7

Answer:

One of the most popular solutions for managing a flood of streaming data is to ingest it into a Hadoop data lake. Consolidating all enterprise data into a single Hadoop data lake solves a number of problems and offers some very attractive benefits.


abelethio21: are you sure this is the right answer? this is an assignment
arinzenwodo7: i think so
abelethio21: do you have another, fuller answer?
arinzenwodo7: Flume, a Java-based ingestion tool, is used when input data streams in faster than it can be consumed. Typically, Flume is used to ingest streaming data into HDFS or into Kafka topics, where it acts as a Kafka producer. Multiple Flume agents can also be used to collect data from multiple sources into a Flume collector (see the sketch after these comments).
abelethio21: ok, thanks. are you a student?
arinzenwodo7: yes
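
A minimal sketch of the Flume flow arinzenwodo7 describes, written as a standard Flume agent configuration file. The agent name "a1", the log file path, the namenode host, and the HDFS path are all illustrative placeholders, not values from the original answer; this shows one way to stream a growing log file into HDFS.

# Name the source, sink, and channel for a hypothetical agent "a1"
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Source: tail a local application log as new lines arrive (placeholder path)
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/app/events.log

# Sink: write the streaming events into HDFS, bucketed by date
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events/%Y-%m-%d
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.useLocalTimeStamp = true

# Channel: buffer events in memory between source and sink,
# absorbing bursts when data streams in faster than the sink can drain it
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

# Wire the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Saved as, say, a1.conf, the agent would be started with the standard launcher: flume-ng agent --conf conf --conf-file a1.conf --name a1. Swapping the HDFS sink for a Kafka sink is how Flume plays the Kafka-producer role mentioned in the answer.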