Big-Data
December 9, 2023
Question 6
Which of the following statements is/are true?
(i) Facebook has the world's largest Hadoop cluster.
(ii) Hadoop 2.0 allows live stream processing of real-time data.
A. Neither (i) nor (ii)
B. Both (i) and (ii)
C. (i) only
D. (ii) only
Correct Answer: B
Question 6 Explanation:
→ The data warehouse Hadoop cluster at Facebook became the largest known Hadoop storage cluster in the world. Some details about this single HDFS cluster:
1. 21 PB of storage in a single HDFS cluster
2. 2000 machines
3. 12 TB per machine (a few machines have 24 TB each)
4. 1200 machines with 8 cores each + 800 machines with 16 cores each
5. 32 GB of RAM per machine
6. 15 map-reduce tasks per machine
That is a total of more than 21 PB of configured storage capacity, larger than the previously known Yahoo! cluster of 14 PB.
→ Hadoop 2.0 introduced YARN, which decouples resource management from MapReduce, so engines other than batch MapReduce (for example, stream-processing frameworks) can run on the same cluster and process real-time data.
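The cluster figures quoted above can be cross-checked with simple arithmetic. This is a rough sketch: "configured" HDFS capacity is raw disk capacity minus whatever each site reserves for non-HDFS use, so the exact configured figure depends on settings not given here.

```python
# Sanity-check the Facebook cluster numbers from the explanation.

machines_8_core = 1200
machines_16_core = 800
total_machines = machines_8_core + machines_16_core   # should be 2000

# Most machines have 12 TB of disk (a few have 24 TB, ignored here,
# so this is a lower bound on raw capacity).
raw_tb = total_machines * 12
raw_pb = raw_tb / 1000                                # 24 PB raw

print(total_machines)  # 2000 machines, matching the stated total
print(raw_pb)          # 24.0 PB raw, consistent with >21 PB configured
```

At least 24 PB of raw disk across 2000 machines is consistent with the stated 21+ PB of configured HDFS capacity once some space is set aside per node.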
