Importance of Hadoop in Big Data Handling - PowerPoint PPT Presentation

About This Presentation
Title:

Importance of Hadoop in Big Data Handling

Description:

There is a growing understanding of Hadoop's role in handling Big Data, especially data that is unstructured. Big Data handling is done with the Apache Hadoop software library. – PowerPoint PPT presentation



Transcript and Presenter's Notes

Title: Importance of Hadoop in Big Data Handling


1
Importance of Hadoop in Big Data Handling
2
There is a growing understanding of Hadoop's role in
handling Big Data, especially data that is
unstructured. Big Data handling is done with the
Apache Hadoop software library, which allows large
volumes of data to be processed in a distributed
fashion across clusters of computers using simple
programming models. For local computation and
storage, it is designed to scale up from single
servers to a large number of machines.
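To make "simple programming models" concrete, here is a minimal word-count sketch written against the standard Hadoop MapReduce API (org.apache.hadoop.mapreduce). The class names are illustrative, not part of the original presentation.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class WordCount {

  // Mapper: emits (word, 1) for every token in an input line.
  public static class TokenizerMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }
}

Hadoop runs many copies of the mapper and reducer in parallel across the cluster, which is what makes this simple model scale to very large inputs.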
More...
Stay Connected For More Updates
3
This Is What Hadoop Is Made Up Of
Source code, documentation and a contribution
section
A MapReduce engine (either classic MapReduce or YARN)
The Hadoop Distributed File System (HDFS)
Java ARchive (JAR) files
File system and OS level abstractions
Scripts needed to start Hadoop
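A short sketch of how these pieces fit together in practice: a driver class is packaged into a JAR, submitted to the MapReduce engine (YARN or classic MapReduce), and reads and writes HDFS paths. The driver class name, the input and output paths, and the reuse of the WordCount mapper and reducer from the previous slide are all assumptions for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");

    // The JAR containing the mapper and reducer is shipped to the cluster.
    job.setJarByClass(WordCountDriver.class);
    job.setMapperClass(WordCount.TokenizerMapper.class);
    job.setReducerClass(WordCount.IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);

    // Input and output live on HDFS; these paths are examples only.
    FileInputFormat.addInputPath(job, new Path("/user/demo/input"));
    FileOutputFormat.setOutputPath(job, new Path("/user/demo/output"));

    // The MapReduce engine (YARN or classic MapReduce) schedules the tasks.
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}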
More...
Stay Connected For More Updates
4
Activities Performed On Big Data
Store: Big Data needs to be gathered into a
seamless repository, and it is not mandatory to
keep it in a single physical data store.
Process: The procedure is more demanding than the
traditional one in terms of enriching, cleansing,
transforming, and running analytical methods.
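A minimal sketch of the "store" activity, using the org.apache.hadoop.fs.FileSystem API to copy a local file into HDFS. The NameNode address and the file paths are assumptions for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class StoreExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://namenode:8020");   // assumed cluster address

    FileSystem fs = FileSystem.get(conf);

    // Gather raw data into HDFS; the blocks of this file may be spread
    // over many DataNodes, so there is no single physical store.
    fs.copyFromLocalFile(new Path("/tmp/raw-events.log"),
                         new Path("/data/raw/events.log"));

    fs.close();
  }
}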
More...
Stay Connected For More Updates
5
Hadoop Distributed File System (HDFS)
HDFS is designed to run on commodity hardware. It
stores huge data files, typically in the GB to TB
range, across multiple machines. HDFS provides data
awareness between the job-tracking and task-tracking
programs: the job tracker schedules map and reduce
tasks on task trackers that know where the data is
located. This makes the procedure of data management
easier. The two main parts of Hadoop are the data
processing framework and HDFS.
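A sketch of the "data location knowledge" idea: the FileSystem API can report which DataNodes hold each block of a file, the same information the scheduler uses to place tasks near their data. The file path is an assumption for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class BlockLocations {
  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.get(new Configuration());
    FileStatus status = fs.getFileStatus(new Path("/data/raw/events.log"));

    // Each BlockLocation lists the DataNodes holding a replica of that block.
    BlockLocation[] blocks =
        fs.getFileBlockLocations(status, 0, status.getLen());
    for (BlockLocation block : blocks) {
      System.out.printf("offset=%d length=%d hosts=%s%n",
          block.getOffset(), block.getLength(),
          String.join(",", block.getHosts()));
    }
    fs.close();
  }
}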
More...
Stay Connected For More Updates
6
Other HDFS Design Goals
Hardware Failure: A core architectural goal of
HDFS is detection of faults and quick, automatic
recovery from them.
Streaming Data Access: HDFS is designed more for
batch processing than for interactive use by
users; it favors high-throughput streaming access
to data sets.
Designed for Large Data Sets: HDFS supports large
files, provides high aggregate data bandwidth,
and scales to many nodes in a single cluster.
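Fault tolerance in HDFS comes largely from keeping several replicas of each block, so a failed machine can be recovered from automatically. A sketch of how replication is typically tuned follows; the replication factors and file path are assumptions for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.setInt("dfs.replication", 3);   // default replication for new files

    FileSystem fs = FileSystem.get(conf);

    // Raise the replication factor of an existing, important file so it
    // survives the loss of more DataNodes.
    fs.setReplication(new Path("/data/raw/events.log"), (short) 5);

    fs.close();
  }
}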
More...
Stay Connected For More Updates
7
Thank You
  • It is easy to become a DBA professional by
    joining the DBA Training Course and building
    your career in this field.
  • Stay connected to CRB Tech for more technical
    updates and information.

Stay Connected For More Updates