
Does creating checkpoints in the various phases (map, combine, shuffle, reduce) of a MapReduce job reduce the chance of phase-level failures and improve fault tolerance?



Big Data Architect

Failure detection and recovery in Hadoop happen at the task level. If a task fails, it is rerun on another available node. This means recomputing the task from scratch, which for a long-running task adds a lot of delay to the application. A commonly proposed solution is to checkpoint the latest state of the program to stable storage and, when a failure happens, resume the program from the last saved checkpoint. Unfortunately, checkpointing in Hadoop is not that simple, for several reasons:

  • Checkpointing requires the intermediate state to be saved to stable storage. For an application like MapReduce, which produces a lot of intermediate results, this would severely hurt performance.
  • Maintaining checkpoint information consumes a lot of bandwidth and resources, which again is a bottleneck for MapReduce applications.
  • Recovering a task from a checkpoint also requires a lot of I/O and network traffic, which would affect our applications.

Saving intermediate results at the various phases of MapReduce is still a good idea, but it has to be implemented thoughtfully. First, checkpointing should be local: task progress should be written to the task's local disk, and no replicas should be transferred over the network, to avoid congestion. These writes should happen at the moment the task was going to write intermediate results to disk anyway; otherwise they would create delays. Second, instead of blindly recovering the intermediate data from a checkpoint, some important changes can be made before recovery, such as increasing the memory size and split size, since many task failures are caused by heap-space issues. This prevents tasks from failing and re-running over and over.

Given these challenges, it will take some time for a proper phase-level checkpointing algorithm to be implemented in Hadoop MapReduce. And if a job does not hit many failures, the traditional approach of simply rerunning failed tasks from scratch actually works better than checkpointing, whose overhead is then pure cost.
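To make the first suggestion concrete, here is a minimal, hypothetical sketch of local checkpointing inside a mapper. Stock Hadoop MapReduce does not read such files back on retry, so the recovery side would need a custom wrapper (not shown); the class name CheckpointingMapper, the file name map-progress.ckpt, and the CHECKPOINT_EVERY interval are all illustrative assumptions, not part of any Hadoop API.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Hypothetical sketch: record map progress on the task's LOCAL disk
    // (no HDFS replication, hence no network traffic), piggybacked on
    // normal record processing as the answer above suggests.
    public class CheckpointingMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

        private static final long CHECKPOINT_EVERY = 100_000L; // records between checkpoints (illustrative)
        private static final LongWritable ONE = new LongWritable(1);
        private final Text word = new Text();
        private long recordsSeen = 0;
        private Path checkpointFile;

        @Override
        protected void setup(Context context) {
            // The task attempt already runs in a working directory on the
            // node's local disk, so this write never crosses the network.
            checkpointFile = Paths.get("map-progress.ckpt");
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Ordinary word-count map logic.
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
            // Every N records, persist the byte offset of the last fully
            // processed input record; a (custom) retry could seek past it.
            if (++recordsSeen % CHECKPOINT_EVERY == 0) {
                Files.write(checkpointFile,
                        Long.toString(key.get()).getBytes(StandardCharsets.UTF_8));
            }
        }
    }

The second suggestion needs no new machinery at all: stock Hadoop already exposes the relevant knobs, so a retry could simply be resubmitted with more memory and a different split size. The property names below are real Hadoop 2.x/3.x configuration keys; the values are only examples.

    Configuration conf = new Configuration();
    conf.set("mapreduce.map.memory.mb", "4096");       // container memory for each map task
    conf.set("mapreduce.map.java.opts", "-Xmx3686m");  // JVM heap inside that container
    conf.set("mapreduce.input.fileinputformat.split.minsize", "268435456"); // 256 MB minimum split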

Related Questions

What should I know before learning Hadoop?
It depends on which stream of Hadoop you are aiming for. If you want to become a Hadoop core developer, then yes, you will need Java and Linux knowledge. But there is another Hadoop profile which is in demand...
Tina
Do I need to learn the Java-Hibernate framework to be a Hadoop developer?
Not at all. To be a Hadoop developer, you need basic core Java programming knowledge along with SQL. No one will ask interview questions on Hibernate.
Pritam
A friend of mine asked me which would be better, a course on Java or a course on big data or Hadoop. All I could manage was a blank stare. Do you have any ideas?
Start with Java. Even to get into big data, one needs a good knowledge of at least one programming language.
Srikumar
I want to learn Hadoop admin.
Hi Suresh, I provide Hadoop administration training that will prepare you to clear the Cloudera Administrator Certification exam (CCA131). You can contact me for course details. Regards, Biswanath
Suresh


Related Lessons

Hadoop Development Syllabus
Hadoop 2 Development with Spark. Big Data Introduction: What is Big Data; Evolution of Big Data; Benefits of Big Data; Operational vs Analytical Big Data; Need for Big Data Analytics; Big...

How to change a managed table to external
ALTER TABLE <table> SET TBLPROPERTIES('EXTERNAL'='TRUE');
The property above changes a managed table into an external table.
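For instance, with a hypothetical managed table named sales, you can apply the change and then confirm it with DESCRIBE FORMATTED:

    ALTER TABLE sales SET TBLPROPERTIES('EXTERNAL'='TRUE');
    DESCRIBE FORMATTED sales;   -- "Table Type:" should now read EXTERNAL_TABLE

Note that in older Hive versions the property value is case-sensitive, so keep 'TRUE' in upper case.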

Rahul Sharma


Why is Hadoop essential?
The capacity to store and process large amounts of any kind of data, quickly. With data volumes and variety always increasing, particularly from social media and the Internet of Things (IoT), that...

Big Data
Big data is a large amount of data that may be of various types, such as structured, unstructured, and semi-structured; it is data that cannot be processed by our traditional database applications, which are no longer enough....

Understanding Big Data
Introduction to Big Data: this blog is about Big Data, its meaning, and the applications currently prevalent in the industry. It’s an accepted fact that Big Data has taken the world by storm and has become...

MyMirror


Recommended Articles

Hadoop is a framework developed for organizing and analysing big chunks of data for a business. Suppose you have a file larger than your system’s storage capacity and you can’t store it. Hadoop helps in storing files bigger than what could be stored on one particular server. You can therefore store very,...


In the domain of Information Technology, there is always a lot to learn and implement. However, some technologies are in relatively higher demand than the rest. So here are some popular IT courses for the present and the upcoming future: Cloud Computing. Cloud computing is a computing technique which is used...


Big data is a phrase used to describe a very large amount of structured (or unstructured) data. This data is so “big” that it becomes problematic to handle using conventional database techniques and software. A Big Data Scientist is a business employee who is responsible for handling and statistically evaluating...


We have already discussed why and how “Big Data” is all set to revolutionize our lives, professions, and the way we communicate. Data is growing by leaps and bounds. The Walmart database handles over 2.6 petabytes of data from several million customer transactions every hour. Facebook’s database similarly handles...



