Hadoop and distributed computing by Yahoo!

Apache Hadoop, created by Doug Cutting (now a Yahoo! employee), is an open source Java software framework for running data-intensive applications on large clusters of commodity hardware. It relies on an active community of contributors from all over the world for its success. Hadoop implements two important elements. The first is a computational paradigm called Map/Reduce, which takes an application and divides it into multiple fragments of work, each of which can be executed on any node in the cluster. The second is a distributed file system called HDFS, which stores data on nodes across the cluster with the goal of providing greater aggregate bandwidth, the YDN Blog announced.
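To make the Map/Reduce model concrete, below is a minimal word-count sketch written against Hadoop's Java MapReduce API (org.apache.hadoop.mapreduce, as shipped in Hadoop 2.x); it is an illustration, not code from the announcement. The map step emits a count of 1 for every token it reads, Hadoop groups the intermediate pairs by key, and the reduce step sums the counts for each word. The class name and job name are placeholders chosen for this example.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map step: runs on whichever node holds a fragment of the input,
  // emitting (word, 1) for every token in that fragment.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce step: receives all counts for a given word and sums them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");   // illustrative job name
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);       // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input path
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HDFS output path
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Assuming the input files already sit in HDFS, a job like this would typically be packaged into a jar and submitted with the standard hadoop jar command, with the input and output HDFS paths passed as the two arguments.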

More info: Hadoop

About The Author

Deepak Gupta is an IT & Web Consultant. He is the founder and CEO of diTii.com & DIT Technologies, where he is engaged in providing technology consultancy and the design and development of desktop, web, and mobile applications using various tools and software.