Hadoop and distributed computing by Yahoo!

Apache Hadoop, created by Doug Cutting (now a Yahoo! employee), is an open source Java software framework for running data-intensive applications on large clusters of commodity hardware. It relies on an active community of contributors from all over the world for its success. Hadoop implements two important elements. The first is a computational paradigm called Map/Reduce, which takes an application and divides it into multiple fragments of work, each of which can be executed on any node in the cluster. The second is a distributed file system called HDFS, which stores data across the nodes of the cluster with the goal of providing greater aggregate bandwidth, according to the announcement on the YDN Blog.
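
To make the Map/Reduce paradigm concrete, here is a minimal sketch of the classic word-count job written against the Hadoop Java MapReduce API (org.apache.hadoop.mapreduce, as found in recent Hadoop releases); it is not part of the original announcement, and the class name and input/output paths are illustrative. The map step splits each line of input into words and emits (word, 1) pairs, and the reduce step sums the counts for each word; the framework takes care of distributing these fragments of work across the cluster.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative word-count job; class name and paths are examples only.
public class WordCount {

  // Map step: emit (word, 1) for every word in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce step: sum the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. an HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // e.g. an HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar, a job like this would be launched with something along the lines of hadoop jar wordcount.jar WordCount /input /output, and the framework schedules the map fragments on cluster nodes, reading the input from HDFS blocks stored across those nodes.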

More info: Hadoop