Install Hadoop On Windows Single Node Cluster – Part 1

The Hortonworks Data Platform, powered by Apache Hadoop, is a massively scalable and 100% open source platform for storing, processing, and analyzing large volumes of data. It is designed to handle data from many sources and formats quickly, easily, and cost-effectively. The Hortonworks Data Platform consists of the essential set of Apache Hadoop projects, including MapReduce, the Hadoop Distributed File System (HDFS), HCatalog, Pig, Hive, HBase, ZooKeeper, and Ambari. Hortonworks is a major contributor of code and patches to many of these projects. These projects are integrated and tested as part of the Hortonworks Data Platform release process, and installation and configuration tools are also included.

Hortonworks Data Platform 2.3 represents another major step forward for Hadoop as an enterprise data platform. This release incorporates the most recent innovations in Hadoop and its supporting ecosystem of projects. HDP 2.3 packages more than a hundred new features across its existing projects; every component has been updated, and key technologies and capabilities have been added.

This tutorial covers installing Hadoop via the Hortonworks Data Platform (HDP) on a Windows server as a single-node cluster.