largecats' blog data engineer

Installing Hadoop, Spark, and Hive in Windows Subsystem for Linux (WSL)

2019-08-19

I want to set up Hadoop, Spark, and Hive on my personal laptop.

Method

Installation

  1. Set up WSL following this guide.
  2. Install Hadoop following this guide.
  3. Install Hive following this guide.
  4. Install Spark following this guide.
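
Each of those installs typically ends with exporting the corresponding home directory. As a reference, here is a sketch of the `~/.bashrc` entries such a setup assumes; the version numbers and install paths are hypothetical and must match wherever you actually unpacked each distribution:

```shell
# Sketch of ~/.bashrc entries; adjust paths and versions to your own install.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # hypothetical JDK path
export HADOOP_HOME=~/hadoop-3.1.2                    # hypothetical install path
export HIVE_HOME=~/apache-hive-3.1.1-bin             # hypothetical install path
export SPARK_HOME=~/spark-2.4.3-bin-hadoop2.7        # hypothetical install path
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin:$SPARK_HOME/bin
```

After editing, run `source ~/.bashrc` so the variables take effect in the current shell.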

Start

  1. Open WSL terminal.
  2. Start the SSH service and the Hadoop daemons:
    sudo service ssh restart
    cd $HADOOP_HOME
    sbin/start-all.sh
    
  3. Open http://localhost:8088/cluster in a browser to view the YARN ResourceManager UI. Note that only Spark jobs submitted to YARN (not jobs run in local mode) will show up here.
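
To check that the daemons came up and to see a job appear in the ResourceManager UI, something like the following should work. SparkPi is the example class bundled with the Spark distribution; the exact jar filename depends on your Spark version, and these commands assume a running cluster:

```shell
# List running Hadoop daemons; expect to see NameNode, DataNode,
# ResourceManager, and NodeManager among the JVM processes.
jps

# Submit the bundled SparkPi example to YARN; once accepted, the job
# should appear at http://localhost:8088/cluster.
$SPARK_HOME/bin/spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 10
```

If `spark-submit` hangs at the ACCEPTED state, the usual suspect is YARN not having enough memory configured for a container.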
