
Using IPython Notebook with Apache Spark

Introduction

In this tutorial, we are going to configure IPython notebook with Apache Spark on YARN in a few steps.

IPython Notebook is an interactive Python environment that lets you work with your data one step at a time and perform simple visualizations along the way.

IPython Notebook supports tab autocompletion on class names, functions, methods, and variables. It also offers more explicit, colour-highlighted error messages than the command-line Python shell. It integrates with the UNIX shell, so you can run simple commands such as cp, ls, and rm directly from IPython. IPython also works with common GUI toolkits such as PyQt, PyGTK, and tkinter, as well as a wide variety of data science Python packages.
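
For example, inside a notebook cell the ! prefix hands a line to the shell, and its output can be captured into a Python variable (the directory used here is only an illustration):

# run a shell command from an IPython cell
!ls /usr/hdp/current/spark-client

# capture shell output as a Python list
files = !ls /usr/hdp/current/spark-client
print files[:3]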

Prerequisites

This tutorial is part of a series of hands-on tutorials to get you started with HDP using the Hortonworks Sandbox. Please ensure you complete the prerequisites before proceeding with this tutorial.

Note: This tutorial only works with HDP 2.4 Sandbox or earlier.

Installing and configuring IPython

To begin, log in to the Hortonworks Sandbox through SSH. The default password is hadoop.
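
If you are running the Sandbox in VirtualBox with its default NAT port forwarding, the SSH connection typically looks like this (host port 2222 is the Sandbox default; adjust it if your setup differs):

ssh root@127.0.0.1 -p 2222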

Now let’s install the required dependencies by typing in the following command:

yum install nano centos-release-SCL zlib-devel \
bzip2-devel openssl-devel ncurses-devel \
sqlite-devel readline-devel tk-devel \
gdbm-devel db4-devel libpcap-devel xz-devel \
libpng-devel libjpeg-devel atlas-devel

IPython requires Python 2.7 or later, so let’s install the “Development tools” package group that Python 2.7 and the packages we will build depend on:

yum groupinstall "Development tools"

Now we are ready to install Python 2.7.

yum install python27

Now the Sandbox has multiple versions of Python installed, so we have to select which one to use. Enable Python 2.7 for this session:

source /opt/rh/python27/enable
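
You can confirm that this shell session now uses the new interpreter:

python --version

This should report Python 2.7.x rather than the Sandbox's system Python.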

Next we will install pip using get-pip.py.

Download get-pip.py

wget https://bootstrap.pypa.io/get-pip.py

Install pip

python get-pip.py
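
To verify that pip was installed against Python 2.7, you can check:

pip --version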

pip makes it easy to install Python packages. We will use it to install the data science packages we might need, using the following commands:

pip install numpy scipy pandas \
scikit-learn tornado pyzmq \
pygments matplotlib jsonschema

and

pip install jinja2 --upgrade

Finally, we are ready to install IPython Notebook with pip, using the following command:

pip install jupyter

Configuring IPython

Since we want to use IPython with Apache Spark, we have to launch it through pyspark, the Python shell bundled with Spark, instead of the default Python interpreter.

As a first step, let’s create an IPython profile for pyspark:

ipython profile create pyspark
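
This creates a profile directory, which on a default install typically lands under your home directory and can be inspected with:

ls ~/.ipython/profile_pyspark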

Next we are going to create a shell script to set the appropriate values every time we want to start IPython.

Create a shell script with the following command:

nano ~/start_ipython_notebook.sh

Then copy the following lines into the file:

#!/bin/bash
# Enable Python 2.7 for this shell session
source /opt/rh/python27/enable
# Launch pyspark with IPython Notebook options: listen on port 8889 on all
# interfaces, serve the Spark client directory, and don't open a browser
IPYTHON_OPTS="notebook --port 8889 \
--notebook-dir='/usr/hdp/current/spark-client/' \
--ip='*' --no-browser" pyspark

Save and exit your editor.

Finally we need to make the shell script we just created executable:

chmod +x start_ipython_notebook.sh

Port Forwarding

We need to forward port 8889 from the guest VM (the Sandbox) to the host machine (your desktop) so that IPython Notebook is accessible from a browser on your host.

Open the VirtualBox app, then open the Settings page of the Sandbox VM by right-clicking the VM and selecting Settings.

Then select the Network tab at the top.

Then click on the port forwarding button to configure the port. Add a new port configuration by clicking the + icon on the top right of the page.

Input a name for the rule (for example, ipython), leave the IP fields blank, and set both the host and guest ports to 8889 to match the port configured in the start script.

Then press OK to confirm the change in configuration.
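
Alternatively, if you prefer the command line, the same rule can be added with VirtualBox's VBoxManage tool while the VM is powered off (the VM name below is an assumption; check yours with VBoxManage list vms):

VBoxManage modifyvm "Hortonworks Sandbox" --natpf1 "ipython,tcp,,8889,,8889"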

Now we are ready to test IPython notebook.

Running IPython notebook

Execute the shell script we created earlier from the Sandbox command prompt:

./start_ipython_notebook.sh

Now, open a browser on your host machine and navigate to the URL http://127.0.0.1:8889; you should see the IPython Notebook home page.

Voila! You have just configured IPython Notebook with Apache Spark on your Sandbox.
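
To confirm that the notebook is really talking to Spark, create a new notebook and run a quick sanity check in a cell; sc is the SparkContext that pyspark creates for you automatically:

# distribute a small range of numbers and sum it on the cluster
rdd = sc.parallelize(range(100))
print rdd.sum()   # expected result: 4950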

In the next few tutorials we are going to explore how we can use IPython notebook to analyze and visualize data.