A Newbie’s Install of Keras & Tensorflow on Windows 10 with R


This weekend, I decided it was time: I was going to update my Python environment and get Keras and TensorFlow installed so I could start doing tutorials (particularly for deep learning) using R. Although I used to be a systems administrator (about 20 years ago), I don’t do much installing or configuring any more, so I guess that’s why I’ve put this task off for so long. My hesitation wasn’t unwarranted: it took me the whole weekend to get the install working. Here are the steps I used to get things running on Windows 10, leveraging clues from about 15 different online resources, and yes (I found out the hard way), the order of operations is very important. I do not claim to have nailed the optimal order of operations here, but this is definitely one that works.

Step 0: I had already installed the tensorflow and keras packages within R, and had been wondering why they wouldn’t work. “Of course!” I finally realized, a few weeks later. “I don’t have Python on this machine, and both of these packages depend on a Python install.” It turns out they also depend on the reticulate package, so run install.packages("reticulate") if you have not already.
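For completeness, the R-side setup for this step boils down to installing the CRAN packages (a minimal sketch; the Python side comes in the next steps):

# the R packages involved; reticulate is the bridge that lets the other two reach Python
install.packages(c("reticulate", "tensorflow", "keras"))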

Step 1: Installed Anaconda3 to C:/Users/User/Anaconda3 (from http://ift.tt/2vz2M3P)

Step 2: Opened “Anaconda Prompt” from the Windows Start Menu. To create an “environment” called “tf-keras” specifically for use with tensorflow and keras in R, with a 64-bit version of Python 3.5, I typed:

conda create -n tf-keras python=3.5 anaconda

… and then after it was done, I did this:

activate tf-keras

Step 3: Install TensorFlow from Anaconda prompt. Using the instructions at http://ift.tt/2pAiYS3 I typed this:

pip install --ignore-installed --upgrade

I didn’t know whether this worked or not — it gave me an error saying that it “can not import html5lib”, so I did this next:

conda install -c conda-forge html5lib

I tried to run the pip command again, but there was an error so I consulted http://ift.tt/2n1rp4s. It told me to do this:

pip install --ignore-installed --upgrade tensorflow

This failed, and told me that the pip command had an error. I searched the web for an alternative to that command, and found this, which worked!!

conda install -c conda-forge tensorflow

 

Step 4: From inside the Anaconda prompt, I opened python by typing “python”. Next, I did this, line by line:

import tensorflow as tf
hello = tf.constant('Hello, TensorFlow!')
sess = tf.Session()
print(sess.run(hello))

It printed b'Hello, TensorFlow!' (the leading b just means Python is displaying a byte string), which means it works. (Ctrl-Z then Enter will get you out of Python and back to the Anaconda prompt.) So my Python installation of TensorFlow was functional.

Step 5: Install Keras. I tried this:

pip install keras

…but I got the same error message that pip could not be installed or found or imported or something. So I tried this, which seemed to work:

conda install -c conda-forge keras
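Before going back into R properly (Step 6), it is worth checking from a fresh R session that reticulate can see this conda environment and both Python modules. A quick sanity-check sketch:

library(reticulate)
use_condaenv("tf-keras", required = TRUE)   # point reticulate at the environment from Step 2
py_module_available("tensorflow")           # should return TRUE
py_module_available("keras")                # should return TRUE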

 

Step 6: Load them up from within R. First, I opened a 64-bit version of R v3.4.1 and did this:

library(tensorflow)
install_tensorflow(conda = "tf-keras")

It took a couple minutes but it seemed to work.

library(keras)
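As a quick sanity check that the R binding is talking to the same TensorFlow I just tested in Python, the Step 4 hello-world can be repeated from R (tf$Session() is the TensorFlow 1.x API that was current at the time):

sess <- tf$Session()
hello <- tf$constant('Hello, TensorFlow!')
sess$run(hello)   # should print the same b'Hello, TensorFlow!' byte string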

 

Step 7: Try a tutorial! I decided to go for http://ift.tt/2r40Dz9 which guides you through developing a classifier for the MNIST handwritten image database — a very popular data science resource. I loaded up my dataset and checked to make sure it loaded properly:

data <- dataset_mnist()
str(data)
List of 2
 $ train:List of 2
 ..$ x: int [1:60000, 1:28, 1:28] 0 0 0 0 0 0 0 0 0 0 ...
 ..$ y: int [1:60000(1d)] 5 0 4 1 9 2 1 3 1 4 ...
 $ test :List of 2
 ..$ x: int [1:10000, 1:28, 1:28] 0 0 0 0 0 0 0 0 0 0 ...
 ..$ y: int [1:10000(1d)] 7 2 1 0 4 1 4 9 5 9 ...
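To peek at an individual digit and its label before going any further (just a sanity check, not part of the tutorial):

data$train$y[1]              # label of the first training image (a 5, per the str() output above)
image(data$train$x[1, , ])   # crude plot of the 28x28 pixel matrix for that image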

 

Step 8: Here is the code I used to prepare the data and create the neural network model. This didn’t take long to run at all.

train_x <- data$train$x
train_y <- data$train$y
test_x <- data$test$x
test_y <- data$test$y

# flatten each 28x28 image into a 784-element vector and rescale pixel values to [0, 1]
train_x <- array(train_x, dim = c(dim(train_x)[1], prod(dim(train_x)[-1]))) / 255
test_x <- array(test_x, dim = c(dim(test_x)[1], prod(dim(test_x)[-1]))) / 255

# one-hot encode the 0-9 labels into 10 columns
train_y <- to_categorical(train_y, 10)
test_y <- to_categorical(test_y, 10)

# define the network: a 784-unit dense layer with dropout, then a 10-way softmax output
model <- keras_model_sequential()

model %>%
  layer_dense(units = 784, input_shape = 784) %>%
  layer_dropout(rate = 0.4) %>%
  layer_activation(activation = 'relu') %>%
  layer_dense(units = 10) %>%
  layer_activation(activation = 'softmax')

model %>% compile(
  loss = 'categorical_crossentropy',
  optimizer = 'adam',
  metrics = c('accuracy')
)
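Before kicking off training, summary(model) prints the layer-by-layer output shapes and parameter counts (the same kind of table that shows up in Step 9), which is a cheap way to confirm the architecture before committing to a long run:

summary(model)   # layer-by-layer output shapes and parameter counts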

 

Step 9: Train the network. THIS TOOK ABOUT 12 MINUTES on a powerful machine with 64GB high-performance RAM. It looks like it worked, but I don’t know how to find or evaluate the results yet.

model %>% fit(train_x, train_y, epochs = 100, batch_size = 128)
loss_and_metrics <- model %>% evaluate(test_x, test_y, batch_size = 128)

str(model)
Model
______________________________________________________________________________
Layer (type)                       Output Shape                        Param #
==============================================================================
dense_1 (Dense)                    (None, 784)                          615440
______________________________________________________________________________
dropout_1 (Dropout)                (None, 784)                               0
______________________________________________________________________________
activation_1 (Activation)          (None, 784)                               0
______________________________________________________________________________
dense_2 (Dense)                    (None, 10)                             7850
______________________________________________________________________________
activation_2 (Activation)          (None, 10)                                0
==============================================================================
Total params: 623,290
Trainable params: 623,290
Non-trainable params: 0
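To actually look at the results (the part I said I did not know how to do yet), my current understanding is that the list returned by evaluate() holds the test-set loss and accuracy, and that predict_classes() from the keras package gives the predicted digit for each test image. A sketch I have not fully verified:

loss_and_metrics   # test-set loss and accuracy returned by evaluate()

predictions <- model %>% predict_classes(test_x)       # predicted digit for each test image
table(predicted = predictions, actual = data$test$y)   # rough confusion matrix vs. the true labels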

 

Step 10: Next, I wanted to try the tutorial at http://ift.tt/2pBOCvv. Turns out this uses the kerasR package, not the keras package:

library(kerasR)
mnist <- load_mnist()   # kerasR's MNIST loader; returns X_train, Y_train, X_test, Y_test

X_train <- mnist$X_train
Y_train <- mnist$Y_train
X_test <- mnist$X_test
Y_test <- mnist$Y_test

> dim(X_train)
[1] 60000 28 28

X_train <- array(X_train, dim = c(dim(X_train)[1], prod(dim(X_train)[-1]))) / 255
X_test <- array(X_test, dim = c(dim(X_test)[1], prod(dim(X_test)[-1]))) / 255

To check and see what’s in any individual image, type:

image(X_train[1,,])

At this point, the to_categorical function stopped working. I was supposed to do this but got an error:

Y_train <- to_categorical(mnist$Y_train, 10)

So I did this instead:

mm <- model.matrix(~ Y_train)

Y_train <- to_categorical(mm[,2])
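In hindsight, I think this works because Y_train is a plain numeric vector of digit labels, so model.matrix(~ Y_train) returns an intercept column plus the labels themselves; mm[, 2] is numerically the same data, just stripped of whatever attributes were tripping up to_categorical the first time. A one-line check (my addition, not part of the tutorial):

all(mm[, 2] == Y_train)   # should be TRUE: column 2 of the model matrix is just the original labels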

mod <- Sequential()  # THIS IS THE EXCITING PART WHERE YOU USE KERAS!! :)

But then I tried this, and it was clear I was stuck again — it wouldn’t work:

mod$add(Dense(units = 512, input_shape = dim(X_train)[2]))

Stack Overflow recommended grabbing a version of kerasR from GitHub, so that’s what I did next:

install.packages("devtools")
library(devtools)
devtools::install_github("statsmaths/kerasR")
library(kerasR)

I got an error in R which told me to go to the Anaconda prompt (which I did), and type this:

conda install m2w64-toolchain

Then I went back into R and this worked fantastically:

mod <- Sequential()

mod$add() would still not work though, and this is where my patience expired for the evening. I’m pretty happy though: Python is up, keras and tensorflow are up in Python, all three R packages (keras, tensorflow, and kerasR) are loaded in R, and some of the tutorials seem to be working.
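In case it helps anyone who gets further than I did: from what I can tell from the kerasR documentation, the rest of the example continues roughly along these lines. I have not been able to run it yet, so treat every call here as an unverified sketch of the kerasR API:

mod <- Sequential()
mod$add(Dense(units = 512, input_shape = dim(X_train)[2]))
mod$add(Activation("relu"))
mod$add(Dropout(0.25))
mod$add(Dense(units = 10))
mod$add(Activation("softmax"))

# kerasR compiles and fits with plain function calls rather than pipes
keras_compile(mod, loss = "categorical_crossentropy", optimizer = RMSprop())
keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5, validation_split = 0.1)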
