Calling NeuralProphet from R: Adventures in Reticulate
Recently I saw that Facebook released NeuralProphet, a new forecasting package similar to Prophet, but built on top of PyTorch. Prophet is one of my favorite forecasting packages, given the ability to decompose forecasts, add in events and holidays, and take advantage of business users' domain knowledge. Naturally, I was excited to hear about this new version, and built on torch of all things! The package itself is early in development, so there's obviously no R port yet. I saw this as an opportunity to take reticulate out for a spin.
It was super easy!
Just a few years ago, using cutting-edge Python packages in R would have been a pipe dream, but given the work that has gone into both reticulate and neuralprophet, it was incredibly simple for an individual developer like me.
What I want to do in this post is walk through a simple example in order to give a refresher for anyone new to reticulate, call out some of the features that make reticulate especially easy to use, and show off the forecasting ability of neuralprophet.
Setup
Before jumping into R, you need to make sure you have a Python environment with the proper packages installed. In my case, I created a conda environment called `neural_prophet`. On the R side, we'll keep it simple and just install the packages `reticulate` and `tidyverse`.
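If you're starting from scratch, here's a rough sketch of that setup done entirely from R with reticulate's conda helpers. The environment name and package names are just the ones I'm using here; a plain conda/pip workflow in the shell works just as well.

```r
# Install the R packages we'll use
install.packages(c("reticulate", "tidyverse"))

# Create a conda environment and install neuralprophet into it via pip
reticulate::conda_create("neural_prophet")
reticulate::conda_install("neural_prophet", "neuralprophet", pip = TRUE)

# Point reticulate at that environment for this session
reticulate::use_condaenv("neural_prophet", required = TRUE)
```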
Loading a Python package is simple: just use the `import()` function and assign the result to an object.
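Something like this, binding the module to an object I'm calling `neuralprophet` (the name is arbitrary):

```r
library(tidyverse)
library(reticulate)

# Import the Python neuralprophet module and assign it to an R object
neuralprophet <- import("neuralprophet")
```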
Again, for the sake of simplicity, we'll use an example dataset from the package: a monthly retail sales dataset that spans 1992-2016.
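A minimal sketch of loading it, assuming you've saved the retail sales CSV locally (the file path below is a placeholder); the model expects a data frame with a `ds` date column and a `y` value column:

```r
# Read the monthly retail sales example data; adjust the path to wherever
# you've saved the CSV
df <- read_csv("retail_sales.csv") %>%
  mutate(ds = as.Date(ds))

glimpse(df)
```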
Model
Now we'll initialize the model with the `NeuralProphet()` function from the `neuralprophet` package we imported above, and fit the model to the training dataset.
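A sketch of those two steps; the `freq = "M"` argument tells the model the data is monthly, and the exact `fit()` arguments may vary a bit between neuralprophet versions:

```r
# Initialize the model with default settings
m <- neuralprophet$NeuralProphet()

# Fit on the training data; reticulate hands the R data frame to Python
# as a pandas DataFrame for us
metrics <- m$fit(df, freq = "M")
```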
What’s understated and really cool here is the automatic conversion between an R tibble/data frame and a pandas data frame. Zero work on our part.
Predict
One awesome feature from `prophet` that carries over to `neuralprophet` is the `make_future_dataframe` function, where you can specify how many periods you want to go into the future, and it will automatically create a dataframe that the forecasting model can predict on.
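Here's roughly what that looks like, asking for 12 months of future dates to line up with the table below (note the `L` suffix so reticulate passes the value to Python as an integer):

```r
# Build a dataframe extending 12 periods past the end of the training data,
# then predict on it
future <- m$make_future_dataframe(df, periods = 12L)
forecast <- m$predict(future)

tail(forecast, 12)
```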
| ds | y | yhat1 | residual1 | trend | season_yearly |
|---|---|---|---|---|---|
| 2016-05-30 17:00:00 | NaN | 456699.4 | NaN | 449847.4 | 6851.9561 |
| 2016-06-29 17:00:00 | NaN | 453226.5 | NaN | 450816.4 | 2410.0474 |
| 2016-07-30 17:00:00 | NaN | 465718.1 | NaN | 451817.8 | 13900.3096 |
| 2016-08-30 17:00:00 | NaN | 435408.9 | NaN | 452819.1 | -17410.1816 |
| 2016-09-29 17:00:00 | NaN | 452465.1 | NaN | 453788.2 | -1323.0161 |
| 2016-10-30 17:00:00 | NaN | 449587.3 | NaN | 454789.4 | -5202.1797 |
| 2016-11-29 16:00:00 | NaN | 508702.4 | NaN | 455758.5 | 52943.9062 |
| 2016-12-30 16:00:00 | NaN | 424294.4 | NaN | 456759.8 | -32465.3730 |
| 2017-01-30 16:00:00 | NaN | 424569.4 | NaN | 457761.2 | -33191.7422 |
| 2017-02-27 16:00:00 | NaN | 458540.3 | NaN | 458665.6 | -125.3058 |
| 2017-03-30 17:00:00 | NaN | 458074.8 | NaN | 459666.9 | -1592.1603 |
| 2017-04-29 17:00:00 | NaN | 472146.8 | NaN | 460636.0 | 11510.7402 |
Obviously we're predicting the future here, so we don't have true y values to compare against, but if we wanted to evaluate our forecast on known data, `neuralprophet` can do that easily.
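For example, here's a quick sketch of scoring the model on the historical data itself, comparing the fitted `yhat1` values against the observed `y`:

```r
# Predict over the training data to get in-sample fitted values,
# then compute a simple mean absolute error against the observed values
in_sample <- m$predict(df)

in_sample %>%
  summarize(mae = mean(abs(y - yhat1), na.rm = TRUE))
```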
To reiterate, it's pretty sweet that I don't have to worry about type conversions here: reticulate automatically converts between R data frames and pandas data frames in both directions.
From here, we can do our plotting and evaluation in R, just like we would with any other forecast.
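For instance, a simple ggplot2 sketch of the forecast, using the column names from the forecast data frame shown above:

```r
# Plot the forecast and its trend component
forecast %>%
  ggplot(aes(x = ds)) +
  geom_line(aes(y = yhat1), color = "steelblue") +
  geom_line(aes(y = trend), color = "firebrick", linetype = "dashed") +
  labs(
    title = "NeuralProphet forecast of monthly retail sales",
    x = NULL,
    y = "Sales"
  )
```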
Summary
I know this post was pretty short, but I think it's actually a powerful example of what reticulate can do. At the time of writing, `neuralprophet` is only a few weeks old, and I was able to get it up and running in R in a few minutes.
It's easy to think that state-of-the-art Python libraries are out of R's reach, at least until some generous soul creates an R package so the rest of us can use them. But that's not the case! Obviously I hope that I or some other kind soul gets the time to package this up and make it easier for R users to take advantage of, but it's important to know that if you know a little bit of Python, you can take advantage of these packages on day one.
So don't wait on the generosity of others; get in there and get your hands dirty. Who knows, you might be the generous soul who makes state-of-the-art data science available to the masses.