Alphacast Friday Update - January 21st

What a week! As we described in recent Friday updates, we are focusing our efforts on two fronts: more data and a more powerful pipeline engine. Let us describe where we stand and where we are heading.

New premium publisher on the marketplace


We are excited to introduce our new premium publisher, which brings a lot of new content for Uruguay: 60 public datasets, plus plenty of subscription-only content. You can subscribe and receive daily updates of more than 50 private datasets for just USD 20 per month following this link.

Lots of new global data

There are hundreds of new datasets in Alphacast, and the pace at which new content is added keeps accelerating. We look forward to adding lots of new content in the coming weeks.

With so much data comes great responsibility. We understand that entropy tends to disorganize the content in Alphacast, so we are working to improve the overall search and explore experience. Some ideas we plan to explore in the coming weeks:

  • A new "Netflix-like" home for the Explore section that organizes the data with categories, trending datasets, and a What's new feed (from your network? In Alphacast?)
  • A People section, to discover what other users are creating
  • Related content based on your activity ("you may be interested in") and on the dataset or chart you are viewing
  • Filter by tags on insights, filter by users, and more.
  • Improved metadata of the content
  • Language standardization and multilanguage support

New pipelines engine is in beta testing.

The new pipelines engine is in beta testing. We are working to stabilize the overall experience and to make a publicly available version soon.

We have to be careful not to overpromise, but our aim with the new pipelines engine is to have a much more powerful and flexible infrastructure to process data within Alphacast and, eventually, to let users program their own custom transformations in Python.
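To make this more concrete, here is a minimal sketch of what a user-defined transformation could look like. The eventual Alphacast API is not settled, so the function signature and column names below are purely illustrative assumptions, built on plain pandas.

```python
import pandas as pd

# Hypothetical shape of a custom transformation: receive a dataset as a
# DataFrame, return the transformed dataset. Names are illustrative only.
def transform(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Year-over-year percent change of a monthly "value" column
    out["yoy_pct"] = out["value"].pct_change(12) * 100
    return out

# Toy monthly data to exercise the transformation
data = pd.DataFrame({"value": [100 + 2 * i for i in range(24)]})
result = transform(data)
```

The appeal of this shape is that any DataFrame-in, DataFrame-out function can slot into a pipeline step without the platform needing to know what it does internally.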

Our goal is to go beyond basic transformations and arithmetic on the data and include transformations such as smoothing and filters, complex unit conversions, econometric models and statistics, and more (e.g. finance-related algorithms). To give a glimpse of our roadmap, in the coming months we hope to add the following steps:

  • Basic transforms: Filter rows, Filter columns, Resample frequency, Group by, Pivot, Melt, Merge, Shift (lead and lag)
  • Basic arithmetic: Variation, Difference, X-to-date sum/change/average, and custom formulas
  • Smoothing & filters: X-12 seasonal adjustment, Moving average, Running sum, Hodrick-Prescott filter, Kalman filter, Data normalization, and holiday removal
  • Unit conversion: % of GDP, USD / LCU, USD (BCS) / LCU, Constant Prices, Per Capita or PPP conversion
  • Models and related: Linear Regression, ARIMA, Principal Component Analysis, Recurrent Neural Networks, VARs, and VECs
  • Statistics and tests: Basic summary stats, Covariance matrix, Histograms, Granger causality, Johansen's cointegration test, Augmented Dickey-Fuller (ADF) test for stationarity, Durbin-Watson statistic, Rank & Percentiles
  • Finance: Nelson-Siegel-Svensson and splines or cubic splines for yield-curve interpolation
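For a sense of what the first two groups of steps amount to, here is a sketch of a few of them using plain pandas. The pipeline steps themselves are not yet public, so this only illustrates the underlying operations on an assumed monthly series; the data is made up.

```python
import pandas as pd
import numpy as np

# Illustrative monthly series (values are synthetic)
idx = pd.date_range("2021-01-01", periods=24, freq="MS")
s = pd.Series(np.linspace(100.0, 146.0, 24), index=idx)

quarterly = s.resample("QS").mean()  # Resample frequency: monthly -> quarterly
diff = s.diff()                      # Difference (period-over-period)
lag1 = s.shift(1)                    # Shift (lag by one period)
ma3 = s.rolling(3).mean()            # Smoothing: 3-month moving average
```

Each of these is a one-liner in pandas; the value of packaging them as pipeline steps is that they become composable and repeatable on every dataset refresh without writing code.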

As you can see, there is lots of ground to cover! Looking forward to hearing your thoughts.
