
Exploring Alphacast [Outdated]

By Alphacast

Discover the new "Exploring Alphacast" Repository here

https://www.alphacast.io/repositories/1862/insights

Charts

Insights

    Importing data into Excel


    Excel allows you to add data from different sources. We offer the Alphacast Excel Add-In to simplify downloading data; however, some customers may not be able to install an Add-In. Here is an alternative way to embed data into Excel, using our TSV data source. Getting TSV download links: when you navigate through Datasets in Alphacast, you will find a Download link. If you filter variables, the download link respects your selection, and only the filtered variables will be downloaded. You will find an option to download Tab Separated Values (TSV), which makes it easier to embed data into Excel. Right-click the TSV Download link and choose "Copy Link Address". Note that the URL already contains your personal api_key and should not be shared publicly with users outside your team. Now go to Excel, select Get Data -> From Other Sources -> From Web, and enter the URL you copied from the Alphacast website. The Excel team published a super easy video describing how to do...
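    If you want the same TSV data outside Excel, here is a minimal sketch in Python with pandas. The inline sample and its column names are made up for illustration; a real copied link embeds your personal api_key, so it is not reproduced here.

```python
import io
import pandas as pd

# A tiny inline TSV standing in for the content behind a copied
# "Download TSV" link. Columns here are illustrative only.
sample = (
    "Entity\tYear\tcpi_mom\n"
    "Argentina\t2021-01-04\t0.9\n"
    "Argentina\t2021-01-11\t1.1\n"
)
df = pd.read_csv(io.StringIO(sample), sep="\t")

# With a real copied link you would instead do:
# df = pd.read_csv(tsv_link, sep="\t")
```

    Because the api_key travels inside the link, treat any script containing it like a credential file.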

    How to merge the content of two datasets?


    In your day-to-day work with data you have probably needed to combine several data sources, and if your tool of choice is Excel you likely solve it with some combination of the VLOOKUP, HLOOKUP and/or MATCH formulas. Excel is a great solution in many cases, but it tends to cause difficulties in some scenarios. For example, when... ...you have MANY rows: VLOOKUP can have performance problems and become very slow; ...you need to look up more than one field to combine the data; ...the position of the rows or columns changes; ...you only need the data present in both datasets; ...one of the data sources changes its number of rows and you have to copy or adjust the formulas. With Alphacast you can use pipelines to combine datasets and keep them connected. Step 1. Choose a data source. To merge two datasets, first go to the Create new button and choose pipeline v2.0. Once there, select the repository where the pipeline will be saved and type the desired name. In Fetch dataset, select the required dataset. Press the Save button. Step 2. Select the data source to merge. Then click Add step below and choose the Merge with Dataset option, where...
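    The kind of key-based join that the Merge with Dataset step performs can be sketched in pandas. The two mini-datasets below are hypothetical; Alphacast's actual merge runs server-side in the pipeline.

```python
import pandas as pd

# Two hypothetical Alphacast-style datasets sharing the Entity/Year keys.
cpi = pd.DataFrame({
    "Entity": ["Argentina", "Brazil"],
    "Year": ["2021-01-01", "2021-01-01"],
    "cpi": [3.9, 0.8],
})
fx = pd.DataFrame({
    "Entity": ["Argentina", "Brazil"],
    "Year": ["2021-01-01", "2021-01-01"],
    "fx": [98.5, 5.4],
})

# how="inner" keeps only rows present in both datasets, matching the
# "only the data in both sources" scenario above. Because rows are
# matched by key, reordering rows or columns does not break the join.
merged = cpi.merge(fx, on=["Entity", "Year"], how="inner")
```

    Unlike VLOOKUP, a key-based merge handles multi-column keys and changing row counts without adjusting any formulas.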

    Alphacast's Integrations: Interacting with the API on R

Integrating Alphacast with R (YOU ARE HERE)
Integrating Alphacast with Python
How to install the Excel add-in
Download data with the Excel add-in

Introduction and prerequisites

Getting data from Alphacast with R is really easy. You need the Alphacast API Key and some common R packages. With a few simple steps, you can get entire dataframes, dataset indexes, repository names and repository contents, ready for processing and analysis.

To work with the Alphacast API from R we recommend installing and loading the following libraries:

```r
install.packages(c("dplyr", "httr", "reshape2"))
library(dplyr)
library(httr)
library(reshape2)
```

To make your API workflow easier, we recommend creating an object named "alphacast_api_key" with your own key. This makes working with your Alphacast credentials faster. Remember that you can get your API credentials from the Alphacast Settings menu. For example:

```r
alphacast_api_key <- "YOUR_API_KEY"
```

Getting all available datasets in Alphacast

Before starting to work with the API, you may find it useful to have an index of all the datasets available on the platform at your user level. A few lines of code achieve this using the libraries above. First, point R at the Alphacast endpoint that returns the index in JSON format. Authentication is handled by the authenticate() function; note that the Alphacast API does not use a user and password but works with a single API Key.

```r
datasets <- GET("https://api.alphacast.io/datasets",
                authenticate(user = alphacast_api_key, password = ""))
```

To clean up the response and turn it into a dataframe, use the bind_rows() command from the dplyr package together with the content() function from the httr package.
```r
datasets <- bind_rows(content(datasets))[, -5]
head(datasets)
```

|   id|name                                                     |database                                       |
|----:|:--------------------------------------------------------|:----------------------------------------------|
| 5208|High Frequency CPI - Argentina - Wide - Weekly           |Alphacast Basics: Argentina High Frequency CPI |
| 5225|High Frequency CPI - Argentina - Weekly                  |Alphacast Basics: Argentina High Frequency CPI |
| 5226|High Frequency CPI - Argentina - SEIDO vs INDEC - Weekly |Alphacast Basics: Argentina High Frequency CPI |
| 5231|Public Opinion - Latin America                           |SEIDO: Latin American Public Opinion           |
| 5236|Public Opinion - Argentina                               |SEIDO: Latin American Public Opinion           |
| 5241|Public Opinion - Argentina - COVID-19                    |SEIDO: Latin American Public Opinion           |

Getting dataframes from Alphacast

To obtain a dataframe, call the GET function (from the httr package) with the id of the dataset you want and your API key. For example, to get the data from dataset 6659 (Apple Mobility Report):

```r
dataset_id <- 6659
apple_mob <- GET(paste("https://api.alphacast.io/datasets/", dataset_id, ".csv", sep = ""),
                 authenticate(user = alphacast_api_key, password = ""))
apple_mob <- readr::read_csv(content(apple_mob, as = "text"), guess_max = 100000)
head(apple_mob)
```

|Entity  |Year       | driving| walking| driving - 7d_running_av| walking - 7d_running_av|transit |transit - 7d_running_av|
|:-------|:----------|-------:|-------:|-----------------------:|-----------------------:|:-------|:----------------------|
|Albania |2020-01-13 |  100.00|  100.00|                      NA|                      NA|NA      |NA                     |
|Albania |2020-01-14 |   95.30|  100.68|                      NA|                      NA|NA      |NA                     |
|Albania |2020-01-15 |  101.43|   98.93|                      NA|                      NA|NA      |NA                     |
|Albania |2020-01-16 |   97.20|   98.46|                      NA|                      NA|NA      |NA                     |
|Albania |2020-01-17 |  103.55|  100.85|                      NA|                      NA|NA      |NA                     |
|Albania |2020-01-18 |  112.67|  100.13|                      NA|                      NA|NA      |NA                     |

The previous code saves the dataframe of the "Apple Mobility Report" in the object "apple_mob".
From here, you can do whatever you want with it: graph it, analyze it, export it as CSV or JSON, among other things. It is also easy to transform the dataframe to long format using the reshape2 package, since all Alphacast datasets contain the "Year" and "Entity" columns.

```r
apple_mob_long <- melt(apple_mob, id.vars = c("Entity", "Year"))
```

Getting repositories and their datasets

You can get all the repositories available to your level of access:

```r
repos <- GET("https://api.alphacast.io/repositories",
             authenticate(user = alphacast_api_key, password = ""))
repos <- bind_rows(content(repos))
```

You can also access the index of datasets of a given repo. For example, to get all the datasets from the repo "Argentina's daily financial data":

```r
repo_id <- 21
repos_datasets <- GET("https://api.alphacast.io/datasets",
                      query = list(repo_id = repo_id),
                      authenticate(user = alphacast_api_key, password = ""))
repos_datasets <- bind_rows(content(repos_datasets))
head(repos_datasets)
```

|   id|name                                                         |createdAt           |updatedAt           | repositoryId|
|----:|:------------------------------------------------------------|:-------------------|:-------------------|------------:|
| 5266|Base FCI - Renta Variable                                    |2020-10-22T22:38:21 |2020-10-22T22:38:21 |           21|
| 5273|Base FCI - Renta Fija                                        |2020-10-27T16:43:04 |2020-10-27T16:43:04 |           21|
| 5288|Financial - Argentina - FX premiums - Daily                  |2020-11-01T17:32:02 |2020-11-01T17:32:02 |           21|
| 5289|Financial - Argentina - FX premiums - Daily_Long             |2020-11-01T17:33:03 |2020-11-01T17:33:03 |           21|
| 5341|Financial - Argentina - Sovereign Bonds                      |2020-11-12T12:30:03 |2020-11-12T12:30:03 |           21|
| 5357|Financial - Argentina - Sovereign Bonds - Last Price - Daily |2020-11-19T16:30:03 |2020-11-19T16:30:03 |           21|

Creating repositories in Alphacast

You can create your own repository and later upload datasets to it. First, set some variables in your R environment.
```r
url <- "https://api.alphacast.io/repositories"
form <- list(
  "name" = "Repo's Name",
  "description" = "Test Repo - description",
  "privacy" = "Private",
  "slug" = "test-rrr-repo")
```

Then post it to the Alphacast server with the POST function:

```r
r <- POST(url = url, body = form,
          config = authenticate(user = alphacast_api_key, password = ""))
content(r)
```

|  id|name        |description             |privacy |slug          |
|---:|:-----------|:-----------------------|:-------|:-------------|
| 610|Repo's Name |Test Repo - description |Private |test-rrr-repo |

In this way repo "610" is created, and it can be checked from your admin page on the Alphacast web.

Uploading data to your repo

Once the repo is created, you need to create the slot for the dataset you want to upload. The system automatically generates the id of the dataset.

```r
url <- "https://api.alphacast.io/datasets"
form <- list(
  "name" = "test_datasets",
  "repositoryId" = 610)
r <- POST(url = url, body = form,
          config = authenticate(user = alphacast_api_key, password = ""))
content(r)
```

$id
6822

In this example, id 6822 was assigned to the dataset. Next, build the URL for the PUT request that uploads the CSV to Alphacast and makes it appear in the repo.

```r
dataset_id <- 6822
url <- paste("https://api.alphacast.io/datasets/", dataset_id,
             "/data?deleteMissingFromDB=True&onConflictUpdateDB=True", sep = "")
```

Finally, you can upload the dataset in CSV format, with columns named Entity (for countries) and Year (for dates, in YYYY-MM-DD format). In this case, the "tcn.csv" file is uploaded from the indicated path (located in the root folder of the R project).
```r
r <- PUT(url, body = list(data = upload_file("tcn.csv")),
         config = authenticate(user = alphacast_api_key, password = ""))
content(r)
```

|  id|status    |createdAt                  | datasetId|
|---:|:---------|:--------------------------|---------:|
| 614|Requested |2021-07-26T20:59:21.494134 |      6822|

And in this way the file is uploaded to your own repository, where you can share it, transform it or graph it.
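The upload endpoint and its two query flags are easy to get subtly wrong, so here is a small Python sketch that only rebuilds the PUT URL from the example above. The commented requests call is an assumption mirroring the R flow, left inert so nothing hits the live API.

```python
BASE = "https://api.alphacast.io"

def put_data_url(dataset_id, delete_missing=True, on_conflict_update=True):
    # Rebuilds the PUT endpoint shown in the R example, including the
    # two query flags that control overwrite behaviour on upload.
    return (
        f"{BASE}/datasets/{dataset_id}/data"
        f"?deleteMissingFromDB={delete_missing}"
        f"&onConflictUpdateDB={on_conflict_update}"
    )

url = put_data_url(6822)
# With the requests package (assumed installed) the upload would look like:
# requests.put(url, auth=(alphacast_api_key, ""),
#              files={"data": open("tcn.csv", "rb")})
```

As in R, authentication uses HTTP Basic auth with the API key as the user and an empty password.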


    Pipelines

| Pipeline  | Last Edited |
|-----------|-------------|
| New Chart | 4 days ago  |
| New Chart | 4 days ago  |
| New Chart | 4 days ago  |