Compared with other types of neural networks, the General Regression Neural Network (Specht, 1991) is advantageous in several respects. Being a universal approximator, GRNN has only one tuning parameter to… continue reading.
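The "one tuning parameter" claim can be illustrated with a minimal sketch: a GRNN prediction is a Gaussian-kernel-weighted average of the training targets, and the kernel bandwidth `sigma` is that single parameter. This is a language-agnostic NumPy sketch, not code from the post; the function name `grnn_predict` is made up here.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_new, sigma=0.5):
    """GRNN prediction: Gaussian-kernel-weighted average of training targets.
    sigma, the kernel bandwidth, is the network's single tuning parameter."""
    # squared Euclidean distances between each new point and each training point
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))   # kernel weights
    return (w @ y_train) / w.sum(axis=1)   # normalized weighted average

# toy usage: recover a smooth function from noisy samples
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)
X_test = np.array([[0.0], [1.5]])
preds = grnn_predict(X, y, X_test, sigma=0.3)
print(preds)
```

Because the only free parameter is `sigma`, model selection reduces to a one-dimensional search, which is what makes GRNN attractive compared with networks that need layer sizes, learning rates, and so on.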
Category: Machine learning
A quick introduction to the `recipes` package, from the `tidymodels` family, based on one-hot encoding. Useful for automating some data-preparation tasks. continue reading.
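For readers unfamiliar with one-hot encoding itself, here is a language-agnostic sketch using pandas (the post itself uses R's `recipes`); passing `drop_first=True` mirrors the common k−1-indicator convention for a k-level factor, though whether that matches `recipes::step_dummy()` defaults should be checked against the post.

```python
import pandas as pd

# toy data frame with one nominal column
df = pd.DataFrame({"color": ["red", "green", "blue", "green"],
                   "x": [1, 2, 3, 4]})

# one-hot encode 'color': each remaining level becomes its own 0/1 column;
# drop_first=True keeps k-1 indicators to avoid a redundant column
encoded = pd.get_dummies(df, columns=["color"], drop_first=True)
print(encoded.columns.tolist())
```

The appeal of `recipes` is that this kind of step is declared once and then applied consistently to training and new data, rather than re-coded by hand.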
TensorFlow feature columns provide useful functionality for preprocessing categorical data and chaining transformations, such as bucketization or feature crossing. From R, we use them in the popular “recipes” style, creating and subsequently… continue reading.
In my previous post, https://statcompute.wordpress.com/2019/02/03/sobol-sequence-vs-uniform-random-in-hyper-parameter-optimization/, I showed the difference between uniform pseudo-random and quasi-random number generators in the hyper-parameter optimization of machine learning. Latin Hypercube Sampling… continue reading.
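As a quick sketch of what distinguishes Latin Hypercube Sampling from plain uniform sampling (independent of the post's own code): with n LHS points, each of the n equal-width strata along every dimension contains exactly one point, so no region of the search space is left empty. This uses `scipy.stats.qmc`, assumed available.

```python
import numpy as np
from scipy.stats import qmc

# draw 8 points in [0, 1)^2 with Latin Hypercube Sampling
sampler = qmc.LatinHypercube(d=2, seed=42)
lhs = sampler.random(n=8)

# stratification property: each of the 8 equal-width bins per dimension
# holds exactly one point
for dim in range(2):
    bins = np.floor(lhs[:, dim] * 8).astype(int)
    assert sorted(bins) == list(range(8))

# plain uniform sampling offers no such guarantee: points may cluster
uniform = np.random.default_rng(42).random((8, 2))
print(lhs.shape, uniform.shape)
```

In hyper-parameter search, this stratification means a small budget of trials still covers each parameter's range evenly.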
This was a thread on Twitter, and I thought it should reach a different audience here too. If you wonder why … More continue reading.
Machine learning models grow more powerful every week, but the earliest models and the most recent state-of-the-art models share the exact same dependency: data quality. The maxim “garbage in –… continue reading.
More Google StreetView ideas. Suppose you wanted a measure of infrastructure investment, or of fragility because maintenance has been cut … More continue reading.
You have a nominal predictor variable with many values. That is to say, there are many categories and they do … More continue reading.
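One common remedy for high-cardinality nominal predictors is impact (target) coding: replace each level with a smoothed mean of the outcome, shrunk toward the global mean so rare levels don't overfit. This is a generic sketch of that family of techniques, not the post's own method; the helper `impact_code` and its `smoothing` parameter are made up here.

```python
import pandas as pd

def impact_code(train, cat_col, y_col, smoothing=10.0):
    """Replace a high-cardinality category with a smoothed per-level mean
    of the outcome, shrunk toward the global mean."""
    global_mean = train[y_col].mean()
    stats = train.groupby(cat_col)[y_col].agg(["mean", "count"])
    # shrinkage: rare levels lean on the global mean, common levels on their own
    smoothed = (stats["count"] * stats["mean"] + smoothing * global_mean) / (
        stats["count"] + smoothing
    )
    return train[cat_col].map(smoothed).fillna(global_mean)

df = pd.DataFrame({"zip": ["a", "a", "b", "c"], "y": [1.0, 0.0, 1.0, 0.0]})
codes = impact_code(df, "zip", "y").round(3).tolist()
print(codes)
```

In practice the per-level statistics must be estimated on held-out data (or via cross-fitting) to avoid leaking the outcome into the encoded feature.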
As of today, there is no mainstream road to obtaining uncertainty estimates from neural networks. All that can be said is that, normally, approaches tend to be Bayesian in spirit, … continue reading.
Explore the intersection of concepts such as dimensionality reduction, clustering, data preparation, PCA, HDBSCAN, k-NN, SOM, deep learning… and Carl Sagan! continue reading.
Could you #BeatTheAI? We let deep learning have a go at Super Mario’s first level and compared it to human players. Here we explain how we did it! continue reading.
This post builds on our recent introduction to multi-level modeling with tfprobability, the R wrapper to TensorFlow Probability. We show how to pool not just mean values (“intercepts”), but also… continue reading.
I picked this little book up at a railway station for two reasons: as a trainer, I wanted to find … More continue reading.
In the post https://statcompute.wordpress.com/2019/04/27/more-general-weighted-binning, I showed how to do weighted binning with the wqtl_bin() function by iterative partitioning. However, the outcome from wqtl_bin() can sometimes be too coarse. … continue reading.