AlphaGo Zero in Depth

A long-standing objective of artificial intelligence is an algorithm that learns superhuman competence by itself. If we succeed in constructing such a system, the repercussions of that discovery will be numerous and diverse, and will change our daily lives as human beings. However, it is not clear how we could give machines the ability to reason and understand the world around them. As of today, many systems, such as Amazon Alexa or Google Home, act as substitutes, letting us glimpse the current limits of artificial intelligence: the fruit of laborious training on huge quantities of collected data.

Sonnet and Attention is All You Need

In this article, I will show you why Sonnet is one of the coolest libraries for TensorFlow, and why everyone should use it.

Injecting Stochastic Noise during Dialogue Generation.

Discussion around a new generative model trained to produce meaningful dialogue utterances using a latent variable at the decoder level.

On the path towards scaling Bitcoin to billions of transactions. Part I.

In this new series, I'll go over recent progress towards making Bitcoin a trusted financial system that is not vulnerable to monopolies.

From the origin of Generative Modeling to the recent Generative Adversarial Network.

A generative model is a model for randomly generating observable data values, typically given some hidden parameters.
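That definition can be sketched in a few lines. The model below is purely illustrative (not from any article here): it draws a hidden parameter z, then generates an observable value x conditioned on z.

```python
import random

# Minimal generative model: sample a hidden (latent) parameter z,
# then generate an observable value x whose distribution depends on z.
def sample():
    z = random.gauss(0.0, 1.0)       # hidden parameter
    x = random.gauss(2.0 * z, 0.5)   # observable value given z
    return z, x

samples = [sample() for _ in range(5)]
```

Each call yields a (hidden, observable) pair; in practice one only sees the observables and must infer the hidden structure.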

On recent progress in dialogue generation

At True Ai, we apply state-of-the-art machine learning models to improve our product helping agents in customer support to answer questions faster and better. As I recently started as an intern, I'm currently spending some time reading the literature. Here is a not exhaustive summary of relevant NLP papers you should read.

The Hyperband Algorithm.

Over the past month, I've been spending some time on Bayesian optimization. It has become my favorite approach to hyperparameter search ... it was time-consuming because each configuration required training a neural network for a number of epochs ruled by early stopping, and I realized that most of the configurations were not promising from the beginning.
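The observation that most configurations look bad early is exactly what Hyperband exploits. Below is an illustrative sketch of successive halving, the core routine inside Hyperband; `evaluate` is a hypothetical stand-in for training a network for a given budget of epochs and returning a validation loss (lower is better).

```python
import random

def evaluate(config, budget):
    # Toy objective: loss shrinks as the budget grows,
    # offset by the configuration's intrinsic quality.
    return config["quality"] + 1.0 / budget

def successive_halving(configs, min_budget=1, eta=3):
    budget = min_budget
    while len(configs) > 1:
        # Evaluate every surviving configuration at the current budget.
        scored = sorted(configs, key=lambda c: evaluate(c, budget))
        # Keep the best 1/eta fraction, give survivors eta times more budget.
        keep = max(1, len(configs) // eta)
        configs = scored[:keep]
        budget *= eta
    return configs[0]

candidates = [{"quality": random.random()} for _ in range(27)]
best = successive_halving(candidates)
```

Unpromising configurations are discarded after only a small budget, so almost all the training time goes to the few candidates that deserve it.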

Image Generation with BEGAN and Skip-Thought Vectors.

In essence, BEGAN uses an auto-encoder as a discriminator. With the auto-encoder, the BEGAN optimization tries to match the distribution of reconstruction errors instead of the usual sample distribution. To train the auto-encoder, they used an L1 loss.
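A minimal sketch of those losses, assuming `autoencode` is a stand-in for the discriminator's auto-encoder: the discriminator wants low L1 reconstruction error on real images and high error on generated ones, while the generator wants its samples reconstructed well.

```python
import numpy as np

def l1_reconstruction(x, autoencode):
    # Mean absolute difference between an image batch and its reconstruction.
    return np.mean(np.abs(x - autoencode(x)))

def began_losses(x_real, x_fake, autoencode, k):
    loss_real = l1_reconstruction(x_real, autoencode)
    loss_fake = l1_reconstruction(x_fake, autoencode)
    d_loss = loss_real - k * loss_fake  # k balances the two error terms
    g_loss = loss_fake
    return d_loss, g_loss

# Toy check with a crude fixed "auto-encoder" that outputs 0.5 everywhere.
x_real = np.ones((4, 8))
x_fake = np.zeros((4, 8))
autoencode = lambda x: 0.5 * np.ones_like(x)
d_loss, g_loss = began_losses(x_real, x_fake, autoencode, k=0.5)
```

In the actual paper, k is itself updated during training to keep the generator and discriminator in equilibrium; here it is a fixed constant for simplicity.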

End-to-End Approach for making a Facebook bot using a Sequence to Sequence model.

As a fun school project with two friends, we decided to work on sequence-to-sequence models, and we spent some time building a chatbot that would answer on our behalf on Facebook.

Support Vector Machine, a Gentle Mathematical Introduction

While talking with a friend, he explained how a Support Vector Machine had performed really well on his latest biological dataset compared to a neural-network approach.