In the News
Comparing Artificial Artists
Last week, German scientists published an interesting article introducing a Deep Neural Network that creates artistic images of high perceptual quality. The system uses neural representations to separate and recombine content and style of arbitrary images, providing a neural algorithm for the creation of artistic images.
This has led many coders to try to reproduce the results. Code examples have sprung up here and here.
Teaching tomorrow
Sebastian Thrun, the pioneer of Google’s autonomous cars, wants to teach people how to face the future.
Doing Data Science at Twitter
Reflections by a Twitter employee on Data Science and Machine Learning at Twitter.
Learning
Understanding LSTM Networks
Long short-term memory (LSTM) is a type of recurrent neural network architecture. It is well-suited to learning from experience to classify, process, and predict time series when there are very long time lags of unknown duration between important events. This is a great walk-through.
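To make the idea concrete, here is a minimal sketch of a single LSTM time step in NumPy. It is illustrative only (the walk-through linked above covers the full picture); the weight layout and variable names are assumptions, not any particular library's API. The key point is the additive cell-state update, which lets information survive across long time lags:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step: gates decide what to forget, write, and expose.

    x: input vector of size D; h_prev, c_prev: previous hidden and cell
    state of size H; W: weights of shape (4*H, D+H); b: bias of shape (4*H,).
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[:H])        # forget gate: how much old cell memory to keep
    i = sigmoid(z[H:2*H])     # input gate: how much new candidate to write
    o = sigmoid(z[2*H:3*H])   # output gate: how much cell state to expose
    g = np.tanh(z[3*H:])      # candidate cell update
    c = f * c_prev + i * g    # additive path that preserves long-range info
    h = o * np.tanh(c)        # new hidden state
    return h, c

# Toy run over a short random sequence
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(D), h, c, W, b)
```

Because the forget gate multiplies the old cell state rather than repeatedly squashing it through a nonlinearity, gradients decay far more slowly than in a plain recurrent network.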
How Etsy uses Thermodynamics to help you search for “Geeky”
Users on Etsy sometimes enter very broad search queries such as "Geeky" or "Original". Such queries need to be treated in a special way. In this post, an Etsy employee details how they developed and iterated on a heuristic for classifying queries as being broad.
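The post's thermodynamics framing suggests an entropy-style signal: a broad query's results scatter across many categories, a specific query's results concentrate in one. The sketch below is a plausible illustration of that idea, not Etsy's actual implementation; the function names and the threshold are made up for the example:

```python
import math
from collections import Counter

def category_entropy(categories):
    """Shannon entropy (bits) of the category distribution of a query's results.

    A broad query like "geeky" spreads over many categories (high entropy);
    a specific query like "red wool mittens" concentrates (low entropy).
    """
    counts = Counter(categories)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def is_broad(categories, threshold=2.0):
    # Illustrative cutoff; in practice it would be tuned on labeled queries.
    return category_entropy(categories) > threshold

specific = ["mittens"] * 9 + ["hats"]
broad = ["jewelry", "mugs", "posters", "t-shirts",
         "stickers", "pins", "bags", "patches"]
print(is_broad(specific), is_broad(broad))  # False True
```

Once a query is flagged as broad, the search results can be treated differently, for instance by diversifying across categories instead of ranking purely by relevance.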
Math for ML course
Linear Algebra and Calculus for Machine Learning. This math course is useful for those who want a refresher, or a detailed treatment of the mathematical notions underlying ML.
Software tools & code
Caffe2
Here is v2 of Caffe, a popular deep learning library for image processing. This new version is still in development, but it is an interesting attempt at refactoring Caffe, making it easier to use and adapting it to Machine Learning tasks beyond vision. One to watch.
Mocha.jl - Deep Learning for Julia
Mocha.jl is a deep learning library for Julia, a new programming language designed specifically for scientific and numerical computing, with features such as type inference and multiple dispatch.
Hardware
Brain-Inspired chip can perform 46 billion synaptic operations per second
Since 2008, IBM researchers have been working on a chip that works like the neurons inside your brain, and they have just announced an exciting breakthrough: a system made up of 48 million artificial nerve cells, roughly what you'd find in the brain of a small rodent.
Some thoughts
10 Deep Learning startups from Indian founders to watch out for
Applications cover computer vision, marketing, search, news, and more.
About
This newsletter is a weekly collection of AI news and resources.
If you find this newsletter worthwhile, please forward to your friends and colleagues, or share on your favorite network!
Share on Twitter · Share on LinkedIn · Share on Google+
Suggestions or comments are more than welcome, just reply to this email. Thanks!