08 Mar 2019, Prathyush SP
Hyperparameter optimization can be formulated as a bilevel optimization problem, where the optimal parameters on the training set depend on the hyperparameters. We aim to adapt regularization hyperparameters for neural networks by fitting compact approximations to the best-response function, which maps hyperparameters to optimal weights and biases.
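The bilevel structure can be sketched in a toy setting. This is not the paper's compact best-response approximation for neural networks; it is a minimal illustrative example where the inner problem (ridge regression) has an exact closed-form best response, so the outer level can search the hyperparameter directly. All data and the lambda grid are made up.

```python
import numpy as np

# Toy bilevel problem: ridge regression, where the best response
# w*(lam) = argmin_w ||X w - y||^2 + lam ||w||^2 has a closed form.
rng = np.random.default_rng(0)
X_tr, y_tr = rng.normal(size=(50, 5)), rng.normal(size=50)
X_val, y_val = rng.normal(size=(20, 5)), rng.normal(size=20)

def best_response(lam):
    """Inner level: exact optimal weights for a given regularizer."""
    d = X_tr.shape[1]
    return np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(d), X_tr.T @ y_tr)

def val_loss(lam):
    """Outer level: validation loss evaluated at the inner optimum."""
    w = best_response(lam)
    return np.mean((X_val @ w - y_val) ** 2)

# Outer-level search over the hyperparameter.
lams = np.logspace(-3, 2, 20)
best_lam = min(lams, key=val_loss)
```

The paper's approach replaces the closed-form `best_response` with a learned, compact approximation, which is what makes the idea scale to neural-network weights.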
08 Mar 2019, Prathyush SP
Recurrent neural networks are a powerful tool for modeling sequential data, but the dependence of each timestep’s computation on the previous timestep’s output limits parallelism and makes RNNs unwieldy for very long sequences.
https://ai.googleblog.com/2019/03/rnn-based-handwriting-recognition-in.html
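The sequential bottleneck is easy to see in code. Below is a minimal Elman-style RNN unrolled over time (sizes and weights are illustrative, not from the linked post): the hidden state at step `t` cannot be computed before step `t-1`, so the loop is inherently serial in `t`.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_in, d_h = 8, 4, 6                      # sequence length and sizes (illustrative)
W_xh = rng.normal(scale=0.1, size=(d_in, d_h))
W_hh = rng.normal(scale=0.1, size=(d_h, d_h))
b = np.zeros(d_h)
xs = rng.normal(size=(T, d_in))

h = np.zeros(d_h)
hs = []
for t in range(T):                          # serial dependency: h[t] needs h[t-1]
    h = np.tanh(xs[t] @ W_xh + h @ W_hh + b)
    hs.append(h)
hs = np.stack(hs)                           # (T, d_h) hidden states
```

For very long sequences this loop dominates runtime, since the `T` steps cannot be parallelized the way the per-timestep matrix multiplies can.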
08 Mar 2019, Prathyush SP
DeepFashion2 is a comprehensive fashion dataset. It contains 491K diverse images of 13 popular clothing categories from both commercial shopping stores and consumers. In total it has 801K clothing items, where each item in an image is labeled with scale, occlusion, zoom-in, viewpoint, category, style, bounding box, dense landmarks, and per-pixel mask. There are also 873K Commercial-Consumer clothes pairs.
The dataset is split into a training set (391K images), a validation set (34K images), and a test set (67K images).
https://github.com/switchablenorms/DeepFashion2
08 Mar 2019, Prathyush SP
State-of-the-art meta reinforcement learning algorithms typically assume the setting of a single agent interacting with its environment in a sequential manner. A negative side effect of this sequential execution paradigm is that, as the environment becomes more and more challenging and thus requires more interaction episodes from the meta-learner, the agent must reason over longer and longer time-scales.
06 Mar 2019, Prathyush SP
Today, we’re excited to announce TensorFlow Privacy (GitHub), an open source library that makes it easier not only for developers to train machine-learning models with privacy, but also for researchers to advance the state of the art in machine learning with strong privacy guarantees.
04 Mar 2019, Prathyush SP
We present an interpretation of Inception modules in convolutional neural networks as being an intermediate step in-between regular convolution and the depthwise separable convolution operation (a depthwise convolution followed by a pointwise convolution). In this light, a depthwise separable convolution can be understood as an Inception module with a maximally large number of towers.
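The two stages of the operation can be sketched directly. This is a naive numpy illustration of a depthwise separable convolution (the construct the abstract names), not the Xception implementation: one k×k filter per input channel, followed by a 1×1 pointwise convolution that mixes channels. Shapes, stride 1, and 'valid' padding are assumptions for the sketch.

```python
import numpy as np

def depthwise_separable_conv(x, dw, pw):
    """x: (H, W, C_in), dw: (k, k, C_in) depthwise filters,
    pw: (C_in, C_out) pointwise (1x1) filters. Stride 1, 'valid' padding."""
    H, W, C_in = x.shape
    k = dw.shape[0]
    Ho, Wo = H - k + 1, W - k + 1
    out_dw = np.empty((Ho, Wo, C_in))
    for c in range(C_in):                   # depthwise: channels do NOT mix
        for i in range(Ho):
            for j in range(Wo):
                out_dw[i, j, c] = np.sum(x[i:i+k, j:j+k, c] * dw[:, :, c])
    return out_dw @ pw                      # pointwise: 1x1 conv mixes channels

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 8, 3))
y = depthwise_separable_conv(x, rng.normal(size=(3, 3, 3)), rng.normal(size=(3, 16)))
print(y.shape)  # (6, 6, 16)
```

Splitting spatial filtering (depthwise) from channel mixing (pointwise) is what gives the Inception-module interpretation: each channel is its own "tower" before the 1×1 projection.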
04 Mar 2019, Prathyush SP
We’re releasing Neural MMO, a massively multiagent game environment that supports numerous populations of agents.
04 Mar 2019, Prathyush SP
Feature visualizations let us see through the eyes of a neural network. The hidden layers of neural networks are quite fun to inspect.
03 Mar 2019, Prathyush SP
Different parts of the brain learn in different ways. Model-free, model-based, and memory-based learning are all just different sides of the same coin, according to Kenji Doya.
03 Mar 2019, Prathyush SP
TFX is a Google-production-scale ML platform based on TensorFlow. It provides a configuration framework and shared libraries to integrate common components needed to define, launch, and monitor your ML system.