Deep Learning & Graph Embedding Seminar (2015 Fall)

Every Friday 1-2 pm at 366 WVH.

Schedule (each entry lists the date, paper(s), presenter, and slides):

1. 09/24
Paper: Distributed Representations of Words and Phrases and their Compositionality
Further references:
  • Neural Word Embedding as Implicit Matrix Factorization
Want to learn more about Noise-Contrastive Estimation?
  • Noise-Contrastive Estimation of Unnormalized Statistical Models, with Applications to Natural Image Statistics
  • A fast and simple algorithm for training neural probabilistic language models
Presenter: Ting
Slides: keynote
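
For anyone new to the week 1 material, below is a minimal sketch of the skip-gram objective with negative sampling from the Mikolov et al. paper. The toy vocabulary size, dimensions, and training pairs are invented for illustration, and negatives are drawn uniformly rather than from the unigram^(3/4) distribution used in the paper.

```python
import numpy as np

# Toy sketch of skip-gram with negative sampling (SGNS).
# Vocabulary size, embedding dimension, and training pairs are invented for illustration.
rng = np.random.default_rng(0)
V, d, k, lr = 50, 16, 5, 0.05                # vocab size, embedding dim, negatives per pair, learning rate

W_in = rng.normal(scale=0.1, size=(V, d))    # "input" (center word) vectors
W_out = rng.normal(scale=0.1, size=(V, d))   # "output" (context word) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context):
    """One SGD step on a single (center, context) pair."""
    negatives = rng.integers(0, V, size=k)   # unigram^(3/4) sampling in the real model; uniform here
    v_c = W_in[center]

    # Positive pair: push sigmoid(v_c . u_context) toward 1.
    # Negative pairs: push sigmoid(v_c . u_neg) toward 0.
    targets = np.concatenate(([context], negatives))
    labels = np.concatenate(([1.0], np.zeros(k)))
    u = W_out[targets]                       # (k+1, d)
    scores = sigmoid(u @ v_c)                # (k+1,)
    grad_scores = scores - labels            # derivative of the log-loss w.r.t. u . v_c

    grad_v_c = grad_scores @ u               # (d,)
    grad_u = np.outer(grad_scores, v_c)      # (k+1, d)
    W_in[center] -= lr * grad_v_c
    np.add.at(W_out, targets, -lr * grad_u)

# Usage: iterate over (center, context) pairs extracted from corpus windows.
for center, context in [(3, 7), (7, 3), (10, 11)]:
    sgns_step(center, context)
print(W_in[3][:4])
```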

2. 10/02
Continuation of last week's presentation
Presenter: Ting

3. 10/09
Paper deadline

4. 10/16
Paper deadline

5. 10/23
Paper: LINE: Large-scale Information Network Embedding
Other references:
  • PTE: Predictive Text Embedding through Large-scale Heterogeneous Text Networks
  • DeepWalk: Online Learning of Social Representations
Presenter: Yuan
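
As a companion to the week 5 papers, here is a minimal sketch of the walk-generation step from DeepWalk: truncated random walks over the graph are treated as "sentences" and then fed to a skip-gram trainer (omitted here; see the week 1 sketch). The toy graph, walk length, and number of walks per node are invented for illustration.

```python
import random

# Toy adjacency list; graph, walk length, and walk count are made up for illustration.
graph = {
    0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3],
}

def random_walk(graph, start, length, rng):
    """Truncated random walk used by DeepWalk to generate node 'sentences'."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = graph[walk[-1]]
        if not neighbors:
            break
        walk.append(rng.choice(neighbors))
    return walk

def build_corpus(graph, walks_per_node=10, walk_length=8, seed=0):
    """Each walk becomes one 'sentence' of node ids; skip-gram then learns node embeddings."""
    rng = random.Random(seed)
    corpus = []
    for _ in range(walks_per_node):
        nodes = list(graph)
        rng.shuffle(nodes)            # DeepWalk shuffles the node order on each pass
        for node in nodes:
            corpus.append(random_walk(graph, node, walk_length, rng))
    return corpus

corpus = build_corpus(graph)
print(corpus[0])   # e.g. a walk such as [2, 1, 3, 4, 3, 1, 0, 2]
```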

6. 10/30
Topic: Backpropagation
Paper: Heterogeneous Network Embedding via Deep Architectures
Presenter: Yupeng
Slides: Here
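
Since week 6 covers backpropagation itself, here is a minimal sketch of the algorithm on a two-layer network with a squared-error loss. The layer sizes, toy data, and learning rate are invented for the example.

```python
import numpy as np

# Minimal two-layer network trained by backpropagation on a toy regression task.
# Layer sizes, data, and learning rate are invented for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))                  # 32 samples, 4 features
y = (X.sum(axis=1, keepdims=True) > 0) * 1.0  # toy targets

W1, b1 = rng.normal(scale=0.5, size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)
lr = 0.1

for _ in range(200):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: apply the chain rule layer by layer.
    d_yhat = 2 * (y_hat - y) / len(X)         # dL/d(y_hat)
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T
    d_hpre = d_h * (1 - h ** 2)               # tanh'(x) = 1 - tanh(x)^2
    dW1 = X.T @ d_hpre
    db1 = d_hpre.sum(axis=0)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")
```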

7. 11/06
Topic: Convolutional Neural Networks
Paper: A Convolutional Neural Network for Modelling Sentences
Presenter: Rui Dong
Slides:
  • Convolutional Neural Network.pdf
  • Supplement: CNN for modelling sentences
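
To make the week 7 topic concrete, here is a minimal sketch of a narrow 1-D convolution with max-over-time pooling over a sentence of word vectors. This is a simplified building block, not the wide convolution and dynamic k-max pooling used in the paper; the sentence length, embedding size, and filter shapes are invented for illustration.

```python
import numpy as np

# One bank of 1-D convolutional filters over a sentence of word vectors, with max pooling.
# Sentence length, embedding size, and filter width are invented for illustration.
rng = np.random.default_rng(0)
sent_len, emb_dim, width, n_filters = 9, 6, 3, 4

sentence = rng.normal(size=(sent_len, emb_dim))          # one word vector per token
filters = rng.normal(size=(n_filters, width, emb_dim))   # each filter spans `width` consecutive words
bias = np.zeros(n_filters)

def conv_max_pool(sentence, filters, bias):
    """Narrow convolution over time followed by max-over-time pooling."""
    sent_len, _ = sentence.shape
    n_filters, width, _ = filters.shape
    n_windows = sent_len - width + 1
    feature_map = np.empty((n_windows, n_filters))
    for t in range(n_windows):
        window = sentence[t:t + width]                    # (width, emb_dim)
        feature_map[t] = np.tensordot(filters, window, axes=([1, 2], [0, 1])) + bias
    activated = np.maximum(feature_map, 0.0)              # ReLU nonlinearity
    return activated.max(axis=0)                          # one feature per filter

features = conv_max_pool(sentence, filters, bias)
print(features.shape)   # (4,) -> would feed a classifier layer in a full model
```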

8. 11/13
Recurrent Neural Networks:
  • More RNN material in the DLw4NLP class
  • The Unreasonable Effectiveness of Recurrent Neural Networks
Computation graphs and automatic differentiation:
  • Programming Models for Deep Learning
  • Automatic Differentiation: The most criminally underused tool in the potential machine learning toolbox?
Presenter: Ting
Slides:
  • Computation graph model
  • RNN-from-Richard.pdf
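
Since week 8 introduces computation graphs and automatic differentiation, here is a minimal sketch of reverse-mode autodiff on a scalar expression graph. The Node class and the example expression are invented for illustration and are far simpler than a real framework's graph machinery.

```python
import math

class Node:
    """A scalar in a computation graph; hypothetical minimal reverse-mode autodiff."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents      # list of (parent_node, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Node(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Node(self.value * other.value, [(self, other.value), (other, self.value)])

    def tanh(self):
        t = math.tanh(self.value)
        return Node(t, [(self, 1.0 - t * t)])

    def backward(self):
        """Propagate d(output)/d(node) to every ancestor by the chain rule."""
        self.grad = 1.0
        order, seen = [], set()
        def topo(node):             # topological order: parents before children
            if id(node) not in seen:
                seen.add(id(node))
                for parent, _ in node.parents:
                    topo(parent)
                order.append(node)
        topo(self)
        for node in reversed(order):
            for parent, local_grad in node.parents:
                parent.grad += node.grad * local_grad

# Usage: y = tanh(w * x + b); gradients w.r.t. w, x, b land in .grad.
w, x, b = Node(0.5), Node(2.0), Node(-0.3)
y = (w * x + b).tanh()
y.backward()
print(y.value, w.grad, x.grad, b.grad)
```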

9. 11/20
References:
  • Hinton's online NN class
  • http://colah.github.io/posts/2015-08-Understanding-LSTMs/
  • Awesome RNN collection
Applications:
  • Sentiment analysis
  • Sequence to Sequence Learning with Neural Networks
  • A Neural Conversational Model
  • Skip-Thought Vectors (sentence representations)
Presenter: Ting
Slides: RNN.pptx
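
For the week 9 LSTM material, here is a minimal sketch of a single LSTM cell step (forget, input, and output gates plus a candidate cell state), following the structure described in the Understanding-LSTMs post. The sizes and random weights are invented for the example; a real model would batch this and learn the parameters.

```python
import numpy as np

# One step of an LSTM cell: forget, input, and output gates plus a candidate cell state.
# Hidden size, input size, and random weights are invented for illustration.
rng = np.random.default_rng(0)
input_size, hidden_size = 5, 8

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One weight matrix and bias per gate, each acting on [h_prev, x] concatenated.
W = {g: rng.normal(scale=0.1, size=(hidden_size, hidden_size + input_size)) for g in "fioc"}
b = {g: np.zeros(hidden_size) for g in "fioc"}

def lstm_step(x, h_prev, c_prev):
    z = np.concatenate([h_prev, x])
    f = sigmoid(W["f"] @ z + b["f"])          # forget gate
    i = sigmoid(W["i"] @ z + b["i"])          # input gate
    o = sigmoid(W["o"] @ z + b["o"])          # output gate
    c_tilde = np.tanh(W["c"] @ z + b["c"])    # candidate cell state
    c = f * c_prev + i * c_tilde              # new cell state
    h = o * np.tanh(c)                        # new hidden state
    return h, c

# Usage: run the cell over a toy sequence of random inputs.
h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
for x in rng.normal(size=(4, input_size)):
    h, c = lstm_step(x, h, c)
print(h.shape)   # (8,)
```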

10. 11/27
Thanksgiving

11. 12/04
Restricted Boltzmann Machines:
  • An introduction to RBM
  • RBM training
Deep Boltzmann Machines:
  • DBM
  • Efficient learning for DBM
Presenter: Yupeng
Slides: Here
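
To accompany the week 11 RBM readings, here is a minimal sketch of one contrastive-divergence (CD-1) update for a binary RBM. The layer sizes, toy data, and learning rate are invented for the example.

```python
import numpy as np

# One CD-1 (contrastive divergence) update for a binary restricted Boltzmann machine.
# Visible/hidden sizes, toy data, and learning rate are invented for illustration.
rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 6, 4, 0.1

W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0):
    """One CD-1 step on a single binary visible vector v0."""
    global W, b_v, b_h
    # Positive phase: hidden probabilities given the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(n_hidden) < p_h0).astype(float)
    # Negative phase: one Gibbs step to get a "reconstruction".
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(n_visible) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # Update with the difference between data and model statistics.
    W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
    b_v += lr * (v0 - v1)
    b_h += lr * (p_h0 - p_h1)

# Usage: a few updates on a toy binary pattern.
v = np.array([1, 0, 1, 1, 0, 0], dtype=float)
for _ in range(100):
    cd1_update(v)
print(sigmoid(v @ W + b_h))   # hidden activations after training
```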

12. 12/11
Topic: Autoencoders
Paper: Semi-supervised Learning of Compact Document Representations with Deep Networks
Theory part:
  • Extracting and Composing Robust Features with Denoising Autoencoders
  • Stacked Denoising Autoencoders: Learning Useful Representations in a Deep Network with a Local Denoising Criterion
Application part:
  • Image Denoising and Inpainting with Deep Neural Networks
  • Building High-level Features Using Large Scale Unsupervised Learning
  • Collaborative Deep Learning for Recommender Systems
Presenter: Yuan
Slides: autoencoder_Yuan.pdf
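
Finally, for the week 12 topic, here is a minimal sketch of a single-layer denoising autoencoder with tied weights: corrupt the input with masking noise, encode, decode, and train to reconstruct the clean input. The sizes, corruption rate, data, and learning rate are invented for the example.

```python
import numpy as np

# Single-layer denoising autoencoder with tied weights, trained by gradient descent on toy data.
# Dimensions, corruption rate, data, and learning rate are invented for illustration.
rng = np.random.default_rng(0)
n_input, n_hidden, lr, corruption = 10, 4, 0.1, 0.3

X = (rng.random((64, n_input)) < 0.5).astype(float)   # toy binary "documents"
W = rng.normal(scale=0.1, size=(n_input, n_hidden))
b_h = np.zeros(n_hidden)
b_o = np.zeros(n_input)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(500):
    # Corrupt the input by randomly zeroing features (masking noise).
    mask = rng.random(X.shape) > corruption
    X_noisy = X * mask

    # Encode the corrupted input, decode with the tied weights, reconstruct the *clean* input.
    h = sigmoid(X_noisy @ W + b_h)
    X_hat = sigmoid(h @ W.T + b_o)
    loss = np.mean((X_hat - X) ** 2)

    # Backpropagate the squared reconstruction error.
    d_out = 2 * (X_hat - X) / X.size * X_hat * (1 - X_hat)
    d_h = d_out @ W * h * (1 - h)
    grad_W = X_noisy.T @ d_h + d_out.T @ h            # tied weights collect both contributions
    W -= lr * grad_W
    b_o -= lr * d_out.sum(axis=0)
    b_h -= lr * d_h.sum(axis=0)

print(f"reconstruction MSE after training: {loss:.4f}")
```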

 

    