
match for search recommendation (deep learning part)

Tags: match, search, recommend, lihang, sigir18, www18


There are two versions of the tutorial. One is from WWW'18: https://www.comp.nus.edu.sg/~xiangnan/papers/www18-tutorial-deep-matching.pdf

The other is from SIGIR'18: https://www.comp.nus.edu.sg/~xiangnan/sigir18-deep.pdf

Both seem to return 403 now; this copy is still available: http://www.hangli-hl.com/uploads/3/4/4/6/34465961/wsdm_2019_tutorial.pdf

The SIGIR one is the more recent version, so that is the one followed here.

This post covers the deep learning part; for the traditional part, see: https://daiwk.github.io/posts/dl-match-for-search-recommendation-traditional.html

Overview

Deep match for search

Learning representations for search

Learning the match function for search

Learning the query-document matching matrix (a minimal sketch follows the model list below)

ARC-II

AAAI’16 Convolutional Neural Network Architectures for Matching Natural Language Sentences

MatchPyramid

AAAI’16 Text Matching as Image Recognition

Match-SRNN

IJCAI’16 Match-SRNN: Modeling the Recursive Matching Structure with Spatial RNN

K-NRM

SIGIR’17 End-to-End Neural Ad-hoc Ranking with Kernel Pooling

Conv-KNRM

WSDM’18 Convolutional Neural Networks for Soft-Matching N-Grams in Ad-hoc Search
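
To make the matching-matrix idea concrete, here is a minimal NumPy sketch (my own illustration, not code from the tutorial or the papers): it builds the query-document similarity matrix from word embeddings and then applies K-NRM-style RBF kernel pooling. The embeddings, kernel means, and kernel width are toy values.

```python
import numpy as np

def matching_matrix(q_emb, d_emb):
    """Cosine similarity between every query word and every document word.
    q_emb: (n_q, dim), d_emb: (n_d, dim) word-embedding matrices."""
    q = q_emb / np.linalg.norm(q_emb, axis=1, keepdims=True)
    d = d_emb / np.linalg.norm(d_emb, axis=1, keepdims=True)
    return q @ d.T                      # shape (n_q, n_d)

def kernel_pooling(M, mus=np.linspace(-0.9, 0.9, 10), sigma=0.1):
    """K-NRM-style soft-TF features: one RBF kernel per mu,
    summed over document words, log-summed over query words."""
    # K[i, k] = sum_j exp(-(M[i, j] - mu_k)^2 / (2 sigma^2))
    K = np.exp(-((M[:, :, None] - mus[None, None, :]) ** 2) / (2 * sigma ** 2)).sum(axis=1)
    return np.log1p(K).sum(axis=0)      # shape (n_kernels,)

# Toy example with random embeddings (in the real models these are learned).
rng = np.random.default_rng(0)
q_emb, d_emb = rng.normal(size=(3, 8)), rng.normal(size=(20, 8))
M = matching_matrix(q_emb, d_emb)
phi = kernel_pooling(M)
print(M.shape, phi.shape)   # (3, 20) (10,)
# A final linear layer on phi would produce the ranking score.
```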

Matching with attention models

EMNLP 2016 A Decomposable Attention Model for Natural Language Inference

Deep match for recommendation

Learning representations for recommendation

Pure CF models

DeepMF

Deep Matrix Factorization

AutoRec

Autoencoders Meet Collaborative Filtering (an AutoRec-style sketch follows this model list)

CDAE

Collaborative Denoising Autoencoder
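
As a concrete example of the autoencoder line of work, here is a rough AutoRec-style forward pass (my own sketch with toy shapes, not the authors' code): an item's partially observed rating vector is encoded and decoded, and the reconstruction loss is taken only over the observed entries.

```python
import numpy as np

def autorec_forward(r, W, V, mu, b):
    """One AutoRec-style reconstruction: h(r) = W * g(V r + mu) + b.
    r: (n_users,) partially observed rating vector for one item."""
    hidden = 1.0 / (1.0 + np.exp(-(V @ r + mu)))   # encoder, g = sigmoid
    return W @ hidden + b                          # decoder (identity output)

# Toy dimensions: 5 users, hidden size 3; 0 marks an unobserved rating.
rng = np.random.default_rng(0)
n_users, k = 5, 3
r = np.array([5.0, 0.0, 3.0, 0.0, 1.0])
observed = r > 0
V, W = rng.normal(size=(k, n_users)), rng.normal(size=(n_users, k))
mu, b = np.zeros(k), np.zeros(n_users)

r_hat = autorec_forward(r, W, V, mu, b)
loss = np.sum((r[observed] - r_hat[observed]) ** 2)  # loss only on observed ratings
print(r_hat.shape, loss)
```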

CF with side information

DCF

Deep Collaborative Filtering via Marginalized DAE

DUIF

Deep User-Image Feature

ACF

Attentive Collaborative Filtering

CKB

Collaborative Knowledge Base Embeddings

Learning the match function for recommendation

Match learning with pure CF models

Based on the Neural Collaborative Filtering framework (a NeuMF sketch follows below):
NeuMF
NNCF
ConvNCF
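
A rough sketch of how the NeuMF fusion works (my own illustration with random toy weights; the real model learns separate embeddings per branch end-to-end): a GMF branch takes the element-wise product of user and item embeddings, an MLP branch takes their concatenation, and the two outputs are concatenated before the final prediction layer.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neumf_score(p_gmf, q_gmf, p_mlp, q_mlp, W1, b1, h):
    """NeuMF-style score for one (user, item) pair.
    p_*, q_*: user/item embeddings for the GMF and MLP branches."""
    gmf_out = p_gmf * q_gmf                            # GMF: element-wise product
    mlp_out = np.maximum(0.0, W1 @ np.concatenate([p_mlp, q_mlp]) + b1)  # one ReLU layer
    fused = np.concatenate([gmf_out, mlp_out])         # fuse the two branches
    return sigmoid(h @ fused)                          # final prediction layer

# Toy dimensions: embedding size 4 for both branches, one hidden layer of size 8.
rng = np.random.default_rng(0)
d, hdim = 4, 8
p_gmf, q_gmf, p_mlp, q_mlp = (rng.normal(size=d) for _ in range(4))
W1, b1 = rng.normal(size=(hdim, 2 * d)), np.zeros(hdim)
h = rng.normal(size=d + hdim)
print(neumf_score(p_gmf, q_gmf, p_mlp, q_mlp, W1, b1, h))
```
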
Based on the Translation framework:
TransRec

The model requires \(Head + Relation \approx Tail\); in other words, it wants those two vectors to end up being (almost) the same vector. Cosine similarity is of little use here, because it only captures the angle between the vectors: the angle can be tiny while the lengths differ a lot. So an L1 (Manhattan) or L2 (Euclidean) distance is used instead.
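
A tiny numeric check of that point, with toy vectors of my own choosing: two vectors pointing in exactly the same direction have cosine similarity 1, yet their L2 distance can still be large, which is why translation-based models score with a distance rather than an angle.

```python
import numpy as np

head_plus_relation = np.array([1.0, 2.0, 2.0])
tail = 3.0 * head_plus_relation          # same direction, three times the length

cos = head_plus_relation @ tail / (np.linalg.norm(head_plus_relation) * np.linalg.norm(tail))
l2 = np.linalg.norm(head_plus_relation - tail)

print(cos)   # 1.0 -> cosine says "perfect match"
print(l2)    # 6.0 -> L2 distance says they are still far apart
```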

LRML

Uses Euclidean distance directly; the relation vector is learned via attention.
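
A rough sketch of that idea (my own toy implementation, not the authors' code; the `memory` and `keys` matrices are assumed parameter names): attention weights derived from the user-item pair select a relation vector from a latent memory, and the score is the negative squared Euclidean distance between user + relation and item.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def lrml_score(p, q, memory, keys):
    """LRML-style score: attention over a latent relational memory
    produces a relation vector r, then score = -||p + r - q||^2.
    p, q: user/item embeddings; memory, keys: (n_slots, dim) matrices."""
    s = p * q                          # joint user-item representation
    a = softmax(keys @ s)              # attention weights over memory slots
    r = a @ memory                     # induced relation vector
    return -np.sum((p + r - q) ** 2)   # higher score = better match

# Toy example with random parameters (all learned in the real model).
rng = np.random.default_rng(0)
dim, n_slots = 8, 4
p, q = rng.normal(size=dim), rng.normal(size=dim)
memory, keys = rng.normal(size=(n_slots, dim)), rng.normal(size=(n_slots, dim))
print(lrml_score(p, q, memory, keys))
```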

Match learning with feature-based models

Based on MLP:
Wide&Deep
Based on FM (a bi-interaction sketch follows this list):
Neural FM
Attentional FM
Based on trees:
GB-CENT
DEF
TEM
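
To make the FM-based line concrete, here is a minimal sketch of the second-order FM term and the bi-interaction pooling that Neural FM builds on (my own illustration with toy features; in Neural FM the pooled vector would be fed into an MLP rather than summed directly).

```python
import numpy as np

def fm_second_order(x, V):
    """Second-order FM term: sum_{i<j} <v_i, v_j> x_i x_j,
    computed with the standard 0.5 * (square-of-sum - sum-of-squares) trick.
    x: (n_features,) feature values, V: (n_features, k) latent factors."""
    xv = x[:, None] * V                                          # (n_features, k)
    return 0.5 * np.sum(xv.sum(axis=0) ** 2 - (xv ** 2).sum(axis=0))

def bi_interaction_pooling(x, V):
    """Neural-FM-style bi-interaction pooling: same trick, but kept as a
    k-dimensional vector that a downstream MLP would consume."""
    xv = x[:, None] * V
    return 0.5 * (xv.sum(axis=0) ** 2 - (xv ** 2).sum(axis=0))   # shape (k,)

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 1.0, 0.5])      # toy sparse feature vector
V = rng.normal(size=(4, 3))             # latent factors, k = 3
print(fm_second_order(x, V))
print(bi_interaction_pooling(x, V))     # summing this vector gives the FM term above
```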

Original article; please credit the source when reposting.
Permalink: http://daiwk.github.io/posts/dl-match-for-search-recommendation.html