YouTube Switched To Amazon's Recommendation Algorithm Based On Item-To-Item Collaborative Filtering, Reveals Google's RecSys 2010 Paper

In a paper at the recent RecSys 2010 conference, "The YouTube Video Recommendation System" (ACM), the most interesting disclosure is that "Google has switched the algorithm they use for YouTube's recommendation engine from their own to a variation of Amazon's algorithm that was designed in the late 90s."

Here are excerpts from Google's RecSys 2010 paper:

Recommending interesting and personally relevant videos to [YouTube] users [is] a unique challenge: Videos as they are uploaded by users often have no or very poor metadata. The video corpus size is roughly on the same order of magnitude as the number of active users. Furthermore, videos on YouTube are mostly short form (under 10 minutes in length). User interactions are thus relatively short and noisy ... [unlike] Netflix or Amazon where renting a movie or purchasing an item are very clear declarations of intent. In addition, many of the interesting videos on YouTube have a short life cycle going from upload to viral in the order of days requiring constant freshness of recommendation.

To compute personalized recommendations we combine the related videos association rules with a user's personal activity on the site: This can include both videos that were watched (potentially beyond a certain threshold), as well as videos that were explicitly favorited, "liked", rated, or added to playlists ... Recommendations ... [are the] related videos ... for each video ... [the user has watched or liked after they are] ranked by ... video quality ... user's unique taste and preferences ... [and filtered] to further increase diversity.
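The excerpt compresses the whole candidate-generation pipeline into a few clauses, so a minimal sketch may help. The Python below is an illustration under stated assumptions, not Google's implementation: it scores relatedness by normalizing co-visitation counts by the product of each video's overall count (the paper describes a normalized co-visitation score, but the excerpt omits the exact normalization), and it ranks candidates by a simple vote count across the user's seed videos rather than the paper's video-quality and user-taste signals. All function and variable names are hypothetical.

```python
from collections import Counter, defaultdict
from itertools import combinations

def related_videos(sessions, top_k=10):
    """Build a related-videos table from co-visitation counts.

    sessions: iterable of lists of video ids watched in one session.
    Relatedness of (i, j) is their co-visitation count c_ij divided by
    the product of per-video counts c_i * c_j (one simple choice of
    normalization; the excerpt does not specify the function).
    """
    count = Counter()      # c_i: sessions containing video i
    co_count = Counter()   # c_ij: sessions containing both i and j
    for session in sessions:
        videos = set(session)
        count.update(videos)
        for i, j in combinations(sorted(videos), 2):
            co_count[(i, j)] += 1

    related = defaultdict(list)
    for (i, j), c_ij in co_count.items():
        score = c_ij / (count[i] * count[j])
        related[i].append((score, j))
        related[j].append((score, i))
    return {v: [vid for _, vid in sorted(pairs, reverse=True)[:top_k]]
            for v, pairs in related.items()}

def recommend(seed_videos, related, watched, n=10):
    """Candidate generation: union the related videos of every seed
    (watched/liked/favorited) video, drop already-seen ones, and rank
    candidates by how many seeds they are related to -- a crude
    stand-in for the quality- and taste-based ranking in the paper."""
    votes = Counter()
    for v in seed_videos:
        for candidate in related.get(v, []):
            if candidate not in watched:
                votes[candidate] += 1
    return [c for c, _ in votes.most_common(n)]
```

The final filtering step for diversity mentioned in the excerpt (e.g. limiting how many candidates come from the same seed or the same uploader) would slot in after the ranking in recommend.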

To evaluate recommendation quality we use a combination of different metrics. The primary metrics we consider include click through rate (CTR), long CTR (only counting clicks that led to watches of a substantial fraction of the video), session length, time until first long watch, and recommendation coverage (the fraction of logged in users with recommendations). We use these metrics to both track performance of the system on an ongoing basis as well as for evaluating system changes on live traffic.
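For concreteness, here is a back-of-the-envelope sketch of three of those metrics computed from an impression log. The 0.5 long-watch threshold is an assumption (the paper says only "a substantial fraction"), and session length and time-until-first-long-watch are omitted because they require timestamped session logs. The Impression record and evaluate function are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Impression:
    user: str
    clicked: bool
    watch_fraction: float  # fraction of the video actually watched

def evaluate(impressions, users_with_recs, all_users, long_watch=0.5):
    """Offline versions of CTR, long CTR, and recommendation coverage.

    long_watch is an assumed cutoff for a "long" watch; the paper does
    not publish the exact threshold it uses.
    """
    shown = len(impressions)
    clicks = sum(1 for i in impressions if i.clicked)
    long_clicks = sum(1 for i in impressions
                      if i.clicked and i.watch_fraction >= long_watch)
    return {
        "ctr": clicks / shown if shown else 0.0,
        "long_ctr": long_clicks / shown if shown else 0.0,
        "coverage": len(users_with_recs) / len(all_users) if all_users else 0.0,
    }
```

Long CTR is the interesting one here: plain CTR rewards clickbait thumbnails, while only counting clicks that turn into substantial watches aligns the metric with the viewer actually finding the video relevant.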

Recommendations account for about 60% of all video clicks from the home page ... Co-visitation based recommendation performs at 207% of the baseline Most Viewed page ... [and more than 207% better than] Top Favorited and Top Rated [videos].

[tags]recommendation engine,algorithm[/tags]

[Source]