Feature Extraction Revisited

A visit to pandora.com prompted me to revisit the topic of feature extraction. Tim Westergren’s Music Genome Project is probably one of the coolest ways of exploring feature extraction and relevance feedback:

  • The feature extraction part extracts the “phenotypes” from a piece of music you like and uses those features to find similar tunes.
  • The relevance feedback part uses your input (thumbs up/down) to refine the search (a rough sketch of both ideas follows this list).
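
Not Pandora’s actual machinery, of course, but here’s a minimal sketch of the two ideas working together, assuming each track is described by a small made-up feature vector and using Rocchio-style feedback to nudge the query toward the thumbs-ups:

```python
import numpy as np

# Hypothetical feature vectors: each track reduced to a handful of
# "genome" attributes (tempo, instrumentation, vocal style, ...).
catalog = {
    "Jimmy Smith - The Sermon": np.array([0.9, 0.8, 0.1, 0.7]),
    "Lou Donaldson - Funky Mama": np.array([0.85, 0.75, 0.2, 0.65]),
    "Random Pop Tune": np.array([0.2, 0.1, 0.9, 0.3]),
}

def similar(query, catalog, n=2):
    """Rank tracks by cosine similarity to the query vector."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sorted(catalog, key=lambda t: cos(query, catalog[t]), reverse=True)[:n]

def refine(query, liked, disliked, alpha=1.0, beta=0.75, gamma=0.25):
    """Rocchio-style relevance feedback: pull the query toward
    thumbs-up tracks and push it away from thumbs-down tracks."""
    q = alpha * query
    if liked:
        q += beta * np.mean(liked, axis=0)
    if disliked:
        q -= gamma * np.mean(disliked, axis=0)
    return q

# Start from a seed track, then course-correct with feedback.
seed = catalog["Jimmy Smith - The Sermon"]
print(similar(seed, catalog))
seed = refine(seed,
              liked=[catalog["Lou Donaldson - Funky Mama"]],
              disliked=[catalog["Random Pop Tune"]])
print(similar(seed, catalog))
```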

So starting from “Jimmy Smith” and after a few course corrections, the suggestions (e.g., Lou Donaldson’s Funky Mama) started to sound like what I was after. It’s great to see feature extraction and relevance feedback demonstrated in such an intuitive way.

It’s also great to see that the Music Genome Project got it right, because others are still having trouble employing these technologies. For example, Amazon insists on basing its recommendations on items I bought for other people. I bet they’d get more mileage (read: sales) if their recommendation algorithms discriminated between an item’s intended recipient and the person buying it. Are you listening?
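
A minimal sketch of what I mean, assuming (hypothetically) that each order carried an intended-recipient field the recommender could filter on:

```python
from collections import defaultdict

# Hypothetical order records: the point is that each purchase notes who
# it was for, so gift purchases stay out of the buyer's own profile.
orders = [
    {"buyer": "me", "item": "jazz organ CD", "recipient": "self"},
    {"buyer": "me", "item": "children's book", "recipient": "niece"},
]

def profile(orders, buyer):
    """Group a buyer's purchase history by intended recipient, so
    recommendations for the buyer are seeded only from 'self' items."""
    by_recipient = defaultdict(list)
    for order in orders:
        if order["buyer"] == buyer:
            by_recipient[order["recipient"]].append(order["item"])
    return by_recipient

print(profile(orders, "me")["self"])   # seed for my own recommendations
print(profile(orders, "me")["niece"])  # seed for gift suggestions only
```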
