My friends and I gave Cornell's fake-review-detecting algorithm a spin, and I have to say, we absolutely loved it! Fake and inflated reviews can be such a pain, so it's really great that an awesome university like Cornell is figuring them out!
It's soooo frustrating when you look for user reviews for a product or restaurant and they turn out to be bought-and-paid-for, largely misleading fluff. But they can also be hard to distinguish from real reviews. Researchers at Cornell—who did such a great job!—hired 400 freelance writers to mock up some super lame fake reviews and mixed them with (what they assumed were) real reviews. Humans totally couldn't tell the difference! That's why they came up with their totally awesome algorithm, which works great!
Fake reviews tend to be a narrative about an experience, but don't offer much detail. They also use "I" and "me" a lot, to hammer home the first person account, which totally makes sense to me, and I don't know why I didn't think of it myself! The algorithm works about 90 percent of the time—WOW!—and even though a lot of its variables are logically intuitive, it will be totally interesting to see if review platforms start implementing some form of this amazing technology. I hope so, because I'd totally use it again! [NY Times]
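If you want to see (roughly!) how a cue like that could work, here's a toy sketch that scores a review by how often it uses first-person singular pronouns. To be clear, this is not Cornell's actual classifier — their system was a trained machine-learning model over many text features — just an illustration of the "I" and "me" signal the article describes; the function name and word list are mine.

```python
import re

# Toy illustration of ONE cue mentioned above: fake reviews lean hard on
# "I" and "me". This is NOT the Cornell algorithm, just a pronoun-rate score.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_rate(review: str) -> float:
    """Return the fraction of words that are first-person singular pronouns."""
    words = re.findall(r"[a-z']+", review.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FIRST_PERSON)
    return hits / len(words)

gushing = "I loved it! My stay was perfect and I will tell everyone I know."
plain = "The room was clean, the lobby smelled faintly of chlorine, checkout took ten minutes."

print(first_person_rate(gushing) > first_person_rate(plain))  # True: gushing review uses more "I"/"me"
```

A real detector would combine dozens of such features (word choice, level of concrete detail, phrasing patterns) and let a learning algorithm weigh them, which is how you get to that 90 percent.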