Let’s say you want to find a Twitter list on a topic like, say, “tech.” Where would you go?
For me there are two sites that cover Twitter lists:
1. Listorious.
2. Tlists.
Now, Listorious’ home page uses tags to build a directory. But let’s ignore the directory for a while. I’m digging into search because I’ve seen a couple of different apps coming later this summer that use search to surface new lists, and the entrepreneurs behind them were complaining about the quality of the search results.
Even doing something simple, like searching for “technology” or “tech” or “startups,” shows you that the search results suck. Let’s dig in and explore some ways to improve the searches, shall we?
First, what would be a GREAT list? My biases (I’ve looked at tens of thousands of lists in the past few months):
1. A list that’s complete. Basically, a list with 500 sources on it beats one with 10 by a country mile, especially if all those Twitter accounts are actually on topic.
2. A list that’s popular. Whenever I’ve found a popular list, it’s almost always popular for a reason. So a list with 6,000 followers is generally better than one that only has 100 followers. That’s not always true: Mashable’s lists, for instance, suck, but they are very popular. Generally, though, it holds, and even in Mashable’s case there’s a reason they are so popular and better than “joe smith’s tech lists.”
3. A list that’s created by a credible and authoritative person or brand. So, one created by the New York Times is better than one that’s created by someone you never heard of.
4. A list that’s curated every day and kept clean. Sometimes Twitter accounts turn out not to cover the topic much at all. It’s pretty easy for a human to look at a list and figure out if it’s clean or not, but it’s hard for machines to do that. Generally this is why popular lists are better than non-popular ones: humans don’t follow lists full of spammers.
There are other, harder ways to figure out whether a list is good. If I could code, I’d be looking at how many RTs or favorites a list’s members get every day and helping new lists get to the top that way (I wish I had that data, because I’d create new lists made up of only the most retweeted people).
But, generally speaking, those four things are great ways to tell the quality of a list, and a rough sketch of how you might combine them follows below.
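Here’s a minimal sketch of what that combination might look like in code. To be clear, everything in it is hypothetical: the field names, the log scaling, and the weights are my own assumptions, not anything Listorious or Tlists actually does, and the retweet/favorite signal I mentioned above would just be one more term if you had that data.

```python
# Hypothetical sketch: score a Twitter list on the four signals above.
# Field names and weights are assumptions, not any real API or product.
import math

def score_list(members: int, followers: int, creator_followers: int,
               on_topic_members: int) -> float:
    """Combine completeness, popularity, authority, and cleanliness into one number."""
    # 1. Completeness: more sources is better, with diminishing returns.
    completeness = math.log1p(members)
    # 2. Popularity: followers of the list itself, also log-scaled.
    popularity = math.log1p(followers)
    # 3. Authority: a rough proxy using the creator's own follower count.
    authority = math.log1p(creator_followers)
    # 4. Cleanliness: fraction of members that are actually on topic.
    cleanliness = on_topic_members / members if members else 0.0
    # Weights are arbitrary; you'd tune them against lists humans agree are great.
    return 1.5 * completeness + 2.0 * popularity + 1.0 * authority + 3.0 * cleanliness

# A 499-member, 3,143-follower list from a well-followed curator should
# easily beat a 45-member, 3-follower list from an unknown account.
print(score_list(members=499, followers=3143, creator_followers=50000, on_topic_members=480))
print(score_list(members=45, followers=3, creator_followers=200, on_topic_members=30))
```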
So, now, why do the search features on Listorious and Tlists suck so bad?
Let’s go to Listorious and search for “tech.” I’m using “tech” because it’s something I know a lot about and I know what the best lists are.
First five results:
1. TechNews from Tech Introvert. My score? Low quality. Why? It’s from a guy without high authority and credibility scores in the industry. Only has 45 sources on it. Only has 3 followers. Why the hell is it #1?
2. GirlsintechLA. Even worse than #1. Only has 20 members, is hardly authoritative or complete, and only has 4 followers.
3. QR-Code by Nonprofi Tech. Getting even worse.
4. Tech News Brands from me. OK, this one is good, although, to tell you the truth, this is NOT even my best list for someone searching for tech. My “most-influential-in-tech” list has more followers and is more credible.
5. tech by Peter Urbanski. Now at least this one is interesting: it’s from someone who has been in the tech scene a long time and has 453 members, but only 11 followers.
Now, these belong in the result set SOMEWHERE but NOT in the top five. There are far better lists out there. What’s weird is that Listorious’ engine mostly NAILS the results if you click on their tag interface (with one exception: the Mashable team list is so popular it’s #1, which shows the downside of using popularity as the only score, because that list isn’t even Mashable’s best list). Compare the tag results for “tech” and you’ll see that the lists you’ll find are FAR more authoritative and credible. Why is this? Why does the search interface suck so bad? Keep in mind, tags are great for areas like tech where there are lots of results, but what if you are searching for something that hasn’t had a lot of tagging behavior, like plumbing? Also, keep in mind that tags can be gamed too, so they are less resistant to spammers, commercial interests, or anyone with a large team who can pile tons of tags onto a list.
Let’s go over to TLists and see if they do any better.
The first thing you’ll notice is that Tlists bets ONLY on search. No tagging here.
Let’s search on “Tech” and see what the first results are.
One thing you notice immediately is that Tlists’ system actually tries to both understand and display what people on each list are tweeting about. It also shows how many Tweets per day are coming from that list, how many members each list has, and how many followers each list has. That’s a MUCH better display right away. But the relevancy of the lists still sucks, especially when you compare it to Listorious’ tag results. Let’s dig in:
1. Tech & Science. This is a super list, which is a great idea. They bring in the most listed Tweeters on 450 lists. This list isn’t very popular, only has 79 followers, and isn’t made by someone very credible on its face, but when you look into it this list was algorithmically produced and is very good. Much better than Listorious’ search efforts. Someone finding this list would be well served.
2. Tech Journalists by Huffington Post. Very credible source. Not very popular, though, only has 46 followers. Not very complete. 26 members. For comparison, my Tech News People list has 499 members and 3,143 followers. On every measure my list is better than the Huffington Post one. But why is Huffington Post’s result here? Are they paying for a high result? We don’t know.
3. Tech Geeks Directory by Susan Elaine Cooper. OK, this is where the list goes south. Not from someone authoritative. Not very popular (only 35 followers). Not very complete (45 members). Why is this list even NEAR the top of search results? Can anyone tell me?
4. twist-callers by JT Keller. This list doesn’t belong in results about tech. 42 members. 19 followers.
5. Technology News by The Job Guy. Another crappy list. 30 members. 6 followers.
Why does the search engine show so many crappy results? Why can’t anyone build a search engine for Twitter lists that works well? It doesn’t look like they are doing ANY relevancy ranking, which even makes the engine smell like it’s getting paid off to put certain results at the top. Not good. What do you think? Have you tried any searches? Do your results match mine? (I’ve done a variety of searches on both engines and they all suck like the results for “tech,” which SHOULD bring back some awesome lists, the way Listorious’ tag results do.)
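For what it’s worth, even a crude relevancy ranking would fix most of this: match the query against the list’s name and description, then break ties with a quality score like the one sketched earlier. Here’s a hypothetical sketch; the data fields and the precomputed “quality” values are made up for illustration, since neither engine says how it ranks.

```python
# Hypothetical sketch of query-aware ranking: text match first, quality second.
# The data fields and precomputed "quality" scores are assumptions for illustration.

def text_match(query: str, name: str, description: str) -> float:
    """Crude text relevance: fraction of query words found in the name/description."""
    words = query.lower().split()
    haystack = (name + " " + description).lower()
    return sum(1.0 for w in words if w in haystack) / len(words) if words else 0.0

def rank_lists(query: str, lists: list) -> list:
    """Sort by text relevance, then by quality, so a 3-follower list can't
    outrank an authoritative 3,000-follower one on the same text match."""
    return sorted(
        lists,
        key=lambda lst: (text_match(query, lst["name"], lst["description"]), lst["quality"]),
        reverse=True,
    )

candidates = [
    {"name": "TechNews", "description": "tech news", "quality": 2.1},
    {"name": "Tech News People", "description": "reporters covering tech", "quality": 9.7},
]
print(rank_lists("tech", candidates)[0]["name"])  # the higher-quality list wins the tie
```

Nothing fancy, but even something this simple would keep the 3-follower, 45-member lists out of the top five.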