Temporal Data & the Rankings Rollercoaster
Wednesday, May 23, 2007 3:47:09 PM
Posted by randfish
I noticed an engaging blog post from Stoney DeGeyter over at SearchEngineLand this evening (or rather, morning/afternoon for our American & European readers) - The Ranking Roller Coaster Cause & Effect. It's definitely worth a read, but I also wanted to point out one specific area that we see causing the "rollercoaster" effect all the time - temporal data.
Temporal data for a search engine can include:
- When content was first spidered
- When a new link was first discovered
- Time frames for influxes of links
- Time frames for large amounts of content on a specific subject
The engines can use this data in all sorts of ways (everything from knowing what to put in the "news" results to determining potential spammers), but it really affects the rollercoaster ups and downs of rankings, too. Stoney mentions three things that can cause the coaster:
- Changes you make on your site
- Changes to search engine algorithms
- Changes made by your competitors
I'd add temporal fluctuations as a critical fourth. In a way, this falls under "changes to search engine algorithms," but the algos aren't really changing; they're just absorbing new data in the ways they always have. What we usually see is that Google and MSN, and Yahoo! to a slightly lesser extent, give priority to new documents on trusted sites and even to small influxes of inbound links. Thus, the following scenarios happen quite a bit:
- You're ranking great, when all of a sudden, a Flickr page or a Technorati tag page or a page at Wikipedia overtakes you. The page is new, has little to no external inbounds, and you're flummoxed by how it can rank well. Don't worry, amigo - that's almost certainly the fresh boost, and it tends to die out after 5-10 days at most.
- You're ranking in 10th or 20th place behind some heavy hitters, but your domain is pretty tough and all of a sudden, 5-10 new links point your way. Voila! You're at the top of the results, ranking in front of pages you were sure you'd never overtake this quickly. Once again, it's fresh boost, giving a little bit of "extra credit" to your newfound popularity. I liken this to the search engines almost making the assumption that "whoa! this page got a lot of link love quickly, it must be super relevant/popular for this query, let's give it some juice." The problem is when the engines don't find lots more new links, you start falling down in the results fairly rapidly. Should we call that the "stale drop"?
- Rollercoaster mania hits - you're trading places atop the SERPs with 2-3 pages almost every day. I almost want to call this the "tie" flux - new links, and possibly refreshed content on your page and your competitors is making it a really tough call for 1st place, so the tiniest of changes can bump you ahead or leave you behind.
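One way to picture the pattern in all three scenarios is a temporary bonus layered on top of a page's underlying relevance score, decaying over a handful of days. This is purely a sketch of the behavior described above, not any engine's actual formula - the function name, boost size, and half-life are all invented for illustration:

```python
def freshness_boost(days_since_discovery, base_score,
                    boost=0.3, half_life_days=3.0):
    """Hypothetical 'fresh boost': a temporary bonus applied to a newly
    discovered page (or one with a sudden influx of new links) that
    decays with a short half-life. After roughly 5-10 days only the
    base relevance score remains - the 'stale drop'."""
    decay = 0.5 ** (days_since_discovery / half_life_days)
    return base_score * (1.0 + boost * decay)

# A brand-new page can briefly outrank an established one...
new_page = freshness_boost(days_since_discovery=0, base_score=0.70)   # 0.91
old_page = 0.85  # established page, no boost
# ...but after ten days the boost has mostly decayed away.
faded = freshness_boost(days_since_discovery=10, base_score=0.70)     # ~0.72
```

With two pages whose boosted scores land within a hair of each other, tiny day-to-day changes in links or content flip which one is on top - the "tie" flux.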
All in all, I like Stoney's post, particularly for his last few lines of advice:
Almost every site owner will, at one time or another, find themselves face to face with significant ranking drops. Panicking should be the last thing that you do. Sometimes the best course of action is nothing, however you can never go wrong with a bit of research.
Many people, when seeing sudden drops in rankings, make drastic changes in their website in order to compensate. For the most part, this is a bad move. The first thing you need to do is to research the issue, identify what (as much as can be determined) caused the problem and then carefully plan out a course of action, if any, which needs to be taken.
However, make sure you're thinking carefully about temporal data the engines use and how it might be impacting your rankings/results.
If You Could Ask the Search Engines Any Question and Get An Honest, Complete Answer… What Would it Be?
Posted by randfish
Years ago, the world of SEO was filled with mystery and intrigue - vague half-answers from the engines combined with private forum and backroom conspiracies to make for an industry rife with misinformation. Today, those problems still exist in some forms, but the engines are far more forthright and the mysteries of the ranking algorithms are no longer shrouded in dubious half-truths. Sure, we may not know everything in them, and we may not have the right balance, but by and large, search marketers can read through a document like the search ranking factors and feel fairly confident in their wisdom (if not, necessarily, their abilities). As the old joke goes - "I stole the Google algo!" "What does it say?" "We need links!"
However, all that aside, there are still plenty of straightforward, honest answers to questions, both simple and complex, we'd all love to get from the engines. I'll share a few of mine, and hopefully you can fill in lots of your own in the comments. If we're lucky, one day in the future, these may all have answers.
A few of my questions (in no particular order):
Does a link from a page with meta robots="noindex, follow" carry less weight? no weight?
What role do search quality raters play in determining rankings?
Does your engine ever use the predictive abilities of search keyword demands to profit outside the world of search?
Some domains move effortlessly to new domain names without a loss in rankings, while the vast majority go into the "sandbox" to languish for many weeks or months - what are the factors affecting the decisions to "trust" some domain moves while "distrusting" others?
How much impact do the other domains owned by / registered by a site owner have on the way a site is viewed/treated algorithmically?
What is the purpose/motivation behind obfuscating accurate, precise keyword usage data? (why not simply charge for it?)
What is the purpose/motivation behind obfuscating accurate, precise link data? (why not simply charge for it?)
Do better webmaster relations have a direct, positive impact on earnings?
Why don't you (mostly Google, Ask and, to a lesser degree, MSN) refrain from building/owning content portals that could deliver traffic and revenue?
Google - is your share price overvalued?
Do companies/sites that spend a lot with your engine receive any SEO benefits (free consulting time, a few tricks from an engineer, etc)?
Do you use any of the following - latent semantic indexing, keyword density, term vectors, term weighting?
How do you detect cloaking? No, really?
In less than 100 words, describe why you choose to rank Wikipedia above more accurate sources.
If I were Matt Cutts or Tim Mayer or Eytan Seidman or Kaushal Kurapati... I would probably answer - 1, 5, 7, 8 and 9. Seriously, though, I'd really love to get your questions below. Next time I sit down with these guys, I'll put their feet to the fire (or, in my case, a slightly warm pebble).
p.s. I promise - lots of fantastic stuff on China to come ASAP. And if anyone from Google China in Beijing is reading, please drop me a line (rand_at_seomoz.org) - we'd love to see you while we're here :)
New Page Strength Tool Feature: Refresh Report
Posted by Oatmeal
I've added a feature to the Page Strength tool that allows you to do a hard refresh of the data in your report. If you run a report and data is missing, try refreshing it - the tool will look for factors that were missing and attempt to fetch them again. Please keep the following in mind:
Refreshing will only re-fetch data that is missing, not data that is inaccurate. If our tool is reporting 3,000 backlinks but clicking the Yahoo Site Explorer link reports 4,000, the tool will not attempt to fetch this data again if you issue a refresh request. Also, don't assume our numbers are inaccurate because they don't match what Site Explorer says. Yahoo's numbers go up and down constantly, so what our tool sees and what you see when viewing through your browser may differ from day to day.
Don't go refresh crazy. I avoided adding this feature for a long time because I was concerned it would consume too many resources on our server and bring the overall quality of reporting down. We've added an extra server today, however, and I'm hoping the extra hardware will allow me to keep this feature open to the public. Please be courteous and don't refresh every report you run a bunch of times; I'm going to keep this feature open to the public as long as our server can handle it. If it becomes too much, I'm going to restrict it to premium members only.
If you haven't read the Page Strength FAQ, please do. It'll save me from having to answer a few hundred emails. Also, the Keyword Difficulty Tool is open to the public again. I had to temporarily restrict the usage of this tool to premium members because it was too taxing on our server.