Sagarin changes formula, finally removes 'Margin of Victory'
Jeff Sagarin, like the other BCS computer programmers, was instructed in 2002 to remove MOV. This week, he finally did.
In the 2001 season of the BCS, half of the eight computers used at that time considered margin of victory in their calculations, and the other half did not. That year, human poll No. 2 Oregon did not make the BCS title game due in large part to the fact that the MOV computers ranked the Ducks seventh, while the non-MOV versions ranked them third.
There were other problems too, but that was the one that got the attention of the BCS poobahs, who then decreed something like, "let margin of victory be stricken from all notebooks and tablets, stricken from all obscure algorithms of college football...so let it be written, so let it be done."
The result is that since 2002, margin of victory has not been a factor in the computer rankings used by the BCS. Two of the computer gurus dropped out instead of adjusting their formulas. Peter Wolfe said he changed his formula to remove MOV as a component. Jeff Sagarin added a new column to his ratings for BCS purposes called "Elo Chess," which he described by saying, "only winning and losing matters; the score margin is of no consequence." Or so we thought.
This week, a new column of data appeared called "Pure Elo." Pure Elo has a familiar description. It says, "only winning and losing matters; the score margin is of no consequence." Hmmm. Where have I seen that before? Meanwhile, "Elo Chess" has been renamed "Elo Score," which he says "applies ELO principles to the actual SCORES of the games and so it is now SCORE BASED" (emphasis his). Let's be perfectly clear...it was always score based. The name of the heading and the description have been changed to protect the guilty.
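For readers unfamiliar with the distinction, here is a minimal Python sketch of how a win/loss-only ("pure") Elo update differs from a score-based one. The K factor, 400-point scale, and log-of-margin multiplier below are generic textbook Elo conventions used for illustration only, not Sagarin's actual formulas, which have never been published.

```python
import math

def expected_score(r_a, r_b):
    # Standard logistic Elo expectation: probability team A beats team B
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def pure_elo_update(r_a, r_b, a_won, k=32):
    # "Only winning and losing matters": the result is 1 or 0,
    # no matter what the final score was.
    s = 1.0 if a_won else 0.0
    delta = k * (s - expected_score(r_a, r_b))
    return r_a + delta, r_b - delta

def mov_elo_update(r_a, r_b, pts_a, pts_b, k=32):
    # Score-based variant: scale the update by a margin-of-victory
    # multiplier (log of the margin is one common convention).
    s = 1.0 if pts_a > pts_b else 0.0
    margin_mult = math.log(abs(pts_a - pts_b) + 1)
    delta = k * margin_mult * (s - expected_score(r_a, r_b))
    return r_a + delta, r_b - delta

# Two equally rated teams; compare a 1-point win to a 40-point rout.
r1, _ = pure_elo_update(1500, 1500, True)       # any win counts the same
r2, _ = mov_elo_update(1500, 1500, 21, 20)      # squeaker
r3, _ = mov_elo_update(1500, 1500, 50, 10)      # blowout
# The pure Elo update is identical for both games; the MOV version
# rewards the blowout far more, which is exactly what the BCS banned.
```

The point is that the two formulas diverge most when blowout wins are common, which is why inflated FCS ratings are a telltale sign of which kind of formula is running.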
UPDATE: Jeff Sagarin wrote to explain the new formulas. He said Pure Elo is "an improved version of Elo Chess" that he was working on in the offseason and uses in other sports. He also says Elo Score is not the same as Elo Chess, but some other version he's been using for a while in other sports. He also made a point to say that, "At no point in the past during the BCS standings period of the season, once the dictum was pronounced in 2002, have I used scores in my official BCS standings report."
The results of the new formulas don't seem to bear that out, at least not yet (Sagarin assures me they eventually will), but that's all we have to go on, so we are left to draw our own confusion.
Where the difference is most obvious is where FCS schools are rated in the new system. Last week, Elo Chess had two FCS teams in the top 60. The top rated among them was Eastern Illinois at 38. Those facts are still true of Elo Score this week, but the Panthers are 24th in Pure Elo, which makes them the lowest rated of the seven FCS schools in the Pure Elo top 25. Bethune-Cookman ranks fourth. Fourth!!
Now, it doesn't matter that Bethune-Cookman is fourth because FCS schools get pulled out and FBS teams slotted up for BCS purposes, but what this illustrates is that the Pure Elo formula is very different from the one that has been used for the last 11 years by the BCS. Meanwhile, Elo Chess/Elo Score looks very similar to what it has always looked like. Only now, Sagarin admits that MOV was a factor in those ratings. Well, not in so many words.
Is this definitive proof? No, my evidence is anecdotal. Only by examining the formulas can we prove that definitively, just like we need the formula to prove that Richard Billingsley's ratings are influenced by the previous season. That is a factor the BCS honchos don't want considered either. I have personally found an error or two in Wes Colley's calculations over the years also, which I can do because thankfully, Colley's formula is open and accountable. That makes three of the six ranking systems that have had issues. That we know of.
The BCS has always had its head buried in the sand regarding the computer formulas. They never validated the results, and never verified that the programs were actually doing what the BCS had instructed. They prefer the Sergeant Schultz approach. "I KNOW NOTHINK!" They just keep telling us they trust the programmers and that they are "the best in the business," which is another claim they have no way of verifying.
We can only hope they did a better job vetting the new selection committee than they did their computer formulas.