Watching the vote count at Minneapolis City Hall on monitors in the Rotunda on Wednesday.

The reason the vote count for Minneapolis mayor is going so slowly is that the ordinance the city is relying on is based on faulty logic and arithmetic.

In one place, the ordinance says, “All candidates for whom it is mathematically impossible to be elected must be defeated simultaneously.” Defeated means that they are eliminated and the ballots cast for them are transferred to each of those voters’ next choices.

In fact, all but three candidates — Betsy Hodges, Mark Andrew and Don Samuels — cannot win, mathematically. That’s because the sum of ALL their first-, second- and third-place votes is less than Betsy Hodges’ first-place votes. So adhering to that statement in the ordinance, the city could have eliminated 32 candidates at once and reallocated their votes to those voters’ next choices. If it had done that, it would have announced a winner within an hour.
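The arithmetic behind that claim can be sketched in a few lines of code. This is just an illustration of the comparison the article describes, not the city’s actual tabulation software, and the vote totals below are hypothetical, not the official Minneapolis numbers:

```python
def batch_eliminate(first_place, max_possible):
    """Candidates whose best-possible total (all of their 1st + 2nd + 3rd
    choice votes combined) is below the current leader's 1st-place count
    alone can never win, so they could all be defeated in one round."""
    leader_votes = max(first_place.values())
    return {c for c, best in max_possible.items() if best < leader_votes}

# Hypothetical tallies (not the official Minneapolis numbers):
first_place = {"A": 20000, "B": 12000, "C": 8000, "D": 3000}
max_possible = {"A": 45000, "B": 30000, "C": 21000, "D": 9000}

print(sorted(batch_eliminate(first_place, max_possible)))  # prints ['D']
```

Here only D is defeated outright: even D’s best case (9,000) falls short of the leader’s 20,000 first-place votes, while C’s best case (21,000) does not.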

But elsewhere in the ordinance, the drafters, in their wisdom, defined “mathematically impossible to be elected” to mean something other than whether the candidate could possibly win. They defined it this way:

Mathematically impossible to be elected means either:

(1) The candidate could never win because his or her current vote total plus all votes that could possibly be transferred to him or her in future rounds (from candidates with fewer votes, tied candidates, surplus votes, and from undeclared write-in candidates) would not be enough to equal or surpass the candidate with the next higher current vote total; or

(2) The candidate has a lower current vote total than a candidate who is described by (1).
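To see how much narrower that definition is, here is a simplified sketch of the statutory test. The standings and the per-candidate ceilings on transferable votes are hypothetical, and the real ordinance’s accounting of transferable votes (surpluses, ties, write-ins) is reduced here to a single precomputed number per candidate:

```python
def statutorily_impossible(totals, transferable):
    """The ordinance's test, clause by clause: working down the current
    standings, a candidate falls under clause (1) if current votes plus
    every vote that could still transfer to them cannot equal or surpass
    the next-higher candidate's current total; under clause (2), every
    candidate below such a candidate is also 'impossible'."""
    ranked = sorted(totals, key=totals.get, reverse=True)
    for i in range(1, len(ranked)):
        cand, next_higher = ranked[i], ranked[i - 1]
        if totals[cand] + transferable[cand] < totals[next_higher]:
            return set(ranked[i:])  # cand, plus everyone below (clause 2)
    return set()

# Hypothetical round-1 standings and transferable-vote ceilings:
totals = {"A": 50, "B": 30, "C": 10, "D": 8}
transferable = {"A": 48, "B": 25, "C": 8, "D": 0}

print(sorted(statutorily_impossible(totals, transferable)))  # prints ['C', 'D']
```

Under this test, C is out only because C cannot catch B, the candidate one notch up, and D is out only for trailing C; whether anyone can catch the leader never enters into it.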

Using that definition, the city has chosen to eliminate the candidates one at a time, making for a multi-day process.

However, this definition has nothing to do with “mathematically impossible to be elected.” It’s a definition of “mathematically impossible to move up one notch.”

Oh, well.


11 Comments

  1. OMG,…

    The proponents of this waterboarding of the English language should have their names published here!!

    The worst thing about this tomfoolery is that we’re not getting the essential function of a primary out of round #1. If it were being done properly, the 35 candidates would be reduced to 3 in one step – very much as if a primary had been conducted, with 3 survivors moving on to the general election.

    This is one of the purposes of RCV – to eliminate the primary where only a small proportion of voters “frames” the general election for everyone else. With RCV, all the voters of the general election are present to conduct a virtual primary + general election in one election.

    32 rounds of tabulation to eliminate 32 who are ALREADY DEFEATED as of round 1? One can only hope this interpretation will be changed without controversy, but who knows?

  2. Minneapolis vote-count process based on faulty logic and bad arithmetic

    I think you are wrong, for the following reason, illustrated with this example: If the reallocation of the votes from the “losers” below the current 4th-place candidate, Winton, caused him to vault ahead of Samuels, then Samuels’ 2nd-place choices would be allocated before Winton’s. If Samuels’ and Winton’s voters have different preferences between Andrew and Hodges, that could be decisive. Similar examples could be made with others of the losers.

    I realize that my example ignores the very high likelihood that, in actual fact, Andrew is too far behind Hodges for the order of the others close to him (Samuels, Winton, Cherryhomes …) to matter, but I think this is the idea.

    I do know that there are a lot of counterintuitive possibilities and election paradoxes possible depending on exactly which voters rank which candidates.

    1. No tabulation scenario can overcome defeat.

      The sum of #1,#2, and #3 preferential votes per candidate represents the HIGHEST POSSIBLE vote total that each candidate could achieve in the election.

      It is impossible that Cam Winton’s 15,236 total of all preferential votes (#1 + #2 + #3) will ever get bigger than Mark Andrew’s #1 count alone of 19,584. It can never be. Therefore, Cam Winton is defeated.

      There are only 3 candidates who are not defeated in round #1 – Hodges, Andrew, Samuels. The other 32 are all defeated, in every meaningful sense of the word.

      Candidates whose summed 1st + 2nd + 3rd preference votes are lower than the single #1 preference votes of the #1 vote leader, based on a comparison made after round #1 of tabulation, cannot possibly win the election – it is mathematically impossible.

      1. 1/2 the point

        You are correct that Cam, for example, is unable to win. But you’re ignoring the point that the sequence of allocating 2nd & 3rd choices can make a difference in the outcome, particularly in a closer race.

        1. Quite to the contrary, no sequence of allocation can…

          …turn a defeat into a victory.

          This is the confusion I sought to dispel.

          It might change the order in which the defeated candidates are eliminated, but it won’t change the winner.

        2. I’ll concede your point for candidates who…

          …are not mathematically eliminated at the outset.

  3. Why the rush?

    What, exactly, is detrimental when the vote count takes several days? We are not a parliamentary system where the majority takes over immediately. All winners take their seats in January. The eventual winner will need to make decisions concerning personnel. However, either the winning candidate knows those personnel choices already or will take several days to rest up and then begin the vetting process. The answer to “Why the rush?” is that the news media give the false impression that it is essential to have a quick count. The truth is that the news media inherently always want quick answers because that is the model of their business.

    1. What’s the benefit?

      Why shouldn’t we speed things up? If it’s immediately clear that 90% of the candidates are also-rans, why shouldn’t we get them out of the way quickly? There’s no point in having a dragged-out process. It’s not improving accuracy, is it?
      I’ve yet to hear any good reason why this wasn’t all completed Tuesday night.

  4. Once again…

    A decent idea botched by Minneapolis leadership. The surprising thing is that so many are surprised by this news.

  5. Ken Williams

    Hi Joel,

    I agree that the specification is ambiguous at best. But your analysis also is incorrect.

    Because the process, whatever it is, is guaranteed to always produce a winner, we know that it is “mathematically impossible” (in the general, not statutory, sense of the phrase) for anyone but a single candidate to win, and that single candidate is hiding among the numbers the moment the polls close. So by your logic, the first round should actually eliminate all but one candidate and the winner should be declared in the first round.

    What you’ve done is show one specific criterion by which we can quickly determine that a large group of candidates are known not to be that winner. But there are of course other additional ways to eliminate besides that one criterion, because we will eventually do just that and determine the winner.

    Therefore it is inappropriate for the statute to use the phrase “mathematically eliminated” without defining the precise rules for elimination, as they have done (or at least tried to do; I still find it ambiguous). Personally, I would much rather they had used a different phrase, like just “eliminated,” and then defined elimination.

    The only way your analysis could be saved is if you could prove that the order candidates are eliminated in has no effect on the naming of the eventual winner. Perhaps that’s true, I don’t know, but it’s much harder to show.

    -Ken
