Dr. Ben Carson, who was and may still be one of the leading candidates for the presidential nomination of the party that gave us Abraham Lincoln, has now simultaneously affirmed and repudiated his controversial Sunday comment that he would be opposed to a Muslim becoming president.
He told Sean Hannity that he stood by his statement because (although he didn’t say so Sunday) when he said he would be opposed to a Muslim as president, he meant to refer only to a Muslim who was proposing to impose sharia law on the United States. (Put me down as opposed to that also, although I don’t think a candidate would get too far with the sharia platform.) Carson said he would feel the same about a Christian who was proposing to impose a Christian theocracy on America.
If that’s what he was trying to say on “Meet the Press,” he failed to make it clear at the time. What he said on “Meet the Press” made no reference to the dangers of a Christian theocracy, nor even an Islamic theocracy. What he said was: “I would not advocate that we put a Muslim in charge of this nation. I absolutely would not agree with that.”
It’s a little soon to be sure, but the kerfuffle seems to have generated new enthusiasm for Carson among Republicans.
How we got here
The 2015-16 race for president is off to such a strange, unprecedented and, dare one say, absurd start that it got me thinking about how we got here and how what we have now might look to earlier generations.
In the first half of U.S. political history, presidential aspirants did not declare their candidacies, nor did they campaign in the modern meaning of the term. They made no public appearances, issued no statements of their policy positions and were basically expected to humbly evince no interest in whether they would be chosen to serve.
For about 40 years after the U.S. Constitution was ratified, members of Congress held a caucus to choose the candidates of their parties. No opportunity for ordinary citizens to participate. Just the congressional caucuses.
In many states, in the early days, there were also no popular elections. So the state legislatures (as the Constitution permitted) could and did choose the Electoral College electors who would cast the only meaningful votes.
(In the first four elections, the electors would each vote for two men, without specifying that one of them was for president and the other for vice president. Whoever was named on a majority of ballots would become president and whoever finished second would be vice president. All this was in accord with the now-unfamiliar logic of the Constitution’s framers.)
The framers, operating at a time when there was no national media and no national party structures, made this whole process up based on nothing that had ever existed. The job of a presidential elector — unlike today when it is merely ceremonial — was to make at least the first cut at who might be president. The framers probably expected that in many elections (after George Washington passed from the scene) no one candidate would get a majority of electoral votes and the choice of president would be made in the U.S. House on a one-state, one-vote basis. That, by the way, is still in the Constitution and would be used if a presidential election was thrown into the House. But only two elections have been decided in the House and none since 1824.
Enter the two-party system
During the 1830s, the two-party system became reasonably well-established, and by 1840 the process of choosing the party nominees had been transferred from members of Congress to party conventions, to which each state sent delegates to choose the party tickets. But the general public still had no explicit role until after the party insiders had made their nominations. There was nothing resembling primaries to give ordinary Americans a say in who would be the nominees in the fall.
By then (the first half of the 19th century) most (but not all) states held popular elections in which voters could choose between the major party nominees. Voting in the early days was generally limited to white male adult property owners, so most Americans actually couldn’t vote. Property requirements had been largely eliminated by 1828, and women got the vote in 1920, but there has still never been a presidential election in which more than 50 percent of the population actually voted.
A few states were slow to hold popular elections at all. South Carolina, one of the original 13 states and the last holdout, did not conduct a popular election for president until after the Civil War. The Constitution still does not explicitly require states to hold popular elections; it requires only that each state’s legislature, by whatever method it prefers, provide for the choosing of electors (although it would certainly cause a stir if any state decided to skip the electorate in choosing its electors).
Behind the scenes, ambitious politicians did what they could to advance their chances, but from the time of George Washington until well past the time of Lincoln, it was considered unseemly to act like you wanted the office and immodest to act like you felt qualified for it. There were no debates and nothing that we would consider campaigning by the actual candidates. There were also no primaries, so the process of becoming the nominee of the parties was a matter handled almost entirely within the small minority who were likely to be delegates to the national nominating conventions.
Lincoln, in keeping with tradition even after he was nominated by the 1860 Republican convention, stayed home in Springfield, Ill., and did not make a single campaign speech.
By the time of, let’s say, William Jennings Bryan, the Dem nominee in 1896 and seemingly the breakthrough figure in this regard, it became possible for candidates to travel by train, in what was called a “whistle-stop” campaign, airing their views to crowds around the country. The 1896 Republican nominee, William McKinley, stayed home in Canton, Ohio, but spoke from his front porch to crowd after crowd brought in to hear and see him. He is said to have addressed up to 750,000 visitors this way in what was dubbed the “front porch” campaign.
Note here that McKinley won by a solid margin. But Bryan won in the sense that he broke the barrier to direct campaigning by presidential nominees, and it stayed broken. Try to imagine a candidate today who declined to travel the country campaigning.
Presidential primaries take off
The next big leap toward the modern way was the rise of the presidential primary to allow voters a say in the choice of their party’s nominee. Primaries got going in earnest in 1912, the year former President Theodore Roosevelt attempted a comeback against his own chosen successor, William Howard Taft. Roosevelt won more primary votes, more primaries and more primary-chosen delegates, pretty much establishing that he was the choice of the Republican electorate. But most states didn’t have primaries, and Taft, with the powers of an incumbent president, won the nomination (although after Roosevelt formed his own insurgent third party, the election went to Woodrow Wilson — who, by the way, was nominated by the Democratic Convention on the 46th ballot).
The primary continued to spread to more states, but many states still chose their delegates without much input from the general public. In many cases, the primaries were “winner-take-all,” which meant that it did candidates no good to participate in states where they didn’t believe they could finish first.
It was possible, even common, for someone to win the most primary votes and the most delegates from states that had primaries and still lose the nomination. In 1952 on the Democratic side, Tennessee Sen. Estes Kefauver won 64 percent of all votes cast in the 16 states that held primaries and still lost the nomination to Adlai Stevenson, who had won less than 2 percent of all primary votes.
(1952 was also the first year for televised campaign advertising, although most U.S. households didn’t yet have TV sets. If you want a laugh, this link will get you the first-ever such ads, for Dwight D. Eisenhower and Adlai Stevenson. But beware, you may spend a chunk of your day watching old political TV ads.)
For much of the mid-20th century (until 1972), the nominating system was a mixture of primaries and other means of building delegate strength. Many serious candidates did not compete in primaries or, more commonly, would enter a select few, hoping to demonstrate their appeal to ordinary voters and thereby strengthen their argument for the nomination.
This system also had a wrinkle that makes little sense from today’s perspective: a local political figure who was not really running for president would run in his state’s primary and win, and therefore lead the state’s delegation to the convention and therefore be a serious player in determining the nominee.
In 1960, for example, Sen. John F. Kennedy entered and won almost all the 16 primaries held. But in Ohio, Gov. Michael DiSalle ran as the “favorite son” candidate, which kept everyone else off the primary ballot so that DiSalle won 100 percent of the vote and was in a position to negotiate with the actual presidential candidates and award Ohio’s delegates to the candidate of his choice.
McGovern-Fraser rule changes
After the violent and disastrous Democratic convention of 1968, the Dems convened a commission to propose rule changes. (Minnesota’s Don Fraser was one of the chairs, and the commission is generally known as “McGovern-Fraser.”) Directly or indirectly, the commission’s rule changes led states holding primaries or caucuses to make public participation more possible and did away with the “winner-take-all” rules, which meant that a serious candidate had to compete in pretty much every state and could benefit from doing so even if he or she didn’t finish first.
The impact of McGovern-Fraser (and most of the rule changes soon spread to Republican nomination campaigns as well) finally put in place the system that has since become familiar, starting with the 1972 nominating cycle. 1972 was the first year that the Iowa caucuses became the kickoff event (replacing the New Hampshire primary).
Before 1972, candidates would often wait a while to get into the race. But the new system almost requires that you start accumulating delegates in Iowa and New Hampshire. In fact, the race for money, for staff, etc., pretty much requires that candidates make their intention to run for president clear almost two years before the election. In the old days it would have been considered unseemly for a candidate to start campaigning too early. In the 1960 campaign cycle, for example, John F. Kennedy didn’t announce his candidacy until January of 1960 (and won the New Hampshire primary two months later). In other words, the date on which JFK declared his candidacy was months closer to Election Day than we are today — with 20 candidates in the race and many of them running for six months or more already.
Like JFK, Barry Goldwater, the Republican nominee in 1964, announced in January of 1964. Richard Nixon in 1968 waited even longer, announcing in February of that year, just a month before the New Hampshire primary (which he nonetheless won).
The process has gotten longer, stranger, more expensive, more dominated by TV ads, more democratic (compared to the earliest days), more negative and more poll-driven. And the parties continue to tinker with the structure, often tweaking it between cycles to deal with issues that have arisen in the most recent one.
The New York Times last weekend analyzed the new rules that Republicans introduced ahead of this year’s race: fiddling with the system for choosing delegates, trying to shorten the intra-party portion of the race and trying to protect the eventual nominee from having to move so far to the right to win the nomination that it becomes difficult to tack back to the center to win the general election.
But, the Times surmised based on the race so far, “as the sprawling class of 2016 Republican presidential candidates tumbled out of their chaotic second debate last week, it was increasingly clear that those rule changes — from limiting the number of debates to adjusting how delegates are allocated — had failed to bring to the nominating process the order and speed that party leaders had craved.”
Back to the drawing board.