Sedition at the US Capitol

Since the events at the US Capitol on January 6, dozens of those who invaded the building have been arrested and charged, and dozens more arrests will soon follow. The details of the invasion have been plastered across the headlines for the past week, so there is no need to go into specifics here. The story is now turning to just how these individuals will be punished. So far, most charges involve unlawfully occupying US property and weapons offenses, though there is increasing momentum to pursue a charge of sedition, at least against the ringleaders and primary instigators.

Sedition, or seditious conspiracy, is defined in US law thus: “If two or more persons … conspire to overthrow, put down, or to destroy by force the Government of the United States, … or by force to prevent, hinder, or delay the execution of any law of the United States, or by force to seize, take, or possess any property of the United States contrary to the authority thereof, they shall each be fined under this title or imprisoned not more than twenty years, or both.” While this was not an attempt to overthrow the government, it certainly was a successful effort to “prevent, hinder, or delay” the certification of the 2020 presidential election, and it was also an effort to “seize, take, or possess” the US Capitol. 

Michael Sherwin, the acting US attorney in Washington, has assigned a team to look into charges of seditious conspiracy. Politico quoted him saying, “…our office organized a strike force of very senior national security prosecutors and public corruption prosecutors. Their only marching orders from me are to build seditious and conspiracy charges related to the most heinous acts that occurred in the Capitol.” Sherwin described a deep investigation into the organization of the mob that attacked the Capitol. This only highlights the fact that a huge electronic trail pointed to this event in advance, and that despite this overwhelming evidence, security at the Capitol was light. That wealth of evidence will now be plumbed in an effort to support the most serious charges available, including seditious conspiracy. Sherwin also hinted at far worse crimes than those we now know of, claiming that when the full extent of the criminal activity is made public, “people are going to be shocked.”

There is resistance to the idea of pursuing seditious conspiracy charges, with critics citing past difficulties securing convictions. In 2010, a violent religious group in Michigan, the Hutaree militia, plotted to kill a police officer and then bomb his funeral in an attempt to spark an uprising; a judge ultimately dismissed the sedition charges. While the case here is much stronger, NPR quoted Andy Arena, the FBI special agent who led the Hutaree investigation, saying, “It’s hard to prove.” Arena is optimistic, but warns that sedition is a more difficult charge than many others. The problem in the case of the Capitol invasion will be establishing the extent to which it was a planned, organized attack rather than a spontaneous riot. There is ample evidence that many participants, though perhaps a minority, prepared in advance to breach the Capitol.

Previous successful convictions for seditious conspiracy include that of Sheikh Omar Abdel-Rahman, the mastermind of the 1993 World Trade Center bombing. It is not a charge that is often pursued, partly because it is so difficult to prove, and partly because the conduct it describes is, fortunately, rare in US history.

In the past, sedition charges were most often applied to speech, not action. Thus, for many, the idea of sedition conjures official attacks on First Amendment rights: efforts to suppress free speech by claiming that speech undermining existing power structures is in fact inciting people to insurrection. The charge also has a long history of being used to block efforts to expand opportunities for marginalized communities; sedition charges were used against abolitionists and free Blacks who sought to undermine slavery in the first half of the 19th century. With this difficult history in mind, some have argued that there are plenty of other laws and charges available to punish the Capitol invaders, and that declining to charge sedition would spare us from conjuring up this painful history.

Whether or not sedition charges are ultimately brought, it is encouraging to see such a strong response to the violence at the Capitol. It is to be hoped that this will discourage a repeat event at the inauguration, scheduled for January 20; such an attack could potentially be far worse than what happened at the Capitol. While security will be far greater at the inauguration, we can hope that many of those considering making the trip to Washington to disrupt the proceedings will be scared off by the thought of decades of jail time.

The History behind the Selection of Deb Haaland for Interior

On December 17, 2020, President-Elect Joe Biden announced he would nominate Deb Haaland, a Native American, to be the next Secretary of the Interior. Debate began immediately over Haaland’s fit for the position, but it could not detract from the historic occasion: she would be the first Native American to head the department responsible for the US government’s relationship with Native American nations.

The New York Times reported that Biden was leaning toward New Mexico Senator Tom Udall, before a concerted public opinion campaign convinced him to turn to Haaland. Udall supported the choice, saying, “President-Elect Biden has chosen an outstanding leader. She will undo the damage of the Trump administration, restore the department’s work force and expertise, uphold our obligations to Native communities, and take the bold action needed to tackle the accelerating climate and nature crises.” Conscious of both the immediate issues to be addressed and the historic nature of her selection, Haaland said, “It would be an honor to move the Biden-Harris climate agenda forward, help repair the government-to-government relationship with Tribes that the Trump Administration has ruined, and serve as the first Native American cabinet secretary in our nation’s history.” 

This is not Haaland’s first “first”: she and Sharice Davids of Kansas became the first Native American women elected to Congress in 2018.

Speaking specifically of the Interior’s role in handling the US relationship with Native Americans, Haaland told NPR: “Tribal consultation is basically nonexistent during this Trump administration. President-elect Biden has promised to consult with tribes, which I think will help immensely with some of the environmental issues that he wants to address.” What some see as an assault on public lands during the Trump administration, others see as simply the most effective use of lands that would otherwise remain unused, a massive collection of untapped potential. The debate comes down to one of value: is it the potential dollars to be made from those lands that matters, or the cultural, biological, and environmental value of leaving them undisturbed?

Haaland is an excellent choice to pursue Biden’s climate agenda. However, her fit for the role is only partially due to her being Native American. There is a danger in believing that a Native American will manage public lands well because of the mythical belief that Native Americans lived in harmony with their environment. This myth arose from the lush, bountiful ecosystems Europeans encountered when they arrived. They believed this was a sign of Native Americans living more in tune with nature, closer to nature, and therefore less civilized. To this day, this myth is used to justify Westward Expansion and the problematic relationship between the United States and Native nations. Rather than evidence of closeness to nature, the lush ecosystems Europeans encountered were the product of generations of Native Americans’ direct, purposeful manipulation of their surroundings to create an environment that plentifully produced the things humans found most useful.

The relationship between the United States and Native Americans did not begin with independence. Before they became Americans, Americans were British, with over 100 years of history of interactions with Native Americans. The same problems the British had faced confronted the new nation: the United States was growing, and the land it needed to grow into was already inhabited. There were a variety of ways the early United States could have chosen to address this problem, everything from extermination to a fully cooperative relationship.

The first Indian office was created within the War Department in 1824. Its mission was to solve the “Indian problem,” and its two methods were control and assimilation. Shortly thereafter, under President Andrew Jackson, the emphasis changed to relocation. In its various iterations, the office that would come to be known as the Bureau of Indian Affairs went through many changes in policy and emphasis. As with many such questions, these changes in approach reflected changes in government and in public opinion.

The focus of this office continued to be addressing the “Indian problem,” the problem that Native Americans posed to the United States, rather than addressing any of the problems the United States had caused for Native communities.

A comprehensive list of the treaties and agreements the United States made with various Native nations, then broke, would be many times too long to include here. Historians have built good careers addressing just pieces of this relationship.

Westward Expansion, and even the long tradition of the United States breaking treaties, brings up a fundamental question: Why doesn’t might make right? “Might makes right” can’t work when we see each other as equals, because if I can do it to you, someone else can do it to me. But if I place you into a different group than me, and cease to see you as an equal, I know that I can do as I please knowing that no one from my group will do the same to me, and no one from your group would be able to. This is how this country enslaved Africans and their descendants, and why that racial inequality exists to this day. This was also the attitude toward Native Americans. If they are not “us,” they’re “them.” Therefore, the rule of “might makes right” can be applied to them, without us having to fear that it could in turn be used on us. For many years, this was the choice that faced Native Americans: become “us,” or be destroyed. The fact that we were capable of destroying their cultures and their very lives was all the justification that was needed for the United States to subject Native Americans to decades of destructive policies. Whether by relocation, by assimilation, or by death, the United States sought to destroy Native American culture, and thereby destroy Native Americans as a distinct set of groups within American society.

It remains to be seen what Deb Haaland will bring to the Department of the Interior. It is important to not get carried away by the historic significance of her appointment, and attach unrealistic expectations to her tenure. Regardless of how it turns out, the choice to nominate a Native American is a huge symbolic gesture that hopefully indicates a new direction in US-Native relations, and a renewed commitment to addressing the struggles of Native communities.

The History behind the Headlines: Christmas

It’s the most wonderful time of the year! Christmas is here, a time for gathering with family, singing classic songs, and spending a lot of money on gifts. Though it has become a largely secular celebration, Christmas is officially a Christian holiday celebrating the birth of Jesus Christ. Christianity has been around for a while, and this particular holiday has transformed over the years; its roots can be found in pagan holidays even older than Christianity itself.

The Roman Catholic Church fixed the date of Christmas in the early 4th century. It was set for December 25, the traditional date of the winter solstice in the Roman calendar. We now follow the Gregorian calendar, and though we kept the date, it now falls a few days after the solstice.

Prehistoric civilizations were far more aware of seasonal and astronomical changes than we are. This was in part by default, since the absence of light pollution allowed them to observe the sky more clearly, and the absence of anything better to do made sky-watching a popular activity. It was also a necessity: seasonal changes determined the timing of hunting and agricultural activities. The importance of dates such as solstices and equinoxes ensured that they were marked, even celebrated, by prehistoric societies. Archaeological evidence of this importance can be found in sites such as Stonehenge. Monuments in the Stonehenge area date back to 8000 BC, and stone monuments aligned to the winter solstice date to between 2600 and 2400 BC. At around the same time, structures in Egypt, on the island of Malta, and elsewhere were built in alignment with the solstices.

In addition to needing to mark the seasons, there were, and are, cultural reasons for noting the solstices. The winter solstice, as the longest night of the year, has traditionally been a time to gather together, with family or as a community. With the darkness symbolizing that which is dangerous or evil, the longest night is best endured with company. It is a time to celebrate, as from this point the days will steadily lengthen. In prehistoric communities, the winter solstice was also a celebration of convenience, since this was often the time when excess livestock would be slaughtered so they would not have to be fed through the winter. Though the solstice is the shortest day, the deepest, coldest, most difficult months of winter were traditionally just beginning. The winter solstice was thus a time to gather as a community, feast, and make the final preparations for enduring the hardest months. Though the specifics of the celebrations varied across time and geography, solstice festivals were an important part of life in the northern hemisphere, especially in temperate zones.

The hugely influential theologian Saint Augustine, writing in the late 4th century on the choice of the solstice, explained: “Hence it is that He was born on the day which is the shortest in our earthly reckoning and from which subsequent days begin to increase in length. He, therefore, who bent low and lifted us up chose the shortest day, yet the one whence light begins to increase.” While Saint Augustine and others provided theological reasons for celebrating Christ’s birth on the solstice, it was also a choice of convenience. Since its inception, Christianity has been evangelical, in that its practitioners have sought to convert others and to grow the faith. Those whom Christians sought to convert already had their own beliefs, their own calendar of festivals and observances. When presenting Christianity to potential converts, it was expedient to be able to demonstrate that much of its religious observance was merely an adaptation, a shift in significance, of holidays and festivals already part of their culture. Many Christian holidays align with pagan festivals, and the celebration of these holidays was often a mixture of Christian and pagan practices. In the Roman world of early Christianity, the solstice was celebrated by a relatively new cult as the birthday of the sun god, Sol Invictus.

The Christmas of the Middle Ages would be unrecognizable to us today. Perhaps as a result of the blending of Christmas with the Germanic winter holiday Yule, Christmas became a drunken, raucous, carnival-like event. Understandably, for much of the history of Christianity, Christmas was considered a worryingly un-Christian celebration. Some felt it was insufficiently biblical, while others deplored the debauchery that had come to symbolize the holiday. Some, such as the Puritans, banned the holiday altogether.

It wasn’t until the 1800s that Christmas was transformed from a drunken orgy of things un-Christian into the family-centered religious celebration we think of as the archetypal Christmas. The transformation was centered in England, and was an intentional effort to reform Christian practice. At the forefront of this movement were authors, led by Charles Dickens. Pulling from Christian tradition, and inventing some aspects outright, they created a religious holiday centered on family time, charity, and social reconciliation.

These days, the Charles Dickens interpretation of Christmas is threatened again. While Christmas has not returned to the drunken promiscuity of the Middle Ages, it is increasingly secularized. While some are quite combative in their defense of the “traditional” Christmas, the version invented in the 19th century, there is increasing pressure to recognize other faiths’ holiday traditions. Christmas also joins the growing list of holidays under siege by capitalism, as it becomes more about gifts than family, religion, or charity. It remains to be seen where this latest evolution of the holiday will take us.

The History behind the Headlines: Negro Leagues

There was a wonderful story that popped up recently, then was quickly buried under the deluge of pandemic, election, and other news: Major League Baseball formally acknowledged Negro League players and teams as Major League players and teams, and incorporated Negro League stats into the MLB record books. I remember learning about the Negro Leagues as a kid; my brother and I even had some Negro League posters up in our room, though I can’t remember now where we got them. It did not introduce me to historic discrimination, but it did provide food for thought. These days we see a disturbing number of headlines about various continued strategies of racial discrimination. In one sense this is great, because it means that this ongoing problem is at the forefront of the national conversation. But the volume of discussion about how far we have to go can sometimes obscure how far we’ve already come. Additionally, the narrative of the Negro Leagues as a story of discrimination obscures the story of the Black community in America adapting to the constraints placed on it by White society, and creating something amazing in spite of efforts to hold it back.

Baseball itself took off just before the Civil War, in the 1850s. The National Association of Base Ball Players formed in 1857, and ten years later, just after the Civil War, it banned Black members. All-Black baseball teams had begun forming not long after White teams, though it would take longer for them to organize into leagues. Black teams played primarily exhibition games, often against White semipro teams, before regional leagues began to form in the late nineteenth century. One of the earliest successful leagues was the Southern League of Colored Base Ballists, begun in 1886.

Financial difficulties forced nearly every early Black team and league to fold, a fate also experienced by many White teams. Black baseball stayed alive in the form of barnstorming teams, who would travel around, playing exhibition matches against any team that would take them on. The golden age of Negro League baseball began in 1920, when eight teams joined to form the Negro National League, including some of the best-known teams, such as the Kansas City Monarchs. It competed with the Eastern Colored League, and the two leagues met in an annual World Series several times in the 1920s.

The perseverance required to make these teams and leagues successful is inspiring. Though they occasionally played White teams, Black teams’ fans were primarily Black, and in the late 1800s and early 1900s, Blacks in America had little disposable income to spend on a sports team. The players themselves never had any sort of financial security, since teams and leagues endlessly formed and folded. The Negro National League folded with the start of the Great Depression, and throughout the 1930s, any attempts to create leagues failed. Nevertheless, the players continued to play, in whatever venue they could. 

Despite the financial failure of the Negro Leagues, Black baseball players demonstrated that they were just as talented as Whites, sometimes more so. Owners of Major League franchises saw the Negro Leagues as a huge pool of untapped talent. World War II, and the involvement of Black soldiers, began to change viewpoints toward integration. With the 1944 death of avid segregationist commissioner Kenesaw Mountain Landis, Major League Baseball began a path to integration. At this point begins the well-known story of Branch Rickey recruiting Jackie Robinson. Robinson signed in 1945, even though integration was still two years away. Four more players were signed shortly thereafter, though none made the same splash as Robinson. Jackie Robinson debuted for the Brooklyn Dodgers in April 1947, despite fierce opposition from the owners of most of the teams in the league.

Integration was the end of the Negro Leagues. Major League teams began mining Negro League teams for talent, and it did not take long for the best Black players to move to the Majors. Already a dubious financial investment at best, Black teams and leagues disappeared by 1954, unable to maintain a following. 

The importance of the Negro Leagues to the development of baseball has been recognized in a variety of ways. Stars such as Ted Williams advocated for the inclusion of Negro League stars in the Hall of Fame. A separate (but equal) display was planned, but the uproar was sufficient that in 1971 the Hall of Fame relented, and Satchel Paige, a star of the Negro Leagues, became the first player inducted on the basis of his Negro League career. Further recognition included honorary inductions into the Hall of Fame and a postage stamp series. While it may seem like a small adjustment to some very old record books, perhaps the greatest step in recognizing the role of the Negro Leagues was the recent inclusion of Negro League players in Major League record books, officially removing the last barrier between the Negro Leagues and the Majors and completing the integration of baseball.

The History behind the Headlines: Pandemic!

Covid-19 has been in the headlines for almost a year now. Even when it was subordinate to election news or other stories, there was never a time when you could not find news about the coronavirus.

There are lots of competing theories about what Covid actually is, or what it isn’t. I was recently told that it is part of a campaign to soften us up for imminent takeover by the shadowy “New World Order.” Many believe that “they” have made it into a bigger deal than it is, though who “they” refers to varies depending on who you ask. Many who acknowledge that it is in fact a real virus nevertheless feel that it is an opportunity seized by one group or another to further certain agendas. Meanwhile, health care professionals scramble to contain the pandemic, imploring people to take simple steps, like wearing a mask, and to make more serious sacrifices, like limiting contact with other people.

The most common comparison is to the 1918 flu, commonly called the Spanish Flu. The Covid pandemic has not yet reached the same proportions as the Spanish Flu, which infected an estimated 500 million people worldwide and caused 50 million deaths, 675,000 of them in the United States. So far Covid has infected about 73 million people worldwide, with 1.6 million deaths, though the numbers for the United States are approaching 1918 levels much more quickly than the worldwide figures.

The Spanish Flu was devastating. The 500 million estimated infected was approximately a third of the population of the world. It struck at the end of World War I, compounding the misery. Though the earliest cases were reported in the United States, then in European countries, World War I censorship prevented widespread discussion. It was only in neutral Spain that the sickness was well-publicized, leading to the misconception that it began there, or at least was especially bad in Spain. A combination of environmental factors and the malnutrition and poor hygiene resulting from the war led this flu to spread more quickly, and to cause more deaths, than usual, though there was no evidence that the strain was more virulent than others. The lack of antiviral or antibacterial medicines meant there was no effective treatment. Outbreaks continued into 1920.

Without question, the deadliest pandemic in world history was the wave of diseases, especially smallpox, that virtually depopulated North America, and also killed millions in Central and South America, following the large-scale contact between residents of the Americas and invading Europeans beginning at the end of the 15th century. The scale of death from the Columbian Exchange is contested, but some estimate that 95-99% of the indigenous population of North America died from disease, or from the warfare, malnutrition, and other effects of the pandemic. Even more than the percentage of the population killed, the actual number of deaths is hotly contested, and highly politicized: the higher the number, the worse the tragedy. Those who wish to justify expansion by Europeans, then by Americans, promote lower numbers, while those who do not want to see native cultures erased or trivialized point to evidence of much higher populations. Estimates of the total deaths directly caused by European contact range from 50 million to 100 million people.

The only rival to the depopulation of the Americas, not only in deaths but in impact on world history, is the Black Death. The first recorded wave of bubonic plague, the Plague of Justinian, struck Europe and the Middle East between the 6th and 8th centuries. Some cities saw death rates of up to 10,000 a day. It is difficult to arrive at firm numbers, but it is estimated that between this plague, its societal side effects, and other issues in Europe, the European population dropped by 50% between 550 and 700. Terrible as it was, it was completely overshadowed by the return of bubonic plague in the 14th century. Known as the Black Death, it killed somewhere between 75 and 200 million people across Europe and Asia.

The Black Death ushered in the modern era. The devastation in Europe was so nearly total that it prompted a radical change in social and economic structures. Watching nobility die just as easily as peasants led common folk to question rigid hierarchies. Europeans were forced to adapt, and a centuries-long stagnant period was broken. The societies that emerged from the Black Death were much more dynamic, and it is no coincidence that the era of exploration and colonization followed soon after. The dynamism continued into the industrial revolution, the scientific revolution, social revolutions, and into the world we have today.

Equally important to world history, though much less well known, is the impact of the Black Death on Asia, especially on China. We have less reliable statistics for Asia, but there is every reason to believe that infection and death rates were just as high there as in Europe. The plague struck at a time of weakness and division in China, and, coupled with the invasion of the Mongols, it completely changed the course of Chinese history. While the Black Death pushed Europe toward a more dynamic society, in China these events prompted a fundamental turning inward, with the result being a return to a more staid, traditional approach. Before this period, the Chinese tribute fleets of Zheng He sailed as far as Madagascar. Under the return to conservative, inward-focused Confucianism, China destroyed its fleets. Thus, when Portuguese explorers rounded Africa into the Indian Ocean, they found nothing more formidable than a local fishing fleet, where they once would have encountered fleets on a scale that would not be seen again until World War I.

It remains to be seen what effects the Covid pandemic will have on our society. Improvements in healthcare, and the imminent vaccine, ensure that the scale of this pandemic will not approach the 1918 flu, much less the more devastating Columbian Exchange or Black Death. But this does not mean that we will go back to normal. Will Covid-19 be seen as a moment when something fundamental in our world shifted? Or will we forget about it soon after it’s gone? Some industries, such as movies, may have to change completely. Many of us are working from home, and we have yet to see how many will never return to the office, having demonstrated the efficiency of remote work. The pandemic may also change the way we think about and approach communicable diseases, even the common cold. This will not be a world-historical landmark like the Black Death, but still, we may never see the pre-Covid world again.

The History behind the Headlines: Presidential Pardons

One of the fascinating ongoing stories of the Trump administration is the frequency with which his associates have found themselves convicted of crimes. One of the first was former national security advisor Michael Flynn, who pleaded guilty to lying to the FBI in the course of the investigation into Russian interference in the 2016 US election. The president, via Twitter, announced on November 25 that he had granted Flynn a full pardon. Previous Trump pardons include that of controversial former Arizona sheriff Joe Arpaio. He also commuted the sentence of advisor and friend Roger Stone, who was likewise convicted in the course of special counsel Robert Mueller’s Russia investigation. While the president can use the pardon at his or her discretion, the pardon of Flynn raised some eyebrows. Critics, especially high-ranking Democrats, have suggested that Trump has used his pardon power to reward those who lied on his behalf. More pardons are expected in the final weeks of the Trump presidency, many of which will surely raise the ire of those eager to find fault with the president’s actions. Trump’s pardons, while unsavory, pale beside some past pardons.

The earliest high-profile pardon was issued by George Washington, who pardoned leaders of the Whiskey Rebellion, a tax protest led by veterans of the Revolutionary War. In need of funds to pay war debt, the federal government had taxed the production of spirits. Whiskey was rapidly growing in popularity, and farmers on the western frontier often converted surplus grain into whiskey. Believing themselves to be fighting for the principles of the Revolution, especially against taxation without sufficient representation, farmers violently resisted the new tax. Despite thousands of participants, many of whom were captured, only two were convicted of treason. Washington pardoned them, wrapping up this early challenge to federal authority.

The most controversial use of the pardon came at the end of the Civil War. It was a time of intense division. Though the war was over, and slavery with it, the strong feelings and beliefs that had led Americans to take up arms against one another did not disappear. By definition, every Confederate survivor had committed treason, and was thus subject to execution. While only the most bloodthirsty northerners sought capital punishment, there was certainly strong sentiment in favor of harsh punishments. Abolitionists and others with incentive to see southern society destroyed saw this as their moment to make the defeat a total one, to completely impose a northern way of life on the south.

Others, especially President Andrew Johnson, wondered how best to move forward in what was still a united nation. Despite having been defeated, southerners still held onto their pride, and to insult that pride by adding punishment to defeat could only further alienate them. After four years of war, many preferred peace over punishment. It is also important to bear in mind the racist motivations for reconciliation: slaves were freed as part of the war, but very few in the north actually cared about the fate of the freedmen, and fewer still wanted responsibility for them. Who better to take on this responsibility than those who already had it? The effect on Blacks of allowing former Confederates to walk free was easily secondary to the desire for peace and reconciliation.

Johnson declared a general amnesty, exceptions to which included Confederate government officials and those who had left the federal armed services to join the Confederacy, among others. Those who did not qualify for the general amnesty could still request a special pardon, and the Johnson administration devised an oath each applicant was required to swear in order to receive one. Johnson’s sweeping pardons succeeded in creating a path to a reunified nation, but papered over many of the underlying issues, especially race relations, some of which continue to divide Americans today.

The most notorious presidential pardon was certainly Gerald Ford’s pardon of Richard Nixon. Nixon had not yet been convicted of any crimes, but had been forced to resign in the wake of the Watergate scandal. Ford unconditionally pardoned Nixon of any and all crimes he may have committed against the United States while president, despite knowing the pardon would be unpopular. As with Johnson’s pardons of Confederates, Ford claimed that his pardon of Nixon was in the best interest of the nation. He also justified the pardon by citing the 1915 Supreme Court case Burdick v. United States, which held that a pardon carries an imputation of guilt, and that its acceptance carries a confession of guilt. While Nixon’s crimes were not particularly heinous, he was the first, and to this day the only, president to have been caught “red-handed” in clearly illegal conduct while holding the office. It was a tremendous offense against Americans’ ideas of the integrity of the office of the president, and it was perhaps Nixon’s apparent lack of respect for the office that angered people more than the acts themselves. The prestige of the office, and the faith Americans placed in it, took damage that has not yet been repaired.

The power of the presidential pardon may soon see its ultimate test, as there is significant speculation that Trump may pardon himself before leaving office. Debate among experts over the legality of such a move is often overshadowed by distress and disgust that a president would even consider pardoning himself, a notion that seems more at home among third-world dictatorships than in the United States of America. Trump’s allies rightly point out that there is no limitation placed on the power to pardon: nothing in the Constitution or in any legal precedent states that a president cannot pardon himself. But since the president can only pardon crimes against the United States, and has no power over state crimes, Trump remains vulnerable to his current legal trouble in New York and elsewhere.

The History behind the Headlines: Contested Elections

President Donald Trump and his supporters refuse to acknowledge the reported results of the 2020 presidential election, seeking to turn this into another contested election. Many of us have not-so-fond memories of Florida’s hanging chads from 20 years ago. But while this moment feels tense, with multiple accusations of fraud, it is nothing compared to the most sharply contested, and most fraudulent, election in US history: the 1876 Hayes-Tilden contest.

In the two weeks after the election, Trump posted over 300 tweets about it, including both claims that he had won fairly and accusations that the election was fraudulent. His legal team, led by Rudy Giuliani, has launched numerous legal actions, none of which has been successful. Hearing Trump’s claims of fraud and a stolen election, I can’t help but be reminded of the election of 1876.

Rutherford B. Hayes really should never have become president. Initially, the most likely Republican candidate to succeed Ulysses Grant was Grant himself, despite his having already served two terms. After Grant decided not to run, the next obvious choice was Congressman James G. Blaine. Late in the campaign, Democrats in the House of Representatives opened an investigation into Blaine’s dealings with a railroad. Damning letters were produced, and Blaine gained no popularity by securing the letters himself, then refusing to hand them over. Nevertheless, he still entered the Republican convention as the frontrunner, with five others, including Hayes, considered serious competitors. On the first ballot Blaine received 285 votes to his closest rival’s 124, but was still short of the 378 needed for a majority. Subsequent ballots saw his tally increase, and the gap grow. Hayes sat low in the field, but, significantly, his tally increased with each ballot. If the anti-Blaine faction could agree on a compromise candidate, their combined votes would reach a majority. That compromise came in the form of Rutherford B. Hayes: while the final ballot saw Blaine rise all the way to 351, Hayes jumped from 113 to 384, narrowly winning the nomination.

Hayes faced New York Governor Samuel J. Tilden in the general election. Tilden won the popular vote, and had 184 electoral votes, just one shy of a majority, to Hayes’s 165; in Florida, Louisiana, and South Carolina each party claimed victory, while a single elector from Oregon was ruled out and had to be replaced. Tilden needed only one state, or even just the one elector from Oregon, to win the presidency, while Hayes needed to run the table. As a Democrat, Tilden performed very well in the south, comfortably winning each former Confederate state, except the contested three, which happened to be the final three states still under federal control as part of Reconstruction after the Civil War. By the morning after the election, nearly everyone, even Republicans, had acknowledged Tilden’s victory, believing the final counts to be a mere formality. However, John C. Reid, managing editor of the hugely influential New York Times, telegraphed the Republican managers in each of the three states, informing them that all was not lost, and asking, “Can you hold your State?” Each dutifully reported a Hayes victory, and the Republicans announced that Hayes had won.

One can only imagine the chaos of these vote counts. Unlike today’s electronic records and instant communication, ballots were paper, and news traveled slowly. President Grant ordered federal troops in the three southern states to maintain order while the vote counting proceeded. Ultimately, two contradictory certified returns were sent to Washington from each state, and there was no guidance in the Constitution for how to address the crisis. Some Republicans wanted the Executive to decree the results and defend them with the military. The controversy dragged on into the new year, until Congress created a fifteen-member committee composed of seven Republicans and seven Democrats. The fifteenth member was supposed to be an independent, but at the last moment he was unable to serve, and the committee chose an eighth Republican in his stead.

The hearing dragged on. Democrats demanded that the committee investigate the legitimacy of the votes themselves, while Republicans insisted that its only purpose was to determine which of the conflicting returns was the one properly certified by the state board. Of course, in the end, it came down to Florida. The Republican-led electoral commission rather blatantly threw out enough Democratic votes to give Hayes the majority. Even a cursory investigation by the commission would have exposed the fraud. Yet, instead of standing firmly behind their demand for just such an investigation, Democrats calmly allowed each contested return to be ruled in favor of the Republicans, each time by a solid 8-7 partisan vote. Each decision was then ratified not only by the Republican-led Senate, but also by the Democratic House. Why did Democrats allow it?

It is at this point that the story moves from official, documented, public information to the realm of shady, behind-closed-doors deal making. While there is no hard evidence to support the theory, it is widely believed by historians that Democrats allowed the committee to rule in favor of the Republicans in exchange for an end to Reconstruction. The removal of federal troops from the south was the biggest point of the compromise, but it also included cabinet appointments, a transcontinental railroad, and, in a betrayal that would shape race relations forever, the implicit promise of Republicans not to interfere with White southern Democrats’ treatment of newly freed Blacks. Rutherford B. Hayes, beneficiary of two compromises, neither of which could be said to reflect the will of the voters, was inaugurated on March 5, 1877.

The History behind the Headlines: Thanksgiving

It’s that time of year again! We gather with family (smaller gatherings this year), stuff our faces, watch some football, and maybe, if it’s still in the family tradition, talk about what we’re thankful for. But in recent years, there has been a bigger push, especially in our schools, to dig a little deeper into why we gather, what we actually celebrate, and what really happened back in 1621. This has led to some pushback, especially among lawmakers. Senator Tom Cotton of Arkansas recently claimed liberal “charlatans” were rewriting history, stating, “Too many have lost the civilizational self-confidence needed to celebrate the Pilgrims.”

We all know that the Pilgrims came here for religious freedom, right? What many of us don’t know is that they actually fled Europe in search of a place where they would be permitted to practice extreme religious intolerance. Yet those, like Cotton, who do not wish to see the Pilgrims’ good name besmirched do have a point. The voyage across the Atlantic required tremendous courage. When they were forced to land on Cape Cod, rather than at their intended destination further south, they took the extraordinary step of drafting and signing the Mayflower Compact, which organized them into a body politic and became the document by which they governed themselves. Despite a high mortality rate, the colonists survived, with the help of the Wampanoags. This story of gritty survival against the odds is a heroic tale, one not undeserving of celebration. But the Pilgrims were only one group at that famous harvest feast in 1621, and relations between the groups were not as friendly as the traditional portrayal would have us believe.

The Pilgrims did not discover happy Native Americans living in a paradise; they discovered a shattered community still reeling from a deadly plague, most likely smallpox. The Wampanoags did not aid the Pilgrims out of simplistic kindheartedness, but out of a knowledge that their world was changing rapidly, and a new ally might come in handy. Even as their numbers dwindled due to poor nutrition and inadequate shelter in the cold, the new arrivals were still able to mount armed expeditions against other Indian groups on behalf of the Wampanoags. Their militarism increased as new colonists arrived. Weakened by sickness and intertribal conflict, many native communities abandoned their villages in the face of the colonists’ aggression. A decade after they landed, the colonists embarked on an extermination campaign, burning villages and killing hundreds of Pequots. Governor William Bradford announced that from then on, Thanksgiving would be a celebration of “the bloody victory, thanking God that the battle had been won.”

So perhaps we have the history of what we celebrate a bit wrong, but that doesn’t mean we shouldn’t celebrate, right? After all, Thanksgiving is about giving thanks! But is it? Our Thanksgiving is a variation of a traditional harvest festival, celebrating the season’s bounty before the hard winter. We’ve come a long way from the celebration of a harvest. Abraham Lincoln fixed the date of Thanksgiving as the final Thursday in November in 1863, in celebration of Union victories in the war; Congress moved it to the fourth Thursday in 1941. Now, American Thanksgiving is more often described as “food, family, and football” than as an opportunity to give thanks for a harvest or a victory. In recent years it has ceased even to hold that meaning, as Black Friday shopping deals have steadily encroached on the holiday. At least we can all be thankful for the great deals we’re getting.