How does History?
- Michael

- Mar 13, 2019
- 9 min read
- Updated: Mar 15, 2019
One (slightly) academic essay. Just one. We don't often ponder what we mean by "history." I propose four ideas, all of which will surface by turns on this site.

How do you think of "history"?
I recently read an academic blog post ([Re]Collection: Central Michigan State) by professional librarian Angelo Moreno: "Nostalgia is not History." Indeed it is not!
TL;DR: elite high school students in Mexico City expressed nostalgia for the Porfiriato. Presumably, the author's concern was that comfortable social status blinded these students to the complex realities and unseen underbelly of this troubled time. The historian's task is to hold her students (i.e., contemporaries) to account, to insist that each "check her privilege," as we say nowadays, and to examine the past with an eye to cultivating justice and peace in the present and future. In a word, the historian's work is moral.
This, of course, is one perspective. It is high-minded and noble.
And I don't entirely disagree. The integrated--even "integralist"--thinking in this vision has a familiar ring to it. It forms the basis of the Catholic theological and liturgical training I have experienced throughout my life.
Apart from the obvious religious ends of this study, theology and philosophy demand definitional precision, procedural rigor, and a clarity of thought unparalleled outside the disciplinary austerity of the scientific method. These tools form the foundation of the Western intellectual tradition and still serve us well today when we choose to wield them. They benefit anyone.
...which is why my reflection on the notion of "history" comes from a philosophically-oriented theologian.
Writing in 1955, Paul Tillich observed that religious faith requires the whole person: "will, knowledge, and emotion...the trinity of forces which operate in a complete personality." English musicologist Erik Routley opens his 1968 book "Words, Music, and the Church" with this idea. But because Routley possessed an enormous intellect, he noticed something interesting and convincingly took it in a totally different direction.
Routley's Summary
Routley observed that when the order is reversed--emotion, knowledge, will--the list roughly corresponds to the general preoccupations of artists over the 150+ years leading up to 1968. He proceeds to make his argument with excellent musical examples, analyzing their compositional style through this lens.
Since music is a cultural practice, I suspect that the "30,000-foot" trends surrounding it would apply to other contemporaneous human endeavors...endeavors like literature, or the writing of history.
At the risk of grossly oversimplifying what is already a generalization, here is a summary:
1) Romanticism is present-centered; this is not far from saying that it is the culture in which emotional response is paramount...Romantic writers, philosophers, and artists devote their positive attention and statements to the present; the past and the future are illustrative--they are not objects of study in their own right.
Romantics rarely felt burdened by facts, especially when the facts weren't expedient. Routley highlights Sir Walter Scott's fiction: "it was nothing to him that his history was not precise." It was all about evoking a feeling about the *present* Scotland by whatever means necessary--including the construction of dubious pasts. The facts about the historical King Arthur were immaterial. We talk a lot about "inconvenient truths" and "fake news" nowadays. Neither bothered the Romantics.
2) With Positivism, the pendulum swung hard in the other direction. This was the era that begot the cult of the "primary source." Routley describes it as the era of "knowledge," characterized by a "thirst for objective historical precision." Musicians and scholars of all sorts spent decades poring over ancient, forgotten sources, carefully reproducing and reprinting mountains of buried data with an eye for precision and accuracy. Not surprisingly, "primitivism is the mark of the phase of knowledge. You have no business, say its exponents, to be ignorant of the past; but neither should you see the past only in terms of your own age" (a dig at the Romantics!). Like every other phase, this mentality had its Achilles' heel:
Knowledge sought to discipline emotion. But there was a fatal weakness here. The exclusive cultivation of knowledge, with its past-centered habits, can lead to a kind of death wish, a failure to come to terms with the present, a negligence about the future.
Which set the stage for the next reactive trend...
3) Modernism looks forward. "The idea that priority must be given to what is past, or to what is given (in the philosophical sense) has yielded place to the idea that what matters now is the future." The will to effect change matters more than knowledge or emotion as such. Modernism has no "sacred cows." Routley indeed notes that "Existential approaches to philosophy are equally anti-romantic and anti-historical."
As Routley was writing in the late 1960s, we now see that he was perched on the threshold of yet another trend--one that took hold and has hardly released its grip since:
4) Post-modernism (or really, Post-structuralism). Routley almost gets sucked in, grazing the tip of the proverbial iceberg when he summarizes that
Linguistic philosophy examines the properties of words, not for what emotions they arouse, but for what they communicate--what happens when they move in time towards the future. It is neither a dogmatic system nor a private experience to which the linguistic philosopher refers the words he [sic] analyzes; it is to the ongoing community of speakers and hearers.
Communication = message sent and message received by a community of communicators. It seems simple, innocuous--but it was more dogmatic than Routley realized. What Routley doesn't describe is how greedy the apologists of this theory became. Linguistic communication was not merely *a* way to transmit reality; it was *the* one and only way to mediate reality. When this outlandish claim combines with the premise of inescapable subjectivity, it follows that meaning becomes something wholly constructed (perhaps "negotiated" between parties) rather than something even *possibly* a priori or objective.
This idea crystallized in Continental philosophy and jumped "the pond" to the American academy, initially appearing as philosophical Post-structuralism and subsequently in English departments as "the New Criticism." It has since found its way into virtually every liberal art and soft science, spawning a plethora of new sub-disciplines. Because subjectivity spans a domain as vast as people are numerous, the vantages are virtually limitless--hence, the rise of interdisciplinary and intersectional analyses.
At best, these new approaches offer fresh, unseen angles on age-old problems. At their worst, they mask a fundamental absence of procedural rigor. (As a decorated Notre Dame professor once quipped: "Interdisciplinary scholarship often seems to be heavy on the 'inter-' and light on the 'discipline'!") Piling up anecdotes cannot substitute for precise, systematic thought. Juxtaposing vastly incongruous ideas walks a fine line between the brilliant and the absurd. It doesn't surprise me that we lampoon academia in our era. Even the characteristic jargon of scholarship does not escape satire!
If academic prestige peaked in the 1960s--picture those white-coated professors, high priests of the Space and Nuclear Age who commanded social capital and clout--it has bottomed out in the early 21st century. Academia has rendered itself more or less irrelevant, thanks to Post-structuralism.
And I doubt that Post-structuralism will wane anytime soon, because it works like a cleverly concocted semantic parlor trick: the more you try to solve it, the more it flummoxes you (think: Chinese finger trap). If Modernism seemed robust (with all its exuberant post-war optimism), Post-structuralism feels like a mopey jog on an epistemological rat-wheel. I have my Inescapable-Subjectivity-and-Bias and you have your Inescapable-Subjectivity-and-Bias, so let us volley our vantages and see if something mutually agreeable results from the exercise.
I digress.
So, does any of this apply to how one conceives of history...to how, for example, people have done genealogy?
I think so.
Four sorts of genealogy
Early 19th-century men [sic] of leisure in New England passed the time writing the stories of their supposedly illustrious Mayflower forebears. No doubt, sources were consulted, but a lot of malarkey was mingled in too. Romantics saw the past through "rose-colored glasses," so it is no surprise that the squalid reality of a 17th-century dissident religious sect in a brutal climate underwent cosmetic enhancements (e.g., the First Thanksgiving story). Some genealogies even contained fake lineages, tying middle-class Puritan peasants to illustrious lines of Plantagenet blood stretching back to William the Conqueror or Charlemagne. Fake news. But America didn't have royalty, and this was how the bourgeois scions of aspiring captains of industry bolstered their legitimacy in a rough-and-tumble young Republic.
From the late 19th century through the early 20th, countless individuals, civil offices, and ecclesiastical entities piled up vast stores of information--of knowledge. I suspect that the scope of this work was unmatched in human history up to that time. Technology helped: typewriters, carbon copies, mimeographs, microfilm. What was the particular utility of any given piece of information? Who knew! But more importantly: who cared?! This was when the LDS Church began duplicating and archiving other institutions' records in earnest. Local historical societies and government agencies churned out chronicles and archival indexes by the thousands. During the Depression, the federal government sent WPA workers to interview elderly citizens and copy headstone engravings in cemeteries.
Data sets. Surveys. Minimal spin. My grandmother was a virtuosic genealogist of this sort. She kept scores of binders, stuffed full of funeral cards spanning 60 years. She could rattle off the names, birth dates, and death dates of most people in her county and then articulate their various connections to each other. It always impressed me. It was a history of nouns, utterly devoid of verbs and adjectives, much less adverbs.
Modernism in genealogy? Here's an example, albeit more contemporary than the philosophical trend: "resistance genealogy." It definitely uses facts, and it often elicits emotions, but it primarily aims to improve the world by exposing political hypocrisy and forcing a change of heart (and/or policy). Noble aims aside, I don't know that this works well. It boils down to a labor-intensive "gotcha game"--and people are notoriously averse to repentance when they are shamed or humiliated into it. "You catch more flies with honey than vinegar..." Nevertheless, it's out there.
Post-structural, Post-modern genealogy? As much as I recoil at the havoc wreaked by pseudo-scholarship in the humanities, I use tools from this palette now and then. In fact, I confess a soft spot for it. One of my most influential academic mentors--a maverick medievalist--describes herself as "a postmodernist and a medievalist." Incongruous? Possibly. But I think the two meet in the idea of "writing history."
Sometimes sources are missing. One could fill the hole with BS (Romantic), or one could attempt to span the gap with inference (logical, scientific). One could merely step over the hole and move on (Positivist/knowledge). Sometimes there is a dizzying abundance of sources. Pruning them is a creative, editorial act. It is also a necessary act. The final product will represent truth--"nothing but the truth"--but certainly not "the whole truth." The point of writing about a 12th-century Parisian poet--much less one's 19th-century ancestor--is not to clone the entire objective reality of the subject before our eyes. The point is to communicate something interesting that is also true. A total retelling is impossible--hence the notion of "writing history."
My evolution
I have wrestled with this over time. When I was 12, I received a family tree from a great-aunt. It was filled with names, dates, and a few biographical bullet points. I interviewed my great-aunt and documented the same information again in narrative form. I found the stories fascinating and accepted them at face value. Romantic?
Over the next several years, as I verified claims against archival sources, I found that some dates and many biographical tidbits were completely wrong or, at least, a little mixed up. Some things simply can't be proved or disproved--best not to transmit these things, as an elderly Mormon lady warned me. My takeaway: discard the "wrong" information that I had received and build an updated, "accurate" 2.0 version that would be my launchpad henceforth. I suppose this was my "knowledge" phase.
About 8 years ago, I found some remarkable sources (thanks to Google Books and OCR technology) that definitively confirmed and clarified an obscure family legend. I dove into my boxes of notebooks and was relieved to find that I hadn't actually junked my initial notes. At that moment, I could see precisely how this particular anecdote had evolved, devolved (mutated?) over the course of five generations of transmission.
It was a rare, exhilarating, defining, and illuminating moment, and the lesson was clear: save everything. Even "wrong" information becomes meaningful information when people accept it, believe it, and share it. The story of the botched story itself became a family story. Fact lovers cringe, but a true account can be built with generous helpings of "fake news." (Incidentally, this essentially summarizes 2,000 years of Christian liturgical history--no hyperbole: I'd go to bat defending this claim...lots of "telephone"!) This messy, dynamic process has become my favorite aspect of any historiography, genealogy included.
Which are you?
Do you love retelling beloved legends, unhindered by factual detail?
Do you chronicle the facts and stockpile data?
Do you use the research to make a point for today or tomorrow?
Do you self-consciously view the process itself as a product of sorts--sharing whatever you find, be it good, bad, ugly, or fake?
I suspect most flesh-and-blood humans are some hybrid of the above.


