Beyond the Suspension of Disbelief: James Getty and the Power of the First-Person Interpreter

Last week I gave a talk to Museum Studies students at my school about the perceived and actual differences between academic and public historians as a way of introducing my course in the spring, The Practice of Public History.  During my talk, to a fantastic group, I covered a wide range of topics and ideas, partly driven by my impressions of the implications of recent reports that revealed history-related institutions make up more than half of all museums, yet account for less than a fifth of the total financial resources devoted to cultural and heritage organizations.  I also talked about the importance of evaluating programs and establishing a clear definition of success, which included a discussion of fitting different modes of interpretation (first-person costumed, third-person, tech-driven, etc.) to one’s ostensible purpose.  That’s because when done well, first-person interpretation, complete with buckled shoes and frilly shirts, can be a powerful tool of public history, reaching, and shaping the understanding of, more people than might ever read the books written by Bernard Bailyn, Gordon Wood, Pauline Maier, and Jack Greene (my own mentor) combined.  Conversely, when done poorly, when the facts aren’t straight and the persona is off, it can do enormous damage to an observer’s historical consciousness and invite further dismissal of the tactic, even of the entire field, by “proper” historians, whether they lecture in an august classroom or muse in a comfortable armchair.


The example I used of first-person interpretation at its best was Bill Barker, who portrays one of the more difficult–because tremendously complicated–historical figures, Thomas Jefferson, at Colonial Williamsburg, Monticello, and elsewhere.  As I know quite well, having once been his neighbor and chief annoyance at CW (which almost resulted in the burning down of an historic structure, but that’s a different story), Bill is not only an expert on his subject at a level equal to, or even greater than, that of many Jefferson historians of my acquaintance; he also knows what that ill-defined, undifferentiated mass we call the “public” expects from him as Jefferson, complexities and all.  And he does a tremendously good job of bridging the exceptionally problematic gap between Jeffersonian fiction and fact, which is saying something, because there is quite a bit of the former and nowhere near enough of the latter when it comes to the tall Virginia redhead.  Bill knows his craft and the impact it has on his audience, whether they have met him for the first or hundredth time, and he takes that responsibility with the utmost seriousness.  That’s because he is fully aware of the trust he assumes whenever he takes on the persona of Jefferson; he has the power to inspire and engage, but along with that comes the potential to disappoint and derail.

I wasn’t thinking of Bill, though, when I actually got up to talk last week.  I was thinking about another interpreter of an American president, one perhaps not as well-traveled as Bill, but one just as professional and every bit as influential, and who I, sitting there in a Harvard lecture hall, had just been informed by my niece had passed away at the age of 83: James Getty.  For almost 40 years, Getty portrayed Abraham Lincoln in Gettysburg, Pennsylvania, for much of that time at his studio on Steinwehr Avenue, “A. Lincoln’s Place.”  And it was there, on the evening of September 27, 1978, that an eight-year-old from Baltimore, who before then had spent more time on horseback than in a library, became an historian.  To avoid the risk of turning this post into a maudlin personal memoir (“too late,” you might say), I’ll just relate that my parents and grandparents, wanting to nurture my nascent interest in the past, arranged a “private audience” for me with Getty as Mr. Lincoln.  And for a time, I was enthralled by his kindness and his knowledge.  For an hour or so, 1978 turned into 1863 and there was no one else in the world but me and the 16th President of the United States, with my family playing the role of an indulgent audience to what must have seemed an odd discussion between a blond-haired, blue-eyed boy in a striped shirt and a tall man with a black beard and a stovepipe hat.  I was never the same after that evening.  I devoured works on Lincoln and the Civil War, returned countless times to Gettysburg and other battlefields, and then moved on to study the American Revolution and the ideas behind what happened “Four score and seven years” before my ostensible meeting with Mr. Lincoln.
I also continued to carry with me the tremendous gift that he had given me, of being able to suspend one’s disbelief to become, even briefly, a time traveler, and thereby, as an historian, better grasp the perspective of people I’ll never really know but can attempt to understand on their own terms, as people in their own world, one much different, in many but not all ways, from our own.

And that, in a nutshell, is the power of the first-person interpreter, and also the responsibility that accompanies it should a public history group, program developer, or individual interpreter choose to take on such a role.  The clothes do not make the historical man or woman.  As I became more educated about Lincoln, I became more impressed over time with what I remembered about James Getty’s portrayal and how much he got right.  I also recognized what I do not remember: Whether his boots were the right cut or his buttons the right size.  More important, and just as Bill Barker accomplishes as Jefferson every day, James Getty created an authentic historical moment as Lincoln, even for one otherwise unremarkable little boy.

In that sense, costumed portrayals, whether as a famous or forgotten person of the past, come with a high cost in terms of education and training about both the subject and, critically, the ethics of the practice.  I personally tutored CW’s James Madison (the splendid Bryan Austin) for several grueling months before we let him interact with guests, because we knew what could happen when the first-person experience goes awry, turning off guests or, worse, miseducating them about the past.  And that is perhaps more important today, when candidates and pontificators increasingly attempt to use history as a tool to further their own political or ideological causes.  The people, therefore, need to be forearmed with their own knowledge about the past, if for no other reason than to use it as a shield against those who would attempt to take advantage of ignorance to influence behavior.

And, on a much more personal note, had not James Getty accepted his calling with such alacrity and integrity, Harvard Yard would be home to one fewer historian today, which would be a shame, indeed.

Blurring Rockefeller’s Vision: Is Colonial Williamsburg Risking a Future That Can Learn From the Past?

This was going to be a quick post comparing two historic sites in Keene, New Hampshire: an important building that has been turned into a Mexican restaurant (the shrimp tacos were terrific) and a well-funded, well-interpreted, and entirely forgettable building just a few blocks away that could tomorrow disappear without a trace and no one would notice.  Moreover, I strenuously try to avoid writing about Colonial Williamsburg (CW) given my intimate association with the place and ongoing relationships with the people who were and are there.  But the CW that I remember, and to which I owe so much, seems increasingly to be a part of the history of heritage tourism, so much is the place changing under its new leadership.  Whether those changes are for good or ill is impossible for me to say.  As Jefferson might put it, those matters might be best left for “the mists of time” to reveal.  I mean, one can roll one’s eyes at the introduction of pirate ghosts in new Halloween programming as wrongheaded, since Blackbeard’s pirates had little to do with Williamsburg and the city’s colonial denizens did not recognize October 31 as anything but the night before November 1 (and the nearby Mariners’ Museum, where it’s mission-appropriate, already does the pirate thing with panache).  But if the details of history are to be shaved a bit to boost short-term ticket sales, then that’s an interpretive choice with which one can politely disagree.  In any case, there’s no avoiding the fact that everything CW does now is worth watching by public and academic historians (I still don’t get the difference between the two) because, like it or not, it has become an object lesson in how to remake, or reimagine, a financially struggling historic site.  And a massive one at that.  So under a gargantuan microscope it goes.

The most important changes, in my opinion, are not really those involving the costumed interpretations on the streets or in the buildings.  As my compatriots here in Boston have learned, I have strong opinions on such things but, in the end, guests will vote with their credit and debit cards on them, regardless of how much folks like me grumble about their content, and then scholars can glean lessons from them.  It’s the decisions that are not so visible and have a greater long-term potential impact that beg for more targeted comment.  I recently wrote about CW’s decision to end its 72-year relationship with the Omohundro Institute of Early American History and Culture, a place fairly described by one of America’s finest historians as “one of the crown jewels of the American historical profession” — news that still baffles me and my colleagues in that profession.  And this morning, in a much less straightforward move, word arrived in my e-mail inbox that, if true, might deal yet another blow to CW’s mission “That the Future May Learn from the Past”: The potential departure of the Vice President in charge of the division of Productions, Publications and Learning Ventures (PPLV), the rumored end to future Electronic Field Trips, and a possible directive that PPLV–the fine folks behind CW’s most valuable education programs–wrap up all existing projects.  Merely the thought of all that leaves one with scarcely a word to say.  Well, almost.

In 1930, the Rev. W. A. R. Goodwin, the leading architect of Williamsburg’s restoration, reflected on the purpose of the place and the reasons for spending so much of John D. Rockefeller, Jr.’s money on it during the Great Depression.  Rockefeller, of course, had long been aware of the importance of preserving America’s past.  One could spend an entire summer visiting wonderful places across the northeast, such as Washington Irving’s Sunnyside, that would have disappeared long ago had it not been for Rockefeller largesse.  But in Williamsburg, the team of Goodwin and Rockefeller had something more active in mind — not preservation, but restoration, and not just the restoration of an entire colonial capital, but that of a state of mind.  Yes, that vision was infused with an almost nauseating, to modern sensibilities, notion of American exceptionalism, and flavored with hefty doses of hero worship for the “great men” of the founding era.  It was intended to infuse guests with a new understanding of patriotism and “a deeper love for their native land.”  But its scope proved capacious enough to leave room for later CW leaders to restore, along with the buildings, the experience of those mostly left out of the historical record, people who also built the American nation, women and men, free and enslaved, as part of a fascinating trajectory that one could map along with changes in American society.

Either way, the core of Rockefeller’s original goal was education and, through it, inspiration.  The team of Rockefeller and Goodwin looked to a future in which “Modern windows will open on vistas stretching through the distance into the past.”  And those vistas were opened and expanded as technology advanced, whether to the academic community through the William & Mary Quarterly, to guests through tours and programs based on firm scholarship, to schoolchildren through the terrific Electronic Field Trips, or, most recently, to online visitors through the innovations of CW’s Digital History Center, which reconstructed the Revolutionary City of 1776, allowing them to “walk” through a nifty Virtual Williamsburg, based on the latest architectural, archaeological, and historical research, without leaving their homes.  Generations of CW leaders and interpreters held on to that original vision of inspiration through education and have allowed ever more expansive audiences to stretch “through the distance into the past” and connect that past to their present.

But, seen in a certain light, that commitment could seem at risk.  The spin from CW will likely be that any moves involving education programs are part of a major and much-needed reorganization, which would be wrapped around just enough truth to be excusable.  And they could add that their commitment to education remains strong.  After all, haven’t they just committed to new online learning programs for teachers?  Furthermore, EFTs were expensive productions, and ever-tightening budgets meant that fewer public schools could afford to participate in the program.  In short, EFTs won Emmy awards but lost money, and with belts tightening across CW, like it or not, tough decisions have to be made about what programs can be kept and what can be sacrificed.

The main concern, though, is not the reality but the perception of such changes, coming on the heels of decisions like the separation from the Omohundro Institute.  They raise legitimate questions of whether CW, or any similarly situated institution, has lost its way in favor of improving its bottom line as quickly as possible.  It’s just as likely, as a good friend of mine at CW nicely and accurately put it this morning, that the boatloads of intellect and imagination possessed by the people at PPLV can be harnessed to help guide whatever reorganization results, and leave them with a streamlined, but much more effective, mission.  That’s a worthwhile set of questions for any, even every, historic site.  I consistently argue to sites and the people who run them that fiscal stability, organizational effectiveness, community engagement, and a commitment to a relevant, clearly identified mission are not mutually exclusive.  Or at least they don’t have to be.

In any case, the issues swirling around Colonial Williamsburg could make it the highest rated reality show in public history, something the producers at the Travel Channel should scramble for, given where it is on a new story arc and the plot lines that are developing in front of observers.  Let’s just hope the resolution isn’t that of a declension narrative, especially for Rockefeller’s vision: Inspiration of the present through education about the past strikes me as a pretty good reason to get up in the morning, every morning.

What Colonial Williamsburg’s Charity Navigator Downgrade to 2 Stars Does–and Doesn’t–Mean

Non-profit evaluators such as Charity Navigator and GuideStar provide a terrific service to donors and public historians alike.  Using various metrics, they cut through the weeds of the annual tonnage of reporting documents for places such as historic sites–mainly the all-important IRS Form 990–and evaluate them, using stars or other ratings, in ways that certainly seem clear and unambiguous.  And those evaluations should also be instructive to the leaders of such sites and the people they regularly ask to invest in them.  It’s one-stop shopping for anyone interested in checking out the health of an organization, particularly one that meets the income threshold of $1 million, and provides a critical public window into the inner workings of a site.  So consider Charity Navigator, my evaluator of choice, as a sort of CNBC or Wall Street Journal of the business of public history — they read the numbers so that you don’t have to.

This comes to mind because Charity Navigator just downgraded Colonial Williamsburg (CW), the largest public history site in the world, and the media’s favorite punching bag for anything related to those who make their living from dressing in historical clothing, from three stars to two (out of four).  It makes for a somewhat easy story to tell because it fits into the broadly accepted narrative that CW is in a kind of free fall into an unknown future, just as somewhat similar places seem to have tripped off a financial cliff into the abyss of wedding rentals and ghost tours.  But what has always set CW apart from other apparent cognates has been its substantial endowment, its commitment to its mission, and its willingness to embrace its role as the leader in the public history and museum studies fields, with all the slings and arrows, and tremendous opportunities, that attend it.  Its early and longstanding partnership with the College of William & Mary to create what’s now the Omohundro Institute of Early American History and Culture, which publishes the journal of record for the field of early American history, kept CW on the vanguard of academic history, just as its keen attention to material culture and architecture made it and its talented staff the beacon of expertise for museums and curators everywhere.  Like it or not, what CW did, and how it did it, mattered, especially as a bellwether of historical studies and heritage tourism, which, in my opinion, holds the single largest untapped economic potential in the entire tourism sector.

Consequently, the downgrade to two stars has generated not a little mumbling within the field that CW has finally begun an inexorable march towards its nadir–and is getting a sort of deserved comeuppance after decades of, to some observers, arrogant expansion into hospitality and investment in non-traditional innovative programming, such as the “Actor Interpreter”-driven immersive experience Revolutionary City.  That was part of a subtle but important shift in its mission “That the Future May Learn From the Past” and the core of its interpretation, from a vague colonial period, forever on the verge of the American Revolution, to “A Center for History and Citizenship” that jumped right into the founding era and a focus on the American Revolution itself and, most importantly, the ideas it continues to represent.  Out went the virtually unknown Robert Carter Nicholas and in came 25-year-old James Madison.  And now, as the members of the team that ushered in those changes have either departed willingly or more recently been ushered out, and with relatively new, untested leadership in place, with little to no experience in the field, the downgrade reflects a perception: CW is finally on its way out.  Charity Navigator says so.

Here’s the problem: That’s not what the downgrade actually means.  First, keep in mind that it’s an evaluation of the latest available year’s reporting documents, which aren’t usually filed until well into the following year, so it reflects a sort of financial vapor trail of 2013, rather than a snapshot of what’s going on in 2015.  Second, CW has been in this position before.  Lots of times.  Although it hasn’t had a four-star rating since 2003, CW has wavered between two and three stars ever since, earning a two-star rating seven times in the last 12 years.  So the lower rating itself isn’t news.

But here’s the bigger problem: What the numbers actually reveal.  You know, I’ve pounded the importance of deeply reading 990s into every Board of Trustees, public history class, and museum director that I’ve ever been asked to counsel, like Ted Kennedy trying to move a decent health care bill through the Senate.  And that means reading between the lines of existing filings, comparing them to similarly situated institutions (which one should define as strictly as possible — CW is not Busch Gardens), and looking at them in historical context with past reports.  That’s where a southern friend of mine would say that things at CW get “hinky.”  In the past, CW’s ratings have been dragged down by steadily decreasing financial numbers.  That doesn’t exactly mean pure revenue, because those numbers can be fudged, as one can see by looking closely enough.  Many institutions try to mask a high burn rate (the ratio of actual spending to budgeted amounts over the course of a fiscal year, usually reviewed monthly by an organization’s Finance Committee), and I’ve seen more than a few give it a shot by drawing down on the principal of an endowment or, more positively, by simply benefiting from an improved market, which shows up on a different 990 line (one site that I counseled even kept several million dollars in phantom collections assets on the books until I, um, strenuously explained the error).  Either way, CW’s financial rating is the lowest in its history, at 73.22 (its previous low was 76.33 in 2006).  Looking at the raw data that drives the ratings makes matters worse, as Program Revenues at CW were, for 2013, at an all-time low of $38,180,000; the overall revenue number was propped up by a slight uptick in donations and a substantial boost from investment income.  So the rating actually could have been much, much worse.

But what does all that mean for CW today and, more importantly, for the health of heritage tourism and the livelihoods that depend on it?  One good friend of mine, who has 30-plus years of experience with CW, the Omohundro Institute, and academic history in Virginia and New England (he lives near me in Boston now), thinks that CW’s decline is representative of a broader shift, that the era of such historic sites everywhere, especially places like CW, Old Sturbridge Village, and Historic Deerfield, with such high overheads, is, frankly, over.  Post-World War II tourism has changed and segmented; heritage tourists aren’t interested in what are increasingly seen, especially when programming departs from missions, as precisely what they had long been, incorrectly, criticized and ridiculed for being: Historical theme parks, with a costumed historical figure bearing no functional difference from a sweltering intern dressed up as a cartoon character, and the cringe-worthy “Try to Nail the British Soldier Cut-Out with a Rubber ‘Tommyhawk'” more or less the same as a game of “Whack-a-Mole.”  To my friend, such sites will only survive if they base their futures on their own pasts, as places that represent a more recent, nostalgic history.  After all, what one sees at CW–as it stands now–never actually looked like it does at any point in its pre-restoration history.  Its buildings, landscape, and collections represent a conglomeration of different times and tastes from across the Chesapeake, a trend that has continued with the influence of particular major donors.  The “year in history” approach in the era of colonial interpretation, in which CW took a calendar year and portrayed it for guests in real time (which I thought was pretty nifty, and kept the visitor experience fresh), and then the Revolutionary City master narrative, attempted to connect the disparate elements into a coherent guest experience, with varying degrees of success.
For example, in 1775, almost every house was painted white, the main thoroughfare was an often muddy, dirt road, and what’s been recreated as “Palace Green” was actually a wide (and, yes, muddy) boulevard.  But hardly anyone would pay for that sort of authentic experience.

Those are not criticisms but facts, so, as my friend suggests, CW and other sites might have a future by embracing that character, and the power of the memories of the people who positively experienced it in their own pasts, rather than attempting to compete for a kind of tourist–one purely out for recreation, rather than heritage–that evidence strongly suggests it will never sufficiently draw.  Smaller, and therefore potentially more nimble, historic sites that can collaborate with other sites and experiment with audiences and programming might actually be better situated to take advantage of the modern possibilities presented by heritage tourism.

I confess a great deal of sympathy for that perspective, even if I don’t yet wholly embrace it (except for the point about smaller sites, with which I’m totally on board).  CW’s current leadership is, I strongly suspect, incisive enough to recognize that its program revenue decline is precipitous and might well be permanent.  Without a drastic reorganization that includes the shedding of almost all the hotels and restaurants (the Inn and Lodge are both splendid, and an evening in one of the Colonial Houses can be almost magical), which are a major drag on the CW budget given the tight connection between the for-profit and non-profit sides of what we collectively think of as CW, that ship cannot even begin to be righted.  But the CW brand remains strong, with quality interpreters, and the potential for generating revenue while shaping the historical understanding of a new generation of Americans in a mission-appropriate way has not yet been sacrificed (the Historic Trades, for example, remains a shining gem in CW’s interpretive crown, so to speak).  However, recent efforts to increase visitation while cutting costs on the non-profit side are not just worrying, they’re alarming, so distinctly do they smack of the sort of short-term, monthly profit-loss report decision-making that comes from, it must be said, inexperience in the field, and that will doom a public history site of any size.  A proper historical foundation for its programming, for example, appears to have gone straight out the window.
A new gecko-style “mascot” for CW–a breed of dog that George Washington did not own until well after the Revolution and one that was never, in fact, in Williamsburg (Washington was no Charles Lee, with his foxhounds endlessly trailing after him through the Governor’s Palace in 1776)–is nothing when compared with the flabbergasting message sent to the academic community by the July 1, 2015, announcement that CW has ended its partnership with the College of William & Mary and its support for the Omohundro Institute.  Just as the creation of the Institute firmly established CW’s commitment to historical integrity, its severance declared that era to be over.  And once historical integrity is lost by an institution ostensibly based on it, then all else might be lost as well.

I don’t mean to offer this post as a eulogy to a place and group of people for whom I maintain considerable fondness.  But recent programming and other decisions are not suggestive of a sustainable, mission-oriented future on which donors can rely in terms of a sound return on their investment in public history and civic education.  Consequently, the Charity Navigator downgrade of CW does not sound the death knell of a storied and cherished institution, but the numbers behind it are certainly, as Thomas Jefferson might say, a “fire bell in the night.”

Beyond Obama’s Call for “An Honest Accounting”: A Warning from John F. Kennedy and Arthur Schlesinger, Jr. of the Danger of Historical Myths

America’s historical mythology is much in the news these days, causing even President Obama to call for “an honest accounting of America’s history” as a means of healing painful and persistent wounds.  But our flawed national narrative did not begin in 1861, or even in 1787.  It goes back even farther than that, to the very beginning, to 1775 and 1776.  I was reminded of that, the intellectual basis of much of my own work on the American Revolution, when I recently visited the John F. Kennedy Presidential Library & Museum.  The beginning of the introductory film to the museum set a noble and important challenge for historians and citizens alike.  With audio drawn from a speech delivered by President Kennedy at Yale University’s commencement ceremony on June 11, 1962, which was drafted by the distinguished historian Arthur Schlesinger, Jr., a member of the White House staff, it humorously played with the rivalry between Yale and Harvard before launching into the much more serious business of charging Yale graduates that day to disenthrall themselves from the powerful myths of our past, so that they could confront the present with a new sense of reality and, therefore, a stronger idea of our shared strength, because we have honestly acknowledged our common weaknesses.

For many reasons, such as the terrible recent events that have dominated our collective consciousness, and because in my academic work I strive, foremost, to disentangle myth from reality in the moment of America’s founding — and, perhaps, also because, 50 years later, I have followed Schlesinger as a Fellow in the History Department at Harvard, and therefore feel something of an obligation to his own memory as a historian who keenly understood our professional commitment to public service — these words struck me and remain both a powerful reminder and a stern warning.

Too often we hold fast to the clichés of our forebears. We subject all facts to a prefabricated set of interpretations. We enjoy the comfort of opinion without the discomfort of thought. Mythology distracts us everywhere. For the great enemy of the truth is very often not the lie: deliberate, contrived, and dishonest. But the myth: persistent, persuasive, and unrealistic.

Many thanks to the John F. Kennedy Library & Museum for access to the original draft and notes to this speech.

“Soften This Business”: New York and the Loyalist Legacy at Waterloo

That the loyalists of the American Revolution helped shape the British Empire after independence is no secret.  In particular, Maya Jasanoff highlighted the importance of the “loyalist diaspora” to the development of Canada and other parts of the British world in her terrific Liberty’s Exiles: American Loyalists in the Revolutionary World (2011).  There were also those figures whom I call “legacies” of the Revolutionary loyalists, the sons and daughters of those who, but for their loyalty to the British constitution, would have finished their days in New York or Virginia or Massachusetts and therefore never have had the opportunity to act on the stage of greater British history.  As we today remember the Battle of Waterloo, the ultimate conflict of the Napoleonic Wars, an era much more transformative and costly to Britain than anything that happened on this side of the Atlantic, and much more important to early American history than is generally considered, my thoughts inevitably turn to those loyalist legacies who, if one looks closely enough, can be found almost anywhere.  And, true to form, one need look no further than the entourage of Wellington himself 200 years ago today for a direct connection between America–in this case, New York–and a battlefield in Belgium.

Fairly well known is Sir William Howe DeLancey, who, as Wellington’s chief of staff, was the senior British casualty of the battle.  Less well known is the officer who was by his side that day, and attended DeLancey when he was first struck by enemy fire: his first cousin, DeLancey Barclay.  Like DeLancey, who was born in New York City in 1778, Barclay was born in British New York to loyalist parents, on 16 June 1780 at Hell’s Gate, Long Island.  His father, Thomas Henry Barclay, was the brother of Cornelia DeLancey, William Howe DeLancey’s mother.  A former lawyer who had studied with John Jay and was related to him by marriage, Thomas Henry was then an officer in the Loyal American Regiment, commanded by his brother-in-law, the erstwhile Virginian Beverley Robinson.  Barclay, it seems, was related to a Who’s Who of American revolutionary notables, from Jay to Robert Livingston to Lord Stirling to Benjamin Franklin.  But as a result of his father’s loyalty, DeLancey Barclay started off life in British-occupied New York City and then, after the British evacuation in 1783, lived in Annapolis, Nova Scotia, before possibly returning to New York City when his father was appointed British consul-general for the Eastern United States in 1799.  Not long after, however, on 11 January 1800, DeLancey Barclay followed his cousin, by then Sir William, into the British army.

For the next 15 years, Barclay steadily moved up the ranks from Cornet in the 17th Dragoons to, in 1814, Captain and Lieutenant-Colonel of the 1st Foot Guards.  On the day of the battle with Napoleon, 18 June 1815, New York must have seemed a distant memory–if a memory there was at all–to the American cousins riding behind Wellington.  According to the contemporary account of Sir William’s widow, Lady Magdalen Hall DeLancey, the battle “began about eleven; near three, when Sir William was riding beside the Duke [of Wellington], a cannon ball struck him on the back, at the right shoulder, and knocked him off his horse to several yards distance.”  The Duke, having, of course, other things to attend to (like saving the free world), paused to bid goodbye to his friend and then rode on to attend to the fighting.  But Sir William was not left alone.  His cousin, Colonel DeLancey Barclay:

“…who had seen him fall, went to him instantly, and tried to prevail upon him to be removed to the rear, as he was in imminent danger of being crushed by the artillery, which was fast approaching the spot; and also there was danger of his falling into the hands of the enemy. He entreated to be left on the ground, and said it was impossible he could live; that they might be of more use to others, and he only begged to remain on the field. But as he spoke with ease, and Colonel Barclay saw that the ball had not entered, he insisted on moving him, and he took the opinion of a surgeon, who thought he might live, and got some soldiers to carry him in a blanket to a barn at the side of the road, a little to the rear.  The wound was dressed, and then Colonel Barclay had to return to the Division; but first he gave orders to have Sir William moved to the village; for that barn was in danger of being taken possession of by the enemy. Before Colonel Barclay went, Sir William begged him to come quite close to him, and continued to give him messages for me. Nothing else seemed to occupy his mind. He desired him to write to me at Antwerp; to say everything kind, and to endeavour to soften this business, and to break it to me as gently as he could. He then said he might move him, as if he fancied it was to be his last effort. He was carried to the village of Waterloo, and left in a cottage….”

Sir William died a week later, attended by his wife to the end.  The Episcopal Church of All Saints, Waterloo, now stands roughly on the site of the cottage to which DeLancey Barclay had him moved.  He was buried in Belgium and now lies with the other casualties of the battle.

But what of his cousin?  Barclay survived the battle and rose in his profession, as well as in his country’s esteem.  He married, purchased a country house in southern England, became aide-de-camp to the Duke of York, and was brevetted Colonel of the 1st Foot Guards; upon the accession of King George IV, he was made a Companion of the Bath and aide-de-camp to the British monarch.  Barclay would not, however, enjoy such lofty emoluments for long, as he died in England in 1826.  His father, then still living in New York as a British diplomat, wrote upon hearing the news:

“On the 29 March, 1826, departed this life our beloved son DeLancey Barclay, after an illness of three days.  In addition to his being an amiable, correct man and a dutiful, affectionate son and husband, he was one of the most promising and rapidly advancing men in the British Army. … His death was universally lamented. To his aged parents and friends his loss is irreparable.”

That loss was felt in many places in New York, including a small hamlet on the Hudson River — Saugerties — where DeLancey’s brother, Henry Barclay, helped build a community, quite literally, in the new United States.  The product of the same loyalist context as his brother, Henry chose an American life, grasping the opportunities offered by the industrial revolution to build mills, homes, and several churches, including Trinity Episcopal Church, for himself, his family, and friends, and St. Mary of the Snow Roman Catholic Church, for his mainly Irish immigrant workers.  He also helped establish the village itself and served as its first president — a rather different loyalist legacy than that of his brother, DeLancey, a hero of Waterloo.

Words, Words, Words: A Capital Tool to Learn and Use Colonial American Language

When I usually take to this blog, at least of late, it’s either to share an interesting historical tidbit or cheerlead for the Save Sweet Briar Campaign.  I do, after all, have a book to write.  Today’s post is therefore a first for me: A book endorsement.  I can hardly call it a review, as I’m in no way dispassionate or nonpartisan about the work, so enthusiastically do I endorse Dr. Joan Bines’ Words They Lived By: Colonial New England Speech, Then and Now (2013).  A fellow University of Virginia alumna, Dr. Bines has been deep in the trenches of public history for some time as the director of a terrific and important site in Weston, Massachusetts — The Golden Ball Tavern, an 18th-century inn that is the only place I know of that tells the story of the loyalists in the American Revolution and tells it well.

With her experience as an educator of students of all ages, her infectious love of language, and a keen talent for concise, even charming, description, Dr. Bines has provided a clear answer to a question with which I wrestled when I was chief historian at Colonial Williamsburg, dealing with a legion of first-person interpreters and other guides: What did men and women sound like in the 18th century?  Thankfully, there is no shortage of literary evidence from the period, but there is a somewhat frustrating paucity of sources that tell us much of anything about common speech in colonial America.  Sure, we can pull out a play by Robert Munford or a sermon by George Whitefield and refer to their vocabulary and sentence structure, but they cannot be considered representative by any stretch of the imagination.  I encouraged CW’s interpreters to read what I encourage my students to read — as many 18th-century Anglo-American sources as possible, especially the sort that were geared towards broader audiences, such as newspapers, novels, and pamphlets.  But I always hoped for a secondary source, written informatively and engagingly and with proper scholarly apparatus (e.g., accessible footnotes!), that could provide interpreters, guides, and, frankly, anyone interested in the daily lives of colonial Americans, with a firm foundation on which to build their understanding of the men and women of that time.

So imagine my surprise and pleasure in happening upon Dr. Bines’ book on my recent return to New England, which accomplishes all that I wished for CW’s interpreters and guests (the ultimate beneficiaries).  Do not let the title fool you: Although it nominally focuses on New England, and its primary sources are mainly derived from that place and its people, it is tremendously useful regardless of one’s region of interest, so nicely does it explore and explain basic assumptions of colonial American speech, including syntax, vocabulary, and the slipperiness of idiom.  As I implied above, it is based on the right sources and limited to a manageable chronological period so as to be reliably representative.  Of course, it does not reflect the speech patterns of enslaved men and women — for that, we must keep on searching — but it also does not presume to do so.  It does, however, draw so many of its sources from women that gender is not really an issue.  Moreover, Dr. Bines has organized it quite well, in terms of its chapters, aptly illustrated it with her own photography, and included a quite helpful bibliography and index.

Is the book the be-all and end-all of the subject?  Of course not.  But it doesn’t pretend to be.  Dr. Bines’ work is, however, the best foundation I’ve encountered upon which to build one’s practical understanding of colonial American speech.  To even approach a vernacular understanding of the period, I’d combine it with Mary Miley Theobald’s Death By Petticoat: American History Myths Debunked, published a few years ago by CW (or just follow her blog), and then build on that with a heavy dose of Nathan Bailey’s Dictionarium Britannicum (go with the 2nd edition of 1736, which was found in more colonial libraries than any other English dictionary, including Johnson’s) and a few servings of the first edition of the Encyclopedia Britannica.  But do start with Dr. Bines’ splendid book.  By doing so, I guarantee that you’ll not only learn much about the 18th-century world, but also a great deal about our own.

Board to Death: A Mild Reflection on the Quiet Threat to Cultural Institutions

In 1971, the cartoonist Walt Kelly, playing on Oliver Hazard Perry’s famous War of 1812 words, had his Pogo proclaim, “We have met the enemy and he is us.”  And so it goes for cultural institutions these days, such as historic sites and universities.  Governing boards, whether known as “Trustees” or “Directors” or “Visitors” or another name, are supposed to be the safeguards of the legacies of such places and the defenders of those who preserve and promote them.  But more often than not, in my experience, they do more harm than good, so poorly do board members understand their actual roles and responsibilities.  Consequently, presidents or senior directors of cultural resources often find themselves in conflict with uninformed, even uninterested, board members and spend more time managing them—in attempting to get them to work for, rather than against, the welfare of the institution—than almost anything else.  And that’s if the director happens to be someone who wants to do more than just keep his or her job, which generally means silently suffering through a problem board’s latest wrongheaded administrative, financial, interpretative, or [insert almost any other category here] whim, so long as paychecks continue to clear.

I’m not writing of matters of minor import.  Take a look at what’s happening right now at Sweet Briar College, for example.  Its board hired a president who, within a year, announced that the school would close at the end of the current term because of supposedly insurmountable financial difficulties.  Rarely has the ignorance, if not outright malevolence, of a board in failing to fulfill its proper charge been so clearly and publicly displayed, as the president could not have proceeded, or perhaps even been hired, without the board members acting as enablers-in-chief of the prospective closure.  So, just like that, with breathtaking temerity, a splendid college that has done a terrific job of educating young women in Virginia for more than a century was sentenced to death, seemingly without appeal.  Of course, such a myopic and dramatic step reveals one of the more pernicious characteristics of problem boards: The inherent belief, especially amongst long-term board members, that l’institution c’est moi.  The notion that they are the institution runs strong in such boards, allowing them to feed their egos as easily as they disregard their responsibilities and ignore the basic fact that the best of places, like Sweet Briar, do have the right of appeal in the form of other entrenched, committed, and powerful stakeholders, whether they be students, staff, faculty, alumni, donors, or members, who can and will take matters into their own hands.  The Save Sweet Briar campaign will, I trust, become a cautionary tale for problem boards everywhere. (Note: I’ve financially contributed to #SaveSweetBriar and encourage anyone who cares about higher education for women to do so, too.  Click here to make a pledge.)

Lest you think that Sweet Briar’s case is exceptional, turn your attention to other examples, such as the University of Virginia’s board’s failed attempt to oust president Theresa Sullivan several years ago or, much more recently, to upstate New York, where another nifty little school, the College of St. Rose in Albany, is suffering through a death by at least 40, if not 1000, cuts — different in degree from Sweet Briar’s troubles, perhaps, but not in kind.  Just this week, a college president in the job for less than a year announced a major retrenchment because of the institution’s financial difficulties.  Sound familiar?  At St. Rose, 40 jobs are to be eliminated and health care coverage sliced for all employees.  The president claims that the steps, taken with the board’s nemine contradicente support, are necessary for financial reasons caused by the fact that, in a striking admission, “we have just not been attentive.”  Um, to employ the punchline of an old joke about the Lone Ranger, “What do you mean we, Kemosabe?”  If we take someone who has been on the job only 11 months at her word, that means that the board, until recently, remained entirely ignorant of the budgetary difficulties facing the school, and the new president was hired either without knowing the extent of the situation (a possibility) or with an express mandate to make cuts that a suddenly attentive board thinks are required (a probability).  It appears that, in attempting to compete in the crowded market of higher education, St. Rose quickly expanded (one presumes, also with the full support of the same board), but did so in a way that was not sustainable, necessitating the subsequent budget cuts.  Either way, as with Sweet Briar, the failure lies not with the administrator, but rests at the collective feet of a problem board, ignorant of its proper responsibilities.

The issue is not limited to colleges, of course.  Historical and other cultural organizations are hardly immune to the disease.  The famous example of the Barnes Foundation disaster should keep any cultural administrator on his or her toes.  Keep in mind that, for many current and prospective board members, being on such boards carries with it social cachet in certain circles and is therefore coveted for that very reason.  Forget that good board governance actually requires members who understand the real work that a board exists to do, from fundraising (including writing their own checks) to strategic planning, and a board that contains more than a handful of people willing to approach the commitment in a thoughtful and productive way.  Those board members who see their position as valuable primarily for social and professional networking advantage represent, sadly, many of those of my acquaintance at some of the most — and more than a few of the least — prominent cultural institutions in the country.  They show up to quarterly meetings, are invariably treated like visiting royalty, and just as invariably kept as far away from the reality of the institution as they care to be, while often fed carefully crafted presentations of the institution’s situation, along with the canapés and Chardonnay.  Rarely do they truly engage with the front-line staff.  I have been part of more than one cultural organization, in fact, the boards of which had never even met the senior staff, let alone those carrying out the organization’s daily mission with guests or students.  The result is a sort of latchkey institution, left alone, without oversight or guidance from those who are legally responsible for maintaining its legacy.  And we wonder why they struggle, and struggle, and struggle, and then fail?
There are exceptions to this rule, of course, such as the teams behind James Madison’s Montpelier, the Valentine, the International Tennis Hall of Fame, the Newport Restoration Foundation, and the White House Historical Association, and also among individuals, who inevitably bear the greatest burden of board committee work, but such exceptions seem only to prove the rule.

The main point I want to make in this not-so-thinly veiled rant is for cultural administrators of any sort, from public historians to museum directors to faculty members: Beware of the board. And get to know all you can about the ins and outs of good board governance.  If you are in a position to evaluate a cultural institution, start with the very board to which you do or might report.  How informed are its members about their roles?  How is the board organized?  What is its members’ understanding of the institution’s mission statement and strategic direction?  How much do they know about the organization’s financial position?  And, almost above all, are they willing to change if change is demonstrably needed?  Then ask yourself whether it is therefore a problem board, one that will do more harm than good in the long run, unless you first go through a rigorous process of board education (which is no fun, trust me) or, in more drastic but very real circumstances, you look at having it completely dissolved and then reconstituted along more practical and responsible lines.

In the end, as stories about failing institutions zip through the media, and senior administrators and program directors come under censure for their individual shortcomings, look closely at what you don’t see, because the truly responsible parties hide in plain sight.  Look at the people who actually hold the responsibility for protecting an institution’s best interests—which usually means doing whatever it takes to keep its legacy alive.  Look at the boards, for what your organization does not know about them could kill it.

“Now or Never Our Deliverance Must Come”: George Washington and the Power of Contingency

“Contingency” is a historical concept easy enough to understand but among the toughest to convey, whether to students in a classroom or guests at a historic site.  Essentially it means that things in the past did not have to turn out the way that they did.  Just one different choice could have set our entire historical experience in a different direction.  Put another way, as Stephen Jay Gould suggested in a related context, if we rewound the tape of history back to a certain point and pressed “play,” would events have turned out the same?

Exploring such counterfactuals can be enormously useful–and intellectually challenging–as an educational tool.  For example, what if we rewound our notional historical tape back to this date in 1781?  What would we find?  Were you in New York, at the New Windsor Cantonment (which you can visit today), you would have found a rather somber 49-year-old Virginian named George Washington.  The year had not been kind to his Continental Army, and the War for Independence was looking far from won.  In fact, precisely the opposite was the case.  The American economy had collapsed; royal government had been restored in Georgia; and most of South Carolina, part of North Carolina, the Chesapeake Bay, and much of Virginia were already back under British authority (so confident of victory was Lord North’s ministry that Virginia’s former royal government, including Governor John Murray, Earl of Dunmore, was ordered back to the Chesapeake to resume its responsibilities).

And things were bleaker still on Washington’s immediate front.  The French commander, the Comte de Rochambeau, held such disdain for Washington’s troops that he wanted to avoid fighting alongside them at almost any cost.  Overall French patience with the conduct of the war, especially after they failed to wrest the Caribbean islands from British control, was rapidly disappearing.  As for Washington’s army, one might not blame Rochambeau for such a bleak view.  The mutiny of the entire Pennsylvania line, then the New Jersey line (which ended with executions), and then the threatened mutiny of Massachusetts’ sergeants all brought Washington’s own confidence to perhaps its lowest ebb.  That’s why on this date in 1781, you would have found him at his writing desk at New Windsor, worrying about the future in a letter to John Laurens.  “We are at the end of our tether,” Washington lamented, “and that now or never our deliverance must come.”

He could not know that seven months later he would be back in Virginia, watching the surrender of the second-largest British army in the field in a defeat of such magnitude that it would bring down the British government, end the war, and formally usher the United States of America onto the world stage.  But it is critical for historians–especially public historians–to be able to grasp and then convey the importance of such moments to others as a potent reminder that nothing was or, like the current threatened closure of Sweet Briar College, is inevitable.  Historians should leave to authors such as Jane Austen the power to determine that things happened “exactly at the time when it was quite natural that it should be so,” for in real life, then and now, what might later seem natural, at the time might appear quite extraordinary.  And those are powerful enough moments to dwell on.

Where’s Felicity?: What’s Not in a Name in Colonial Virginia

Every so often, a historian gets an intellectual itch that needs to be scratched, an irksome question that just won’t go away, so off to the side goes the current project until something of an answer might be found to temporarily satisfy the curiosity.  And so it happened to me yesterday, after writing a blog post about #SaveSweetBriar.  In it, I referenced Felicity Merriman, the principal character of one of the most beloved American Girl books, created by my friend, Valerie Tripp, about a young woman who lived in Virginia during the age of the American Revolution.  Having a keen scholarly interest in the way that young people develop passions for the past (mine started with The Mystery of the Old Musket), I’m especially curious about the role that Young Adult and related fiction plays in that process and have been rather surprised by the number of adult women, who are now themselves historians, interpreters, archivists, and at least one award-winning journalist (my wife), who can trace their interest directly to Valerie’s Felicity. (I’ll share a guilty secret and reveal that I own and have read all of the books, including “Felicity’s Mysteries,” and find them utterly charming.)

Except there is a problem.  As a historian of colonial Virginia, I could not recall having once come across anyone named Felicity.  Not one.  In fact, I couldn’t remember having ever seen any name like it in correspondence or ledgers or on a gravestone or anywhere else.  Given names like that are common enough in New England records and graveyards, deriving as they do, I believe, primarily from a sort of Puritanism fueled by hefty doses of John Bunyan’s The Pilgrim’s Progress, which resulted in such permanent reminders of the virtues that one should pursue to lead a good Christian life.  Consequently, names like Charity, Prudence, Mercy, and, yes, Felicity, can be found throughout 17th- and 18th-century New England.  But I could not think of one in colonial Virginia.

So, knowing that my loyalists will wait, I decided to take a look.  I just happen to have built up, over the last ten years, a sizable prosopographical database (using Zotero) that focuses on the men and women, free and enslaved, who lived in and around the Chesapeake between 1600 and about 1820 (that later chronological edge keeps slipping further and further into the 1800s).  Having already completed a broader study on male (free and enslaved) naming practices in the Chesapeake (to map birth legitimacy traits), I already had a model ready into which, in my search for Felicity, I could plug the female names and see what came out the other end, in a somewhat scientific analysis.

And I couldn’t find Felicity.  Or Charity.  Or Prudence.  In fact, the only, for lack of a better way to describe them, “New England” name in our entire household belongs to our dog, Mercy Otis.  Of 483 free females in my database who had anything to do with the Chesapeake in the 17th and 18th centuries, only one — Fortune Randolph (a first cousin of Thomas Jefferson’s) — had a given name even close, and she had to be excluded in the end because she actually lived in Bristol, England, and never saw America.  Interestingly enough, a full 45 percent of them were named Elizabeth (which is actually the name of Felicity’s best friend in the books and comes from a family of good loyalists), followed closely by Mary (33 percent).  The next closest was a version of Anne (including Ann and Anna) at 12 percent.  The remainder comprises a wide variety of Susannahs, Lucys, Sarahs, Janes, etc.
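For readers curious about the mechanics, the tally behind such percentages can be sketched in a few lines of Python.  The list below is a stand-in for a database export: the counts are reverse-engineered from the percentages in this post, and the variant-folding map is my own illustration, not the actual structure of my Zotero records.

```python
from collections import Counter

# Stand-in for a flattened export of female given names from the database.
# Counts are reconstructed from the percentages cited above (483 total).
names = (
    ["Elizabeth"] * 217 + ["Mary"] * 159 + ["Anne"] * 58 +
    ["Lucy"] * 20 + ["Susannah"] * 15 + ["Sarah"] * 14
)

# Fold common spelling variants into one canonical form before counting,
# as was done with Ann/Anna/Anne in the analysis above.
variants = {"Ann": "Anne", "Anna": "Anne", "Susanna": "Susannah"}
canonical = [variants.get(n, n) for n in names]

counts = Counter(canonical)
total = len(canonical)

# Report the three most common names as percentages of the whole sample.
for name, count in counts.most_common(3):
    print(f"{name}: {count / total:.0%}")
# Prints:
#   Elizabeth: 45%
#   Mary: 33%
#   Anne: 12%
```

Nothing fancy, but it shows why the variant-folding step matters: without it, Ann, Anna, and Anne would each fall well down the list, and the third-place showing of the Annes would disappear.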

Of course, the next question occurring to any historian is “why?”  Why such a tremendous difference in naming practices and, honestly, why so many named Elizabeth and Mary (more than three-quarters of the total)?  The easy and quick answer is also the most interesting: it’s a fantastic illustration of the striking cultural differences between the different parts of colonial America.  Religion is the most obvious difference, as my sample consisted almost entirely of Anglicans, rather than Puritans or their Congregational successors.  There are other issues to go into–the use of diminutives, for example, such as Betsy and Molly, which seems to me to be a class matter–but they will have to wait.

As my loyalists are beckoning me, I’ll leave you with my findings, in descending order of occurrence.  Ladies and gentlemen, as Train would sing, meet (colonial) Virginia:

Elizabeth, Mary, Ann/Anne/Anna, Lucy, Martha, Susanna/Susannah, Jane, Sarah, Frances, Alice, Rebecca, Hannah, Maria, Margaret, Isabella, Charlotte, Dorothy/Dorothea, Ariana, Winifred, Judith, Catherine, Ursula, Priscilla, Ellen, Joanna, Christina, Agatha, Clara, Letitia, Edith, Amy, Eleanor, Lydia, and Pauline.       

Where There’s a Will, There’s No Way?: Saving Sweet Briar

Sweet Briar College has been recently likened to a setting from an American Girl book, which is true enough, I suppose.  It is, indeed, a beautiful and serene space, almost idyllic and out of time, one full of young women (and their horses) who, by trying to better themselves, hope to better their world.  So it’s more or less the school that Felicity Merriman, and her Penny, would attend in a heartbeat were they alive (and real) in 2015.  And that’s a terrific thing; Sweet Briar is a safe place where girls are allowed to become women, and prepared to meet a world more contentious than they deserve.  I can attest to that personally: as a former member of the University of Virginia faculty (and a UVa alumnus), I taught Sweet Briar students during summer sessions, and, as a historian and equestrian, I have been to the college a number of times and found the members of its community as bright, earnest, and engaging as any I’ve encountered at Brown or Harvard.  Not to slight Wellesley or Smith, both splendid places, but I think that Sweet Briar was made as much for the fathers of today’s Felicitys as it was for the modern Felicitys themselves.

That’s why, as someone quite familiar with the college, I was especially distressed to learn that, owing to “insurmountable financial challenges,” it is to be shuttered when the current term comes to a close, like an English country house at the end of a shooting season, but without the ceremony, one suspects.  Goodbye students, faculty, and staff (and equine residents)—the doors will be locked and keys likely turned over to lawyers and accountants come June by a vote of the Board of Directors.  To call the decision abrupt, however, is to muddy the picture.  Without a doubt, the school has been in serious financial trouble for some time.  Enrollment has dipped, and the terms of the Will that created the college, along with the state of its unrestricted endowment funds, do not lend its administrators much painless flexibility.  The writing was, as they say, on the wall when the Board chose expedience over courage in washing its collective hands of the place, its 114-year history, its tens of thousands of alumnae—and the Directors’ responsibility to them all.

Other observers have reflected on this or that aspect of the story, as if it is already a matter for reflection, rather than for action.  Is it a harbinger of the fate of small liberal arts colleges across America?  What will the soon-to-be unemployed faculty and staff do in an already crowded academic job market?  And what about the horses?  Frankly, I’m not terribly concerned about any of that, except for perhaps the horses.  Colleges everywhere are in the midst of dramatic change to curricula and methods and staffing in order to remain both solvent and effective as institutions of higher learning, embracing, for example, digital and distance learning.  Even the seemingly most secure universities, such as my current institution, Harvard, are acting in bold and creative ways to expand and ensure their reach.  And Sweet Briar’s faculty may be paid less than their counterparts at other institutions, but with light class loads, little to no publishing requirements, a relatively low cost of living in the region, and an average salary still higher than the median national household income, I don’t think too many people, inside or outside of the “Ivory Tower,” are going to fret much about their fates.

That brings us to the most important part of the Sweet Briar situation: the students—past, current, and, hopefully, future.  Having worked with a number of Boards of Trustees and Directors of different sorts, in my experience they are often an institution’s worst, but most pampered, enemies.  More often than not, they tend to know very little about their charges (other than what senior administrative staff allow them to know) and, an even more insidious factor, have no real, personal stake in whether their involvement results in success or failure.  In Sweet Briar’s case, the conclusion is painfully obvious, regardless of the names on the letterhead: this Board—not the faculty or staff or, most certainly, students—failed to discharge its responsibility, yet can simply walk away.  The students and staff are not so lucky.  Instead of immediately closing the doors of the school to, according to the Directors, save the students from a future of consistently lowering standards and a diminished experience, they should instead have admitted their own, personal defeat, resigned en masse, and turned over the leadership of the institution to a set of people with the vision, creativity, and willingness to implement the sort of change, however painful it might be in many quarters, that Sweet Briar requires–the college community itself, which includes the people of the surrounding area.

As my grandmother might have said, the President and Board came to the conclusion that the only way to open this nut was to use a hammer.  And what’s worse, it was done in the name of the students, who don’t seem to have had much say in the matter at all.  Sweet Briar has unique strengths that could be maximized and weaknesses that could be addressed (my grandmother, from whom I learned almost everything good and decent about life, would not have acknowledged that the word “insurmountable” even existed).  Again, other schools are facing similar challenges with nowhere near the same special qualities to help address them (the equestrian program alone is the envy of schools across the East Coast).  For example, more aggressive and creative marketing can target enrollment, and faculty can teach more classes that enroll part-time and non-traditional students, both on-site and online (which sure beats being unemployed).  To continue dispensing my grandmother’s old Chesapeake wisdom—if something is wrong, stop whinging and do something about it.  If that fails, pick yourself up, learn from the failure, and try something else.  But always try.  Where there is a will, there will be a way.  And as any rider whose horse has a big heart knows, no obstacle is entirely insurmountable.

For Sweet Briar, there is an unsurprisingly intense will, at least outside the current Board room, to make sure that its story does not end in 2015.  The #SaveSweetBriar campaign is just beginning and has developed encouraging momentum.  So I suggest that the current President and Directors be thanked for their time and sent on their merry way, leaving the college to those who actually care about its future—and who, if they cannot find a way, will almost certainly make one.