30 June 2007

Changing Roles of Katakana (and Italics)

A recent post on Language Hat about the official name of Iwojima changing back to its prewar form, Iōtō, sparked a bit of discussion about the reason for the change to Iwojima in the first place. That prompted me to take another look at Japanese military communications, the changing role of katakana in Japanese writing, and then the changing role of italics in western writing. In each case, current usage misleads us about usage in other times and places. Here is a small collection of corrective lenses on the past.

Origins of italic type and its shrunken role

Italic type originally served very different roles. It wasn't invented just to set off words that were emphasized or foreign.
The humanist spirit driving the Renaissance produced its own unique style of formal writing, known as "cursiva humanistica". This slanted and rapidly written letter, evolved from humanistic minuscule and the remaining Gothic current cursive hands in Italy, served as the model for cursive or italic typefaces....

Surviving examples of 16th century Italian books indicate the bulk of them were printed with italic types. By mid-century the popularity of italic types for sustained text setting began to decline until they were used only for in-line citations, block quotes, preliminary text, emphasis, and abbreviations.

Origins of the kana syllabaries

In A History of Writing in Japan (U. Hawai‘i Press, 2000), Christopher Seeley describes the origins of the kana syllabaries (p. 59).
The two Japanese syllabaries known to us today as hiragana and katakana came into being as the result of a process of simplification of Chinese characters used as phonograms [purely for sound, not meaning]. The phonogram principle was known in early China, where it was sometimes utilised to represent foreign words in writing, as for example Sanskrit names and terms in Chinese translations of the Buddhist sutras. In Japan, too, Chinese characters were employed in this way from an early date, at first only to represent proper nouns, but subsequently in an increasingly extensive manner. This gradual trend towards the wider use of phonograms provided a strong incentive to the development of simplified forms.
Hiragana developed through a process of cursivization—linking, blurring, and eliding separate strokes in order to write whole characters more rapidly (a bit like cursive script and its derivative italic type in Europe). Katakana developed through a process of writing just one key part of a whole character.

Early roles of the two syllabaries

Nowadays, hiragana is the more basic of the two syllabaries, in that respect more akin to roman type, while katakana is used to represent foreign words and names, onomatopoeic sounds, or emphasized words, in those respects more akin to italic type. However, the earliest common usage of katakana was to gloss Chinese characters with their native Japanese translation in kuntenbon, Chinese texts marked for reading as Japanese, dating from around the tenth century. In those glosses, katakana indicated the native Japanese reading (kunyomi), not the foreign reading (onyomi for Sino-Japanese). This style of reading Chinese texts, called kundoku, required the reader to translate each Chinese sentence not just into native Japanese word order, but into native Japanese words, even adding Japanese honorifics. Readers in the ondoku style, by contrast, would render the Chinese text in Chinese word order, with Sino-Japanese (onyomi) pronunciations.

While monks and learned gentlemen decoded Chinese texts with the aid of katakana glosses, noble court ladies employed the more elegant and flowing hiragana to compose Japanese-style letters, poems, and prose fiction. In fact, cursive hiragana was referred to in those days as onna-de 'women's hand' (the term hiragana is not attested until 1603), while otoko-de 'men's hand' denoted a blockier script heavily dependent on Chinese characters (Seeley, pp. 76-80). This doesn't mean that men never wrote in hiragana, or that women never employed kanji or katakana, only that cursive hiragana was considered more feminine, and blockier kanji and katakana were considered more masculine.

Kata the mechanical kana

As Japan opened up and began industrializing in the mid 1800s, the relative simplicity and efficiency of katakana gained it many new applications, most notably in semaphore, where the flag positions represent the shapes of katakana strokes (requiring 1, 2, or 3 positions per character); and in telegraphy, where Japan's Wabun kana-based Morse code was far more efficient than China's character-based code, even though it required twice as many dot-dash combinations as Oubun 'European' Morse code. The two superscript dots that indicate voicing in kana (dakuten) are efficiently coded by an extra dot-dot, but the superscript circle that turns h into p (handakuten) is coded far less efficiently by an extra dot-dot-dash-dash-dot! In both semaphore and telegraphy, the receiver transcribed the message in katakana, and telegrams were delivered in that form.
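
To make the efficiency contrast concrete, here is a minimal sketch in Python of how a few kana might be keyed in Wabun code, with the voicing marks sent as separate signals after their base kana. The five-entry table is only an illustrative subset of the full Wabun chart, and the encode helper is my own toy function, not a model of actual telegraph practice.

    # Minimal sketch: keying a few kana in Wabun (Japanese) Morse code.
    # The table is a small illustrative subset of the full Wabun chart.
    WABUN = {
        "カ": ".-..",   # ka
        "タ": "-.",     # ta
        "ハ": "-...",   # ha
        "゛": "..",     # dakuten (voicing mark): an efficient extra dot-dot
        "゜": "..--.",  # handakuten (h -> p): the costly dot-dot-dash-dash-dot
    }

    def encode(kana: str) -> str:
        """Encode a kana string, with dakuten/handakuten as separate marks."""
        return " ".join(WABUN[ch] for ch in kana)

    # ga = ka + dakuten; pa = ha + handakuten
    print(encode("カ゛"))  # .-.. ..
    print(encode("ハ゜"))  # -... ..--.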

As a result, military communications were overwhelmingly rendered in katakana. Bill Gordon's very impressive website, Kamikaze Images, includes a replica of a kamikaze pilot's final letter to his children, written almost entirely in katakana. And former RAAF wireless operator A. Jack Brown, who spent World War II transcribing Japanese military broadcasts, even titled his recently published memoir Katakana Man.
Instead of a flying career, Jack found himself in top secret RAAF wireless units. There he worked to intercept radio transmissions sent in the Japanese katakana code, which were then analysed to produce the highly reliable intelligence that helped General MacArthur in devising his strategy for the allied campaign in the South-West Pacific.
(Also see the U.S. Naval War College Review article about American code-breakers in the Pacific.)

In some ways, katakana also played a role similar to that of the Courier typeface that was the official standard for U.S. government and diplomatic documents for decades until 1 February 2004. Government reports were often published in kanji and katakana, rather than kanji and hiragana as would be customary today. So was Japanese imperial propaganda (translated here). Ease of carving also made katakana much more common in official seals and on woodblock prints than it is today.

I suspect the wholesale abandonment of so much katakana usage after Japan's defeat was partly motivated by an attempt to wash away the stains of the country's military and imperial legacy.

28 June 2007

Korea's Cultural Renaissance, 1920s

At least for Korea's middle-class intellectuals, the early 1920s marked a time of hope and renewed cultural and political activity.... Renaissance is an apt description of the outpouring of essays, commentary, literature, and political analyses that fueled the reemergence of a Korean press after 1920....

The magnitude of the 1920s publishing boom was enormous in relative terms. The Japanese had issued permits for only forty magazines and journals during the entire 1910–1920 period, but in 1920 alone, they granted permits to 409 different magazines and journals, not to mention the coveted "current events" (sisa) permits to two daily newspapers, the East Asia Daily and the Korea Daily (Chosŏn ilbo), and almost half a dozen politically oriented journals. In 1910 the combined circulation of Korean daily papers and important journals probably did not exceed 15,000; by 1929 the circulation of the two Korean newspapers alone had increased tenfold to 103,027. The sisa permit allowed discussion not only of current events, but also of political and social commentary. Moreover, no cumbersome change in the legal system that governed publishing had been necessary. Suddenly permits that for the most part had been denied Koreans for a decade were forthcoming. There was no lag between policy and practice, and given the youth and energy of the new publishers—the founder of the East Asia Daily, Kim Sŏngsu, was only thirty and his reporters were in their twenties—new publications hit the streets weekly in the early years of the 1920s.

In the early 1920s the new publications were poorly financed; there was plenty of patriotic enthusiasm but little business sense. With journals it did not matter; the goal was to get ideas and plans into the open for discussion. Many of the political journals were supported by donations, and they almost always lost money. The newspapers did not make money for several years, but they were sustained by investors' patriotic fervor. By the mid-1920s, however, increasing advertising revenues (ironically from Japanese commercial sources) brought them into the black, and by the early 1930s each was publishing successful entertainment monthlies aimed at segmented audiences such as youth, women, sports fans, and children. Publishing was becoming a profitable business that competed with other enterprises for a share of the expanding market for entertainment. This called forth lamentations from political activists, who decried the commercialization of the press and the corresponding enervation of its political commitment....

Perhaps even more startling than the outpouring of publishing after the Cultural Policy thaw was the mushrooming of organizations of all types. In 1920 there were 985 organizations of all types registered with the Colonial Police. These were local youth groups, religious organizations, educational and academic societies, and social clubs. Two years later this number had swelled to almost 6,000. These included occupational groups, tenant and labor associations, savings and purchasing cooperatives, temperance unions, health and recreational clubs, and groups clustered by Japanese statisticians into a vague category called "self-improvement." The Cultural Policy clearly set loose an enormous pent-up demand for associational life in the colony. And while most groups restricted their activities to politically innocuous social, enlightenment, or self-help projects, even a cursory glance at their charters reveals that many linked their goals to national self-strengthening. There were, however, many groups who forsook nationalism altogether in order to promote social reform among Koreans themselves, most notably, early feminist groups and the movement to eliminate discrimination against the traditionally low-status paekchŏng [comparable to Japan's outcaste burakumin]. In the short term the Japanese chose to ignore the potential for nationalist mischief that these organizations represented, but they were very keen to monitor and selectively suppress what they saw as class-based—and therefore more dangerous—tenant and labor organizations....

Another important feature of the organizational boom was the increasing participation of women in public life. Women's clubs and educational associations had appeared on the heels of the Independence Club's activities in the late 1890s. Thereafter aristocratic and middle-class women took the lead to establish schools for women and to reform oppressive customs such as child-marriages and the prohibition of widow remarriage (some of these customs had been outlawed already by the Kabo social legislation of 1894–1895). Before annexation, women in the Christian churches had formed groups around a number of social reform issues. Soon the number of patriotic women's associations (aeguk puinhoe) burgeoned, and they played an important role in the largest private campaign mounted in Korea before annexation—the National Debt Repayment Movement. After March First [1919] the term "new woman" (sinyŏsŏng) became standard usage in the press to describe modern, educated women who had become a very visible part of public life. By the 1920s more radical demands for a true liberation of women emerged in Korea's first avowedly feminist journals, Kim Wŏnju's New Woman (Sin yŏsŏng) and Na Hyesŏk's Women's World (Yŏjagye). In these publications women's issues were not justified by merging them with the agenda of national self-strengthening. Instead, for the first time, Na and Kim directly confronted the inequity and oppression of Korean patriarchy. Radical feminism, however, was ultimately marginalized, while the less confrontational agenda of Christian-dominated, reformist women's groups found favor within the male-led nationalist movement.
SOURCE: Korea's Twentieth-Century Odyssey: A Short History, by Michael E. Robinson (U. Hawai‘i Press, 2007), pp. 56-61

26 June 2007

Black Memphis vs. Black Nashville

I am still a devoted fan of Memphis because of my childhood memories and because of the progressive people—both black and white—who I know are working together today; but like its black elite, who were educated elsewhere, I feel it is a town trying to overcome great odds. The thriving downtown that it once had along Main Street—between Beale and Jefferson—was killed in the 1970s when the whites abandoned the increasingly black city, which now is only 44 percent white. There is not a department store within ten miles of City Hall. Big stores like Gerber's, Lowenstein's, and Bry's are all gone now. The area surrounding the municipal buildings, courthouses, and county offices is littered with pawnshops, bail bondsmen, and vacant storefronts. What would have long ago been a well-developed Mississippi River waterfront in any other town is just now seeing walking paths, green grass, and trees. With the exception of a few tall buildings built by the city's superior hospitals—Baptist and Methodist—and by First Tennessee Bank and Union Planters Bank, one gets the sense that no major company or industry calls Memphis its home. Federal Express is there—several miles out of downtown, near the airport, but the headquarters for Holiday Inn and Cook Industries left years ago.

Even the city's premier hotel, the Peabody—as plush as it is, by Memphis standards—seems a bit corny and anachronistic. Founded in 1869 and rebuilt in the 1920s, the imposing brick structure attracts tourists to its main lobby each morning for a ritual that began in the 1930s and continues today, seven days a week. At 11:00 A.M. sharp, an elevator door opens on the main floor, and marching in line across the carpeted floor are five trained ducks. Marching in unison to taped music that plays over the lobby speakers, the small ducks waddle toward a small, ornate fountain and pool in the middle of the floor. One by one, they hop up to the fountain and then dive into the pool. The routine is repeated in reverse at 5:00 each afternoon. Since the hotel had a policy of segregation throughout my older relatives' lives, it was not until we were teenagers that they permitted us to visit the building and view this amusing event.

"Memphis used to have the largest and most developed metropolitan area in Tennessee," explained a black former city councilman who acknowledges that a fear of integration is what kept Memphis small and rather underdeveloped. "It can't be blamed on the people who are in power today," he says, "but those who were making decisions in the 1950s and 1960s created a problem between the races and within the corporate community that was hard to correct."

The city's black elite seem to be painfully aware of how much better their black counterparts are doing in Nashville—a city whose metropolitan area had once been less affluent, less respected, and less populated than that of Memphis. In fact, most of the Memphis black elite who had grown up in the city prior to the 1960s had to leave town and go to Nashville in order to get their education. Although the town had the small, all-black LeMoyne College since 1870, it lacked the truly elite black institutions that Nashville had: Fisk University and Meharry Medical College. The black Memphians also lacked Nashville's Tennessee State University, a black public college that ran itself like an elite private school.

"Although I grew up in Memphis—a city that looked down on Nashville at the time," explains a sixty-year-old physician who attended Fisk, "I always had the feeling that Nashville was going to catch up and then leave us behind—intellectually and racially. Memphis had no premier schools for whites or blacks, and Nashville had Vanderbilt for whites and these other top schools for us. White Memphians—and even some black Memphians—seemed to get more backward and more provincial as other cities outgrew us. So few blacks here were able to break out of the box and really gain national exposure the way that blacks in Nashville did."
SOURCE: Our Kind of People: Inside America's Black Upper Class, by Lawrence Otis Graham (Harper, 2000), pp. 276-277

23 June 2007

Independence Club: Decenter China, Elevate Korea

Formed in the spring of 1896, the Independence Club (Tongnip hyŏphoe) ... began a campaign to petition the king to rename the kingdom the Empire of the Great Han (TaeHan Cheguk) in order to make more explicit Korea's independence from China; in addition, the club urged Kojong to adopt the title of emperor (hwangje) in place of king (wang) in order to assume equal nominal status with the Chinese and Japanese emperors. Kojong, who had left Russian protection in July 1897, granted their wish; he took the title emperor and declared the first year of his new reign era Kwangmu (Illustrious Strength) in a coronation ceremony in October of 1897. The club also raised funds to erect a monumental arch, Independence Gate (Tongnipmun), on the site of the Gate of Welcoming Imperial Grace (Yŏng'unmun), where the Chosŏn kings had officially welcomed envoys from China. This project expanded to remake the former Chinese diplomatic residence, the Hall of Cherishing China (Mohwagwan), into a public meeting place renamed Independence Hall (Tongnipgwan), which they then surrounded with a public park. These were popular projects both at court and with the Seoul public, and they ended formally the usage of the now, in nationalist terms, humiliating tributary language of past Korea–China relations.

The club charted a course for a movement that encompassed public education, the creation of a national newspaper, and the beginning of language reform, all projects that anticipated the gradual emergence of a new public sphere in Korea. The club's newspaper was the vehicle for realizing, at least in part, all of these goals. The Independent used the vernacular script han'gŭl, which had been invented in the fifteenth century during the reign of one of Chosŏn's most revered monarchs, Sejong the Great. From that time Classical Chinese had continued to be the official written language of the court and elite communication, but han'gŭl was used for didactic tracts published for the peasantry and for popular translations of Confucian and Buddhist texts. The proliferation of novels written in han'gŭl in the seventeenth and eighteenth centuries had solidified its nonofficial use in society.

The Independent's use of han'gŭl was a deliberate statement about national cultural unity and linguistic identity. Editorials in the paper decried the use of Classical Chinese as the official language of government and literary language of the yangban. In a scathing editorial on the national language, Chu Sigyŏng (1876–1914), a young member who later became the founder of the modern vernacular movement, asserted that perfecting and spreading the use of han'gŭl was the principal means for "ending the habit of aristocratic cultural slavery to Chinese culture." This widened the attack begun against symbolic arches and imperial nomenclature on what the club perceived to be a slavish subordination of Korean elites to Chinese culture in general. This nationalist attack against elite identification with China began the process of transforming the very language used to describe Korean–Chinese relations. The term sadae (to serve the great) had heretofore simply described the old ritual relationship between Korea and China. But Chu turned it into an epithet that denounced subservience or toadyism to foreign culture in general. Subsequently, sadae and its various forms, sadaejuŭi (the doctrine or "ism" of subservience) or sadae ŭisik (a consciousness or mentality of subservience) became a trope for antinational sentiments or subservience to things foreign. In postcolonial and divided Korea this terminology still lingers in political and cultural discourse.
...
The gradual decentering of China in the Korean worldview had begun the redefinition of Korea as a nation-state, but moving Korean cultural identity away from any reference to China was neither an easy nor happy task. While Korea's participation within the cosmopolitan East Asian world order had made sense in a Sino-centric world order, within the particularistic logic of nationalism it was an anathema. This logic assumed that nations were the building blocks of the global order, with each claiming a distinct culture, history, and identity as a society. In East Asia the neologism used to represent the concept of nation—minjok in Korean—had been in use for at least thirty years before Koreans actually began to think and write about their society in such particularistic terms. The Chinese characters for minjok—min (people) and jok (family)—lend a unique quality to the term itself; so combined, these characters carry strong racial/ethnic and genealogical connotations. To this day, because of American stress on legal citizenship, an identity potentially open to all races and ethnicities because the United States is a nation of immigrants, Americans are surprised by the racial/familial emphasis carried within Korean national identity. In the period between 1905 and 1910, the first explorations of the evolution and character of the Korean minjok began to appear in calls for the rethinking of Korea's history.

A young editorial writer for the Korea Daily News, a man now celebrated as Korea's pioneering nationalist historian, Sin Ch'aeho, became one of the first to advocate writing a new, minjok-centered history for the nation. In "A New Reading of Korean History," a serialized essay published in 1908 by the Korea Daily News, Sin reread Korean history as a story of the Korean people (minjok), not its state (kukka) or its ruling family (wangjok). He attacked the tradition of Confucian historiography with its moral judgments of good and bad kings and its emphasis on the fortunes of the state. What was needed, according to Sin, was an account of the minjok from its earliest moments and of its contact and competition with its neighbors. In this view history became a story to bind together the people who comprised the national subject; the purpose of history was to celebrate the triumphs of the minjok and mourn its defeats, and to account for the evolution of its unique culture and identity into the present. Sin's "New Reading" emphasized the ethnic/racial difference of the Koreans from their neighbors by locating the origins of the Korean race in 2333 B.C. in the person of a mythological progenitor Tan'gun. Thus Sin reoriented Korean history as a story of a single people that was distinct from China or any other neighboring group. By locating the beginning of Korean history with Tan'gun, Sin sought to invalidate the Sino-centric myth of Korea's civilization being founded by a migrating Chinese official, Kija, a tale that had been in favor during the Confucianized Chosŏn period.
SOURCE: Korea's Twentieth-Century Odyssey: A Short History, by Michael E. Robinson (U. Hawai‘i Press, 2007), pp. 23-24, 27-28

21 June 2007

Quirky Minor League Team Names

In the Montgomery, Alabama, visitor center that used to serve as the city's Union Station, there's a cleverly named restaurant called Lek's Railroad Thai. It was there that I discovered that the city's minor league (AA) baseball team is called the Biscuits (2006 Southern League Champions). What a nice bit of self-mocking regional pride! Of course, we were headed that night for the home of the Columbus, Georgia, Catfish, a name that inspires such headlines as RiverDogs fry Catfish and Braves filet Catfish. And the next day, we were headed toward the hometown of the Savannah, Georgia, Sand Gnats.

A lot of minor league team names are not only boring, but predictable. Guess whose farm teams the following are: the Richmond Braves, Iowa Cubs, Sarasota Reds, Binghamton Mets, Reading Phillies, San Jose Giants, Springfield Cardinals, Potomac Nationals, Dunedin Blue Jays, Scranton/Wilkes-Barre Yankees, Omaha Royals, Kinston Indians, and Pawtucket Red Sox.

Other team affiliations are not dead giveaways, but pretty easy derivatives of the parent team name: the Tucson Sidewinders, Delmarva Shorebirds, Aberdeen Ironbirds, Rochester Red Wings, San Antonio Missions, Memphis Redbirds, Tacoma Rainiers, and Harrisburg Senators.

But my favorite team names are those with strong local flavor, and little reflection of their parent organizations: the Augusta GreenJackets (Giants), Cedar Rapids Kernels (Angels), Chattanooga Lookouts (Reds), Lansing Lugnuts (Blue Jays), Louisville Bats (Reds), Hickory Crawdads (Pirates), Great Lakes Loons (Dodgers), Rancho Cucamonga Quakes (Angels), Albuquerque Isotopes (Marlins), Lowell Spinners (Red Sox), Tennessee Smokies (Cubs), Mahoning Valley Scrappers (Indians), Durham Bulls (Devil Rays), Norfolk Tides (Orioles), Brevard County Manatees (Brewers), and--my favorite--Modesto Nuts (Rockies).

Future team names I'd like to see are the Orange County Fruits, Gilroy Garlics, Salinas Lechuga, Monterey Squid, Madison Brats, Waukegan Wieners, Ozark Nightcrawlers, Bismarck Sugar Beets, Rapid City Rutabagas, and Smithfield (Virginia) Hams.

UPDATE: Isotopes Park in Albuquerque hosted the Triple-A All-Star Game on 11 July 2007.

20 June 2007

The Scottsboro Boys Revisited

Today's Opinion Journal features an editorial by John Steele Gordon that reminds us of another famous case of wrongful prosecution on the basis of race and class:
On March 25, 1931, a group of nine young black men got into a fight with a group of whites while riding a freight train near Paint Rock, Ala. All but one of the whites were forced to jump off the train. But when it reached Paint Rock, the blacks were arrested. Two white women, dressed in boys clothing, were found on the train as well, Victoria Price, 21, and Ruby Bates, 17. Unemployed mill workers, they both had worked as prostitutes in Huntsville. Apparently to avoid getting into trouble themselves, they told a tale of having been brutally gang raped by the nine blacks.

The blacks were taken to the jail in Scottsboro, the county seat. Because the circumstances of the women's story--black men attacking and raping white women--fit the prevailing racial paradigm of the local white population, guilt was assumed and the governor was forced to call out the National Guard to prevent a lynch mob from hanging the men on the spot. The nine were indicted on March 30 and, by the end of April, all had been tried, convicted and sentenced to death (except for the one who was 13 years old, who was sentenced to life in prison).

A year later, the Alabama Supreme Court upheld the convictions of those on death row, except for one who was determined to be a juvenile. By this time, however, the "Scottsboro Boys" had become a national and even international story, with rallies taking place in many cities in the North. Thousands of letters poured into the Alabama courts and the governor's office demanding justice.

The International Labor Defense, the legal arm of the Communist Party USA, provided competent legal help, and the convictions were overturned by the U.S. Supreme Court because the defendants had not received adequate counsel. Samuel Leibowitz, a highly successful New York trial lawyer (he would later serve on the state's highest court) was hired to defend the accused in a second trial, held in Decatur, Ala. This turned out to be a tactical error, as Leibowitz was perceived by the local jury pool--all of them white, of course--as an outsider, a Jew and a communist (which he was not). Even though Ruby Bates repudiated her earlier testimony and said no rape had taken place, the accused were again convicted, this time the jury believing that Ruby Bates had been bribed to perjure herself.

Again the sentences were overturned, and in 1937--six years after the case began--four of the defendants had the charges dropped. One pleaded guilty to having assaulted the sheriff (and was sentenced to 20 years) and the other four were found guilty, once again, of rape. Eventually, as Jim Crow began to yield to the civil rights movement, they were paroled or pardoned, except for one who had escaped from prison and fled to Michigan. When he was caught in the 1950s, the governor of Michigan refused to allow his extradition to Alabama....

Here is where the real difference between the Scottsboro boys and the Duke boys kicked in: not race but money. The Scottsboro boys were destitute and spent years in jail, while the Duke boys were all from families who could afford first-class legal talent. Their lawyers quickly began blowing hole after hole in the case and releasing the facts to the media until it was obvious that a miscarriage of justice had occurred. The three Duke boys were guilty only of being white and affluent.
But former defense attorney David Feige writes in Monday's Slate that prosecutorial misconduct is common but rarely punished.
Now that justice has prevailed in the Duke rape case, with the nice innocent boys exonerated and the prosecutor who hounded them disbarred, it is tempting to chalk the whole incident up to an unusual and terrible mistake—a zany allegation taken too seriously by a run-amok prosecutor. It would be pretty to think that Nifong's humbling suggests that our system of justice works well, harshly punishing the few rogue prosecutors who subvert the legal process. But this is simply not true.

Prosecutors almost never face public censure or disbarment for their actions. In fact, it took a perfect storm of powerful defendants, a rapt public, and demonstrable factual innocence to produce the outcome that ended Mr. Nifong's career. And because only a handful of prosecutors will ever face the sort of adversaries Nifong did or come close to the sort of scrutiny the former DA endured, the Duke fiasco will make little difference in how criminal law is practiced in courthouses around the country. Regardless of Nifong's sanction, the drama leaves prosecutorial misconduct commonplace, unseen, uncorrected, and unpunished.
One could say the same for all the myopic lemmings with pitchforks of the Fourth Estate, talk radio, university campuses, and the political blogosphere, not to mention the sorry political class worldwide.

17 June 2007

New Madrid: Spanish Influence at the Confluence

The name of New Madrid is but one indication that the Spanish once controlled the Mississippi River as far north as its confluence with the Ohio. A plaque erected by the Missouri Marquette Tercentenary Commission at Trail of Tears State Park on the river between Ste. Genevieve and Cape Girardeau reminds us of why Marquette and Joliet eventually turned around farther down the river.
In 1672, Louis Joliet and Father Jacques Marquette were commissioned by King Louis XIV to discover the course of the Mississippi River. On June 17, 1673, the expedition entered the Mississippi via the Wisconsin river and began their descent by canoe.

On July 4, 1673, the seven-man expedition passed the mouth of the turbulent [Missouri] and later observed the confluence of the Ohio and Mississippi. On reaching an Arkansas Indian village near present Helena, July 17, they were certain that the Mississippi flowed into the Gulf of Mexico. Fearful of the Spanish if they continued southward, at this point Father Marquette and Joliet turned back.

A dedicated and gentle priest, Father Marquette first brought the Word of God into the Mississippi Valley, gave the world an account of its lands and, with Joliet, laid the basis for France's claims to the area.

Born in Laon, France, June 1, 1637, Father Marquette died April 18, 1675, on the eastern shore of Lake Michigan from the hardships of his missionary life.
The Spanish were still influential at the time of the Meriwether Lewis and William Clark expedition in 1804–1806, as a State Historical Society of Missouri signboard at the Trail of Tears State Park notes.
Writing in 1803, Nicholas de Finiel, a French military engineer, described the Shawnee villages along Apple Creek that Lewis mentioned: "These villages were more systematically and solidly constructed than the usual Indian villages. Around their villages the Indians soon cleared the land, which was securely fenced around in the American style in order to protect the harvest from animals. The first of these villages is located five or six leagues from Cape Girardeau, along the road to Ste. Genevieve..."

Shawnee presence in the area was a matter of international politics. Shawnee and Delaware Indians from Ohio were invited to Cape Girardeau in the 1780s by Spain's district commandant, Louis Lorimier, who had traded with those tribes in Ohio. Spain, which governed the Louisiana territory then, welcomed the "Absentee Shawnee" with ulterior motives. It believed they would be a buffer against the Osage and against American ambitions to expand their borders. Coincidentally, Gen. George Rogers Clark, William Clark's older brother, had burned Lorimier's Ohio post because Lorimier sided with the British during the American Revolution.
An historical marker on the levee at New Madrid calls it "The first American town in Missouri":
Founded in 1789 by George Morgan, Princeton graduate and Indian trader, on the site of Francois and Joseph Le Sieur's trading settlement, L'Anse a la Graise (Fr. Cove of Fat). Flood and caving banks have destroyed the first town site.

Named for Madrid, Spain, the town was to be an American colony. Morgan was promised 15 million acres by the Spanish ambassador, eager to check U.S. expansion with large land grants. Spain did not confirm his grant but gave land to colonists. Morgan left but he had started American immigration to Missouri.

French and American settlers contributed to town growth. Here were founded a Catholic church, 1789; a Methodist church, 1810; and here was the southern [northern?] extent of El Camino Real or King's Highway, 1789. There are over 160 Indian mounds in the county, two near town.

"Boot Heel" counties, including a strip of New Madrid, are said to be part of Missouri through efforts of J. H. Walker (1794-1860), planter at Little Prairie (Caruthersville), Pemiscot Co. In nearby Mississippi Co. is Big Oak Tree State Park, a notable hardwood forest.

A Pack, Not a Herd, Takes on a Pride


An incredible Battle at Kruger, via Mississippi to Korea

Atlanta's Black Aristocracy

There is no major metropolitan area that has a better-organized black upper class than the city of Atlanta. Exerting its power in the worlds of politics, business and academia, Atlanta's black elite sets the gold standard for its counterparts in other cities.

"We've had three black mayors with national reputations," says my friend Janice White Sikes of the city's Auburn Avenue Research Library on African American Culture and History, the nation's best collection of black Atlanta history documents. "We are home to the best-known historically black colleges. And in addition to hosting the Olympics we have some black-owned companies that are the oldest of their kind in the country."

Although she has spent most of her career researching and writing about an older, more rural Georgia, it is obvious that what excites Sikes most as we sit in the dining room of the Atlanta Ritz-Carlton is talking about the new Atlanta and how the black community has played a role in making it one of the most popular destinations for elite blacks in search of a city where they are in control.

"This city produced older civil rights leaders like Dr. Martin Luther King Jr., Julian Bond, and Congressman John Lewis," she adds while looking over some notes describing her uncle, a black class-of-1933 Harvard graduate, "but Atlanta has also elevated people like Andrew Young, Maynard Jackson, and Johnetta Cole to national standing in recent years."

Unlike other cities of its size and sophistication, Atlanta has seen a black elite forge strong enough ties between blacks, whites, and the business communities of both groups to elect three consecutive black mayors. What is also interesting is that Maynard Jackson, Andrew Young, and current mayor William Campbell are solidly representative of the black upper class—a characteristic that historically has not been welcome in black electoral candidates in cities like Washington, Chicago, or Detroit. In fact, when Marion Barry and Coleman Young of Washington and Detroit, respectively, were campaigning in mayoral races, they bragged about their ties to the urban working-class community. In Atlanta, good lineage, money, and top school credentials are appreciated by the black mainstream.

In addition to excelling in political clout, black Atlantans outstrip other cities' elite in the area of college ties. Atlanta's black academic community is larger than any other city's because of prestigious schools like Spelman, Morehouse, Morris Brown, and Clark Atlanta. When former Spelman College president Johnetta Cole received a $20 million gift from Bill and Camille Cosby (she is a Spelman alumnus) in 1993, other cities and their black colleges took notice of the strong black university consortium that was growing on the southwest side of Atlanta.

And further reinforcing the role and place of the black elite in the city are its black-owned businesses. While it does not outnumber New York or Chicago in black entrepreneurs, the city does claim the nation's largest black-owned insurance company (Atlanta Life), the largest black-owned real estate development firm (H. J. Russell), and some of the country's top black-controlled investment firms, law firms, auto dealerships, and food service companies.
SOURCE: Our Kind of People: Inside America's Black Upper Class, by Lawrence Otis Graham (Harper, 2000), pp. 321-322

14 June 2007

Wordcatcher Tales: Haint Blue

In Savannah, Georgia, last month the Far Outliers toured the Telfair Museum of Art's Owens-Thomas House, where we saw haint blue paint on the walls and rafters of the former slave quarters that now serves as a gift shop, waiting room, and exhibit (upstairs). Such blue paint is common in areas influenced by slaves from Africa.

The blue paint is said to ward off evil spirits and, by some accounts, insects. I lean toward the more practical explanation, for reasons elaborated below, but first I want to note an odd set of sound correspondences, where one member of each pair is not just nonstandard, but highly stigmatized.
  • haint ~ haunt
  • aint ~ aunt
  • ain't ~ aren't (in r-less dialects)
  • cain't ~ can't
I don't know anyone who pronounces every member of the set with the ai vowel. Nor do I know anyone who has the same vowel in each member of the set. Nowadays, I pronounce each with a different vowel: (roughly) hawnt, ahnt, arnt, kænt. As a kid, I used to say cain't (as my father still does), but I made a conscious effort to expel such (self-)stigmatized regionalisms from my speech during my youth. Worse yet, I used to tease my Southern Baptist missionary kid cohorts who returned from their furlough years with their regional accents in full flower. Some of my southern Virginia relatives also pronounce aunt the way Andy Griffith did in the name of Aunt Bee on Mayberry RFD (said to be based on Mt. Airy, NC), but I don't know anyone who pronounces haunt the same way, except in jest.

Has anyone else noticed this odd correspondence set? Are there other possible members of the set?

Enough linguistics; now back to insects. Last year in Japan, I heard that indigo dye had mosquito-repellent properties, among other magical qualities. Historian and librarian Jennifer Payne has compiled some interesting evidence for the beneficial effects of the indigo plant itself, not just its blue dye. Here are a few excerpts (omitting footnotes).
Agriculture, disease, and slavery were three basic and interconnected aspects of life in Colonial South Carolina. Where one existed, the other two were sure to follow within a very short time. By the mid eighteenth century, rice culture, slavery, malaria and yellow fever were well established as a self-perpetuating cycle which had an adverse effect upon the life spans of the colonists. This study examines the establishment of the "rice-slavery-disease" cycle, speculates on how this cycle was broken by the introduction of indigo, and postulates how indigo affected the yellow fever/malaria mortality rates of Colonial South Carolina....

During the very same fifty years in which indigo took hold in South Carolina, an interesting phenomenon occurred. Persons in Berkeley County near Charleston began to live longer; the number of persons dying during the malarial months [August through November] began to drop. Furthermore, the frequent outbreaks of yellow fever in Charleston began to slow down and eventually, for a time, discontinue entirely....

The most dramatic change occurred between 1760 and 1800 during the years in which indigo gained its height. Only 20% of the males died before forty and some 45% lived to be sixty or more. Moreover, only 18% of adult women died before fifty and some 70% survived beyond seventy. Those statistics involving women are especially revealing for women tended to become victims to malaria during their childbearing years. The fact that a greater percentage of the female population survived past fifty is significant. Thus, according to this evidence, something was enabling the people of Christchurch and St. Johns parishes in Berkeley county to survive malaria and malarial complications during the last forty years of the eighteenth century....

Why was there a decline in malarial mortality and a cessation of yellow fever epidemics? One medical historian jokingly suggested that perhaps the mosquitoes simply went away for forty years. This might be true. Interestingly, the yellow fever epidemics ended just as indigo gained ground as a staple cash crop. Even more fascinating is the fact that the yellow fever epidemics resumed as indigo culture was rapidly phased out after the Revolution. Although in 1788, 833,500 pounds of indigo were being exported, in 1790, only 1694 casks of the stuff were exported. By 1796, indigo had been virtually eliminated from the agricultural economy. Conversely, the epidemics raged within three years of this decline. Thus, it is quite possible that the introduction, rise, and subsequent fall of indigo production had an effect upon mortality rates in colonial South Carolina....

Was it simply coincidence that yellow fever and malaria experienced a decline during indigo's rise, or are the two related in some manner[?] Whatever the connection between indigo and the mosquito is, there is little doubt that during the years of indigo's sudden and swift rise in cultivation, the number of people dying from malaria related complications and those dying from yellow fever dropped markedly. Eliza Lucas Pinckney introduced a new cash crop which helped to make South Carolina one of England's wealthiest colonies. However, her actions might have also helped the population of South Carolina reduce the fever mortality rates. The introduction of indigo broke the vicious cycle of rice cultivation, slavery, and fever by introducing a method of agriculture which did not rely on large amounts of standing water. Furthermore, the return of yellow fever epidemics in the mid 1790's coincided with the rapid decline of indigo production due to the loss of the incentive of the bounty. Although the exact nature of indigo's influence on the mosquito can only be speculated, research conducted to date indicates the probability of a connection between the two.

11 June 2007

Chicago's Black Elite

First founded and settled by the black explorer Jean-Baptiste DuSable of [Saint Domingue =] Haiti, in 1773, Chicago was begun as a thirty-acre land parcel. DuSable, working as a fur trapper and trading-post operator, eventually owned in excess of four hundred acres. He and his new Native American wife remained in the area until 1800, when he moved to Missouri.

With an early black population that was much smaller than those of southern cities like Washington, Memphis, Atlanta, and Richmond, Chicago had a small black elite in the mid- and late 1800s—it consisted of only a few families. Most of them lived very integrated lives: They interacted while working together with liberal whites who had been abolitionists when the Underground Railroad moved black southern slaves into the North. The black elite of the period included people like physician Daniel Williams, Pullman Train Company executive Julius Avendorph, caterer Charles Smiley, and attorney Laing Williams. They were all educated people who lived, worked, and socialized among whites. "In fact," says Travis, who also wrote the book Autobiography of Black Chicago, "at that time, there were blacks living throughout the North Side and elsewhere. Though we were small in numbers, we were represented in every census tract."

Travis points out, however, that the total black population was still under fifteen thousand people. It was not until around World War I, the time of a major black migration from the South to the North, that a substantial black population arrived in the city. Most of these black southerners came—about seventy thousand of them between 1900 and 1920—as a result of the Chicago Defender, a black newspaper that was read in the South by educated blacks eager to escape their more rural environment. When these blacks arrived in town, the old-guard black families and their social clubs immediately decided who was "in" and who was not. Truman Gibson's parents and Maudelle Bousfield Evans's parents were clearly "in" as far as the black old guard was concerned. Interestingly, as old-guard blacks were busy trying to separate the "society blacks" like themselves from the new working-class arrivals, whites were making plans to ghettoize both groups together on the South Side. And they quickly did so by establishing restrictive covenants that moved blacks out of white areas.

In fact, the white community responded quite aggressively to black mobility during the early years of World War I. In the working-class and middle-class white neighborhoods that saw blacks moving in, white residents simply bombed the houses or set them afire. In more upscale neighborhoods like Hyde Park, which surrounds the University of Chicago, white residents organized a full-blown plan to preempt any sales to upwardly mobile blacks who might be able to afford homes in the well-to-do community. My Uncle Telfer, who died before the upscale neighborhood allowed blacks to buy homes there, had saved a copy of Hyde Park's neighborhood newspaper, published in 1920, which read, "Every colored man who moves into the Hyde Park neighborhood knows that he is damaging his white neighbor's property. Consequently ... he forfeits his right to be employed by the white man.... Employers should adopt a rule of refusing to employ Negroes who persist in residing in Hyde Park."

Soon after that time, restrictive covenants making it illegal to sell homes to blacks, regardless of their wealth, were strictly enforced.

But regardless of how violently whites reacted to the influx of poor and upwardly mobile blacks, the old-guard blacks of Chicago had their own dismal way of responding to their fellow blacks in this northern city. They were not happy to see them arriving.

"Not surprisingly, elitism was quite evident. But the rules governing black society in Chicago were always slightly different from the rules that were used in the southern cities," explains former Chicago Defender society columnist Theresa Fambro Hooks. "In the South, black society was determined by the years your family had lived in a particular city and by their ties to one or more of the nearby black colleges like Howard or Fisk or Spelman. But the rules were different in Chicago because almost everybody was new—almost all of them had migrated from the South. There were very few old families and there were no old local black universities to be tied to."

So the standard for black society in Chicago became, instead, financial success and, to a lesser extent, family ties to a few of the northern white universities. In both regards, the Gibson and Bousfield families were at the top. Acceptance by the right schools, the right churches, and the right clubs proved that.
SOURCE: Our Kind of People: Inside America's Black Upper Class, by Lawrence Otis Graham (Harper, 2000), pp. 189-190

10 June 2007

Paducah's Men in Quilts

One of the artistic highlights of our recent Great Square Route around the eastern U.S. (MN - MS - GA - CT - MN) was the stunning Museum of the American Quilter's Society in Paducah, Kentucky, which had just opened a special exhibit, 4 Guys & Their Quilts:
On exhibit May 16-August 12, these quilts combine the talents of four male award-winning quilters: John Flynn, Gerald E. Roy, Arturo Alonzo Sandoval and Ricky Tims. MAQS Curator of Collection Judy Schwender is proud to bring lesser known viewpoints from the quilting world to the Museum’s visitors.

“Any quilt reveals the sensibilities of its maker, and men bring perspectives to quilting that are unique to the medium,” Schwender explains. “Within the world of quilting, men are a minority, and the museum is committed to presenting quilting viewpoints of underserved populations.”
My favorite among the 4 Guys was Ricky Tims, whose work ranges from exquisite variations on traditional quilting patterns, like his Bohemian Rhapsody or New World Symphony, to renditions in fabric of depictive art that would not look out of place on a framed canvas or in stained glass, like his South Cheyenne Canyon or Glen Eyrie Castle.

Among the new quilting terms and techniques I learned about at the museum was trapunto (also called "stuffed work"), a texture-enhancing technique that Tims puts to fine use in his Rhapsody in Green.

09 June 2007

Minnesota's Canned Corn and Carp

A 1997 Minnesota Historical Society plaque at a rest area near the state line on I-35 tells a bit about the history of Minnesota's canneries.
Early settlers grew bumper wheat crops on Minnesota's fertile prairies, land that today supplies produce for a thriving 270-million-dollars-a-year canning industry.

Sweet corn canneries opened in Austin and Mankato in the 1880s, followed soon thereafter by similar factories in Faribault, Owatonna, and Le Sueur. Soon Minnesota's canners were experimenting with new technologies and new products, and in 1903 the automated Big Stone Canning Company founded by F. W. Douthitt changed the industry nationwide. Douthitt's plant in Ortonville had a conveyor system, mechanical corn husking machines, and a power driven cutter that produced the first whole kernel canned corn. The Green Giant Company, also founded in 1903 as the Minnesota Valley Canning Company, introduced golden cream-style corn in 1924 and the first vacuum packed corn in 1929.

Corn is still the major canning crop in Minnesota. The state's more than thirty plants also freeze and can peas, beans, carrots, tomatoes, pork, beef, chicken products, and such unusual items as rutabagas. Mankato was the site of the nation's first carp cannery in 1946.
For more on canned carp, read Dumneazu's well-illustrated blogpost on the Odessa Fish Market. In fact, just keep scrolling for an incomparable travelogue series on Dumneazu's recent adventures in Ukraine.

Iowa's Passing Mormons and Utopians

A State Historical Society of Iowa plaque at a pretty little welcome center off Exit 4 on I-35 in Iowa tells two interesting stories, one on each side.
The Mormon Trail

The Mormons of Nauvoo, Illinois, forced from their homes following the murder of their prophet, Joseph Smith, Jr., began their trek across Iowa in 1846 on their way to the Great Salt Lake Valley. From their first permanent campsite on Sugar Creek they travelled across southern Iowa toward Winter Quarters, near present-day Omaha. In addition to Sugar Creek, the Mormons also established permanent camps at Garden Grove in Decatur County, Mount Pisgah in Union County, and Kanesville in Pottawattamie County.

While camped by Locust Creek, near Corydon, William Clayton learned of the birth of his son in Nauvoo. On April 15, 1846, to commemorate this joyous event, he wrote the famous hymn "Come, Come, Ye Saints." The hymn became a great rallying song of the Mormons.

In 1846, seven Mormon families became separated from the larger body of migrants. They stopped for the winter in present-day Green Bay Township, Clarke County, and established what was known as "Lost Camp." These families remained in the area until 1854, when they resumed the trek to Utah.

Utopian Experiments in Southern Iowa

Several utopian groups attempted to implement in southern Iowa their dreams of a better social structure. In 1839, Abner Kneeland, a pantheist, started Salubria in Van Buren County. Beset with economic problems, the experiment dissolved after Kneeland died in 1844. In 1843, followers of French socialist Charles Fourier founded Phalanx in Mahaska County, but this communal experiment lasted only two years. Followers of another Frenchman, Etienne Cabet, tried several experiments in the United States, including Icaria in Adams County, which existed from 1860 to 1895.

Led by Ladislaus Ujhazy, a group of Hungarian refugees from the Revolutions of 1848 settled in Decatur County in 1850 and founded the town of New Buda. After experiencing economic difficulties, most of these people moved to Texas in 1853.

In 1851, people from near Farmington formed a communal association called the Hopewell Colony. They moved to Clarke County later that year, and founded the town of Hopeville. Although the communal nature of the colony soon changed, the village survived and for several decades was a thriving community. It is the only one of these southern Iowa utopian experiments whose remnants lasted into the 20th century.
Wisconsin also seems to have attracted more than its share of utopians, these days confined mostly to Madison, I suspect.
The best-known communal experiment in Wisconsin was the Wisconsin Phalanx, a community based on the principles of Charles Fourier, established at Ceresco (Ripon). It was the second largest Fourierist experiment in the country, lasting from 1844 until 1850, and housed around 180 people, most of whom lived communally in the Long House. Although the Phalanx was an economic success and attract[ed] national attention, problems developed and the members agreed to dissolve their community in 1850.

06 June 2007

Wordcatcher Tales: Gum naval, Jump-butt, Stumpage value

On the road from Columbus to Savannah, Georgia, during our recent Great Square Route around the eastern U.S. (MN - MS - GA - CT - MN), we stopped at the Million Pines Visitor Center off I-16 in Soperton, Georgia. The visitor center includes the Curt Barwick House, built of wood about 1845, which houses the front desk, gift shop, restrooms, and various display items; a one-room wooden house with a tin roof that served as the post office for Blackville, Georgia, from 1888 to 1904; and a wooden shed containing tools used to produce gum naval stores.

The latter term was new to me. It bears no relation to naval jelly (phosphoric acid), which is used on iron ships. Gum naval dates back to the days of wooden ships, when Georgia played an important role in the naval stores industry, as the New Georgia Encyclopedia relates:
In the late nineteenth and early twentieth centuries, Georgia was the world's leading producer of naval stores, which are materials extracted from southern pine forests and then used in the construction and repair of sailing vessels. Typical naval stores include lumber, railroad ties, rosin, and turpentine.

The naval stores industry in North America originated in the mid-eighteenth century in North Carolina. Before 1800 the major products of the trade were raw gum, pitch, and tar. After the American Revolution (1775-83), processes were developed for distilling spirits of turpentine from gum. By 1850, 96 percent of U.S. naval stores came from North Carolina.

In the early 1870s North Carolina naval stores producers began migrating to southeast Georgia's sandy coastal plain to take advantage of the untapped virgin pine forests in that region. They brought their equipment and black laborers and established residential villages on large turpentine farms. By the mid-1880s about seven in ten turpentine workers in southeast Georgia had been born in North Carolina.

The industry grew so rapidly that by 1890 Georgia was the national leader in naval stores production, a ranking that lasted until 1905. Florida was the leader from 1905 to 1923, after which Georgia regained its predominance and maintained it until the 1960s.
The USDA Forest Service Southern Research Station Headquarters in Asheville, North Carolina, describes some of the nitty-gritty of production. Here are two photo captions from their website:
Photo caption: Improved gum naval stores extraction methods require new tools and techniques. Bark streaks 9 feet from the ground require a special long handled tool for pulling the streak and safely applying the acid. A combination bark-pulling and acid-treating tool was designed to meet this need. The laborer is shown applying 50-percent sulfuric acid to a streak 8 feet from the ground. This tool enables a laborer to stand a safe distance from the tree and reduce the hazard of acid drifting down on his head and clothes.

Photo caption: No more jump-butts and wasted timber as a result of turpentining. A turpentined tree containing both front and back faces and worked for 8 years is shown entering a German gang-saw to produce quality lumber. Developing conservative gu[m] extraction methods for the gum producer represents only half the problem, research must also prove to wood using industries that modern turpentining does not impair the stumpage value of the worked out tree.
The punctuation in the second caption sucks rather badly, but the wonderful collocations make up for it. Jump-butts in this context seems to refer to the discarded lower portion of turpentined trees. Stumpage value is the calculated value of standing timber. The butt log is the often slightly irregular log taken from the base of a tree.