Finance in History: Labor Days
The Lowell Mills offer a lesson in the perils of focusing on labor costs at the expense of technology.
The building of Samuel Slater’s mill in Pawtucket, Rhode Island in 1793 marked a genuine paradigm shift: the transition of cloth-making from the home to the factory. Nearly two decades later, wealthy Bostonian Francis Cabot Lowell followed Slater’s example by surreptitiously copying English spinning and weaving technology. After visiting the cotton mills of Manchester, England and taking copious mental notes, he returned to Boston and raised $400,000 from wealthy friends and family to recreate what he had seen in Great Britain.
Thus began the American Industrial Revolution, and with it, another sort of shift. The new cloth-making business put both capital and labor to work on a scale that demanded not just new machines, but new management. Unfortunately, the accounting and financial technology of the day wasn’t up to the task. Financial managers of the time focused on the familiar — costs of labor and materials — but oddly enough, often ignored the potential challenges of maintenance, obsolescence and technological change that came with their new machines. Without a good understanding of the importance of depreciation and reserves, writes one historian, “The known expense of labor received more attention than the largely unknown problems of capital expense.”
Initially, however, a management focus on labor seemed a happy development. An idealist, Lowell did his utmost to improve upon the grim working conditions he had witnessed in England, where, in the early 1800s, laborers had no minimum wage and generally worked twelve to fourteen hours a day, six days a week.
Lowell set about creating a workers’ utopia. He recruited girls and women, ages 15 to 35, from surrounding farming communities and promised their understandably wary families that they would live in chaperoned boardinghouses and have access to a church, a library, and healthy social activities. They would receive weekly wages, an unheard-of luxury for a farm girl, even if she did have to work six 10- to 12-hour days (almost as long as her English counterparts) to earn it.
Lowell’s five-story factories were a brilliant early construct of vertical manufacturing. Each mill had machines to clean the raw cotton, turn it into yarn and thread, weave it into cloth, and then print the finished cloth with colorful designs. The U.S. Congress helped matters considerably by imposing prohibitive tariffs on imported cloth, protecting the Massachusetts producers from their British competition.
If the water wheels powered these mills along the Merrimack, treasurers ran them. Sitting at the top of the largest early American companies, treasurers (not presidents) held shares in their organizations and conveyed the wishes of the shareholders in Boston to the agents who managed the mills in Lowell. Although flawed, the structure made sense. For agents, labor costs were paramount, while shareholders worried most about the cost of raw cotton and the price of cloth — the most important U.S. export of the early 19th century. Detailed accounting information provided essential communication between managers and investors separated by the miles between Boston and Lowell.
As early as 1826, Lowell’s utopia began to give way to competitive pressures. England, which bought much of America’s raw cotton, continued to turn it into cotton cloth at a ferocious pace. In response, the U.S. mills tried to increase productivity by speeding up production. A woman who had once tended one loom soon found herself tending four.
In 1834, the Lowell mills’ directors tried another tack: cutting labor costs. When management announced that the women’s wages would be cut, the Lowell Mill girls, as they were called, went on strike, or in the language of the day, they “turned out.” After only a few days the strike collapsed, and their attempt to forestall the wage cut failed.
Two years later, management decided it had to cut costs again, though not its own, and again it targeted its women operatives. Pay was to be cut by $1 a week, and simultaneously, the amount the girls paid the company for their rooms in the boardinghouses was to increase. At the time, they were sleeping two to a bed and eight to a room. This time, over a thousand women turned out, striking for several weeks.
Throughout, the women’s efforts to improve their working conditions were undermined by the willingness of later immigrants, first Irish, then Italian, to take whatever wages were offered. Ultimately a six-year depression that began in 1837, brought on by overly easy bank credit and rampant real estate speculation, ended any attempt at labor organization. Jobs disappeared by the thousands, and what little power the fledgling labor movement had evaporated.
Of course, if savings on labor costs created intolerable conditions for mill workers, the demand for cheap cotton had bred an even more ghastly system. The raw material used in the mills came from the South, and was grown and picked by slaves. Two-thirds of the Southern cotton was sold to England; the other third was shipped north to New England. Many mill workers sympathized with the plight of the slaves and supported abolitionism, yet they suffered in turn when the Civil War broke out. Realizing that they would make more by selling raw cotton than by making cloth, Lowell’s mill owners closed their mills and sold off the contents of their warehouses.
Many of the mills reopened after the war, but eventually most moved to the South themselves. Although most attribute this development to the lure of cheaper labor and proximity to raw materials, another factor played a part. As the management hierarchy of the mills suggests, investors focused on finance and labor. Responsibility for technology — specifically, the machines that spun, wove, and finished the cloth — was relegated to an outside superintendent. As Steven Lubar reports in “Managerial Structure and Technological Style: the Lowell Mills, 1821-1880,” shareholders challenged the need for skilled (and therefore costly) managers for these machines. Neither the management system nor the accounting systems (this was before the day of useful cost accounting) fostered an appreciation for the role technology played in operations. As a result, the Lowell mills were slow to repair their machines and slower to invent more efficient ones. In the end, operators found it simpler to start over in a new location than to fix the old machinery.
Today, of course, even the southern mills are closed, with almost all textiles made overseas. But you can still visit the remarkable cotton mills of New England. They’re museums.
Finance in History: Blood and Treasurers
Those who guard the crown jewels need good internal controls.
Roget’s Thesaurus has made a bizarre word familiar to many college students who have found themselves at a loss for words. Compiled by Dr. Peter Mark Roget and published in 1852, Roget’s Thesaurus is a vast categorization of English words — and their friends, siblings, and relatives.
But how did he come up with a word like “thesaurus”? Simple. It’s the Latin word for “treasure.” Back in the 15th century in Scotland, treasurers were called “thesaurers,” and the royal thesaurer had the plum job of guarding the royal treasure trove.
To become thesaurer, a fellow clearly had to be known for his honesty, strength, courage, martial experience, suspicious mind, and self-restraint. One wonders how often the inventory of the royal thesaury (treasury) was conducted and whether the King and Queen were there to congratulate themselves on their fine thesaurus.
Besides the psychological comfort of knowing you have a pile of jewels in a vault nearby — and a trusted thesaurer to make sure they don’t wander off — the king’s jewels must have helped convince lenders of his creditworthiness. A bit like Fort Knox when the United States was on the gold standard.
The flaw in that idea, though, is that crown jewels are anything but a liquid asset. They represent, instead, the classic buy-and-hold strategy. The British gem collection is a 900-year long position in precious stones and metals.
Despite the manifold and elaborate precautions taken by the thesaurer, an audacious brigand almost got away with stealing Britain’s crown jewels in 1671. The perpetrator was an Irishman with the improbable name of Colonel Blood, and he did it by preying upon the Assistant Keeper of the Jewels, an elderly dupe named Talbot Edwards. Revenge certainly played a part in the bloody plot, seeing that the British had taken Blood’s land in Ireland.
Disguising himself as a humble man of the cloth, a parson, Blood made several preliminary visits to the Tower of London, intent on insinuating himself into the good graces of the assistant jewel keeper. Like so many visitors to London who were soon to follow, he took the Tower tour to give the crown jewels a good once-over. The jewels first went on display in the 1600s, and even back then the jewel keeper was allowed to make some money on the side acting as tour guide.
After several increasingly chummy visits, Blood went so far as to propose that his nephew marry Edwards’s daughter, a nice match considering he claimed the nephew was worth 300 pounds a year. The assistant jewel keeper and his wife thought this sounded like a bit of all right.
A few days later, Blood brought his “nephew” (actually his son), to meet Edwards, and they were accompanied by two of their friends. While supposedly waiting for Blood’s wife to join them, Blood persuaded the jewel keeper to show him, his nephew, and their two companions the jewels one more time.
Once Edwards unlocked the vault, they decided the time was especially opportune to bash him in the head with a mallet and stab him, leaving him for dead. Scooping up the jewels, Blood crushed the king’s crown, the better to hide it under his frock. Before they could make their pious exit, however, Edwards’s son stumbled in on them and raised a hue and cry. The plunderers were apprehended, probably by a cohort of the Tower guards, the Beefeaters. The lucky king reclaimed his jewels and dented crown.
Besides housing the crown jewels, the Tower of London was the home of many famous prisoners. Some, including Richard III’s two nephews, Anne Boleyn, Lady Jane Grey, Sir Thomas More, and Guy Fawkes, never left. Queen Elizabeth I, Sir Walter Raleigh, and Rudolf Hess, on the other hand, were only temporary residents.
Weirdly enough, Colonel Blood never joined their ranks. King Charles II met with him after his disastrously botched heist, gave him back his confiscated Irish estates, and is thought to have taken him into his service as a spy. The moral of the tale, apparently, is that the bold entrepreneur often ends up a whole lot better than the treasurer.
Finance in History: Bankruptcy
Chapter 11 may be tough, but it beats death, dismemberment, slavery, exile, prison, and other insolvency solutions.
“Annual income twenty pounds, annual expenditure nineteen nineteen six, result happiness. Annual income twenty pounds, annual expenditure twenty pounds ought and six, result misery.” (Charles Dickens, David Copperfield)
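For readers unaccustomed to pre-decimal British currency, Dickens’s figures are exact: a pound was 20 shillings and a shilling was 12 pence, so “nineteen nineteen six” (£19 19s 6d) leaves sixpence to spare out of a £20 income, while “twenty pounds ought and six” (£20 0s 6d) runs sixpence short. A quick sketch of the arithmetic (the helper function is illustrative, not from the source):

```python
# Pre-decimal British currency: 1 pound = 20 shillings, 1 shilling = 12 pence.
def to_pence(pounds, shillings, pence):
    """Convert a pounds/shillings/pence amount into total pence."""
    return (pounds * 20 + shillings) * 12 + pence

income = to_pence(20, 0, 0)      # annual income: twenty pounds
happiness = to_pence(19, 19, 6)  # expenditure: "nineteen nineteen six"
misery = to_pence(20, 0, 6)      # expenditure: "twenty pounds ought and six"

print(income - happiness)  # 6 pence left over: result happiness
print(income - misery)     # -6: sixpence short: result misery
```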
Misery indeed. Bankruptcy is no picnic even today, but through the ages it has been the source of much literal pain. The word “bankruptcy” comes from an Italian practice of the Middle Ages — “banca rotta” — which literally means “broken bench.” The term describes the punishment administered to businesses that failed: local fiscal authorities came to the market and smashed the bankrupt business’s table.
Through the ages, people or businesses have gone bankrupt in two ways, either running out of money and thus being unable to repay debts, or being forced to close as a result of fiscal mismanagement. Either way, bankruptcy has often carried a punitive dimension.
Death, dismemberment, slavery (for the debtor and family members), indentured servitude, exile, and debtors’ prison have all been used as punishment. Dickens did not use the word “misery” lightly.
And yet the first known effort to regulate bankruptcy was surprisingly modern in its approach. Appearing in the Code of Hammurabi, which dates to Babylon around the 18th century B.C., the law stipulated that a bankrupt’s possessions were to be divided among creditors in proportion to the amount of money each was owed.
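That proportional rule is, in effect, the same pro-rata distribution modern bankruptcy law still applies. A minimal sketch of the calculation (the function name and the sample figures are illustrative, not from the Code):

```python
def pro_rata(assets, claims):
    """Split a bankrupt's assets among creditors in proportion to what each is owed."""
    total_owed = sum(claims.values())
    return {creditor: assets * owed / total_owed
            for creditor, owed in claims.items()}

# Three creditors owed 600, 300, and 100; only 500 in assets to divide.
print(pro_rata(500, {"A": 600, "B": 300, "C": 100}))
# {'A': 300.0, 'B': 150.0, 'C': 50.0}
```

Each creditor recovers the same fraction of its claim (here, half), which is what makes the division proportional rather than first-come, first-served.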
Alas, those would soon come to be seen as the good old days, because by 621 B.C., when Draco drew up the laws of Athens, the punishment meted out to “deadbeats” (literally, those who are “completely exhausted”) was death. Or they and their families might be sold into slavery, with the proceeds going to creditors. If that strikes you as Draconian, well, consider the source.
A generation later the Athenian statesman and poet Solon decided this was perhaps a bit too severe. Under his legal reforms the bankrupt and his family had to give up their citizenship but not their freedom — or their lives.
The Romans, however, soon turned back the sundial. Under the Twelve Tables of Rome, promulgated in 451 B.C., maiming became the appropriate sanction. Instead of getting his money back, the creditor was given a pound of flesh — or perhaps more, depending on how much was owed. Debtors were cut up and their parts distributed among creditors on a pro rata basis. (The Roman writer Petronius would later satirize this practice in The Satyricon, a portion of which describes a plutocrat whose will decrees that any friend, parasite, or hanger-on who wants to collect his inheritance must eat a piece of the dead man’s corpse.)
Fast-forward to medieval England, where Henry III established the practice of imprisoning debtors in the 13th century. By the time of Henry VIII, in the mid-16th century, the first bankruptcy statute (as opposed to insolvency law) was enacted. It applied only to merchants and traders, since they were considered the only people with a legitimate reason to borrow money, and it provided a way for their debts to be addressed (sans death, torture, or even prison) when a storm at sea sank their boats and thus their fortunes, or similar circumstances beyond their control led to bankruptcy.
That statute did not get the common man off the hook, however. And once someone landed in debtors’ prison it was often nearly impossible to get out. Family or friends might come forward to pay the prisoner’s debts; if not, debtors had to rot, presumably coming to appreciate, as they did, the error of their ways.
The absurdity of debtors’ prison, of course, is that a bankrupt’s ability to repay his creditor from prison is precisely nil. That may be why, in some countries, creditors were required to pay the costs of incarcerating their debtors. The open-ended prison sentence could be cut short, therefore, should the creditor tire of throwing good money after bad.
If you were lucky you might end up a “peon,” a term that originally described a bankrupt person condemned to work without pay for a creditor until the debt was paid off.
While bankruptcy was generally a bigger problem for the debtor than for the creditor, that wasn’t true in every case. In the 14th century, Italian bankers unhappily discovered that England’s Edward III was an unreliable credit risk, but couldn’t do much about it. And in the 17th century, English goldsmiths, the principal bankers of the era, slid into bankruptcy after the Stuart kings found it inconvenient to pay back their loans. Worse, if the bankers were deemed to be charging too much interest their fingers would be burned.
Today bankruptcy still entails pain, if only in the form of many, many meetings with lawyers. And Dickens’s lesson still rings true: having slightly more than you need is infinitely better than having even slightly less. Unless, of course, your credit card offers rewards points and a low introductory rate.
Still to continue…

Always Yours, as Usual,
Saurabh Singh