News

Why the Inventor of the Cubicle Came to Despise His Own Creation


How do you envision corporate hell? It probably involves fluorescent light, a micromanaging boss and a tiny, impersonal cubicle. But the office layout that’s come to represent the worst in work was actually designed to bring out the best in workers. When they debuted in the 1960s, cubicles were supposed to make offices breezier, less confined and more efficient. So why did their creator come to wish he’d never invented them?

It started in the 1960s, when designer Robert Propst headed up the research arm of furniture manufacturer Herman Miller. At the time, the company was known for triumphant mid-century design like Eames chairs and playful sofas modeled after marshmallows.

Propst wanted to understand every aspect of the modern office—and found inspiration in his own working style. He realized that he was happier, healthier and more productive when he had different surfaces on which to work. The research he commissioned from design experts and efficiency specialists showed that office spaces that were flexible and customized produced better work than a layout that depended on immovable, heavy desks.

That flew in the face of the typical office of the 1950s and 1960s. For years, workers had labored in large, open spaces filled with rows of metal or wooden desks. Only those with executive status had the coveted private offices away from the hustle and bustle of typewriters, cigarette smoke and telephone calls that characterized a busy open office—and as a result of that privacy, they were more productive than the common worker bee.

There had to be a better way. For Propst, the answer was in the “Action Office,” an office layout that relied on lightweight sitting and standing desks and filing systems. Acoustical panels helped insulate workers from the noise of telephone calls and typing.

But American offices didn’t exactly take to the new Action Office plan. The components were geared more to individual workers than to large companies that needed to accommodate large numbers of people, and they were hard to put together. And the concept of a completely customizable workspace didn’t sit well with executives who didn’t value the individuality of their workers. Instead, executives often purchased the furniture for their own homes rather than placing it in offices. Action Office was beloved by designers—and dismissed by just about everyone else.

Faced with the failure of his first concept, Propst went back to the drawing board and created the Action Office II. The new design took his acoustical panel concept to the next level. The panels became miniature walls of multiple heights that separated each space into its own office without completely cutting a worker off from colleagues. Lightweight and easier to assemble, it made more sense to executives.

But companies didn’t use the Action Office II, or the many knockoffs it inspired, the way Propst intended. Instead of roomy desk spaces with varied designs and walls of different heights, they opted for tiny, boxed-in desks, ignoring Propst’s vision of a flexible workspace with open sightlines. His brainchild was used to cram even more workers into offices. The workspace he had invented shrank and shrank until it became impersonal and crowded. The age of the cubicle farm had begun.

Something else drove the rise of cubicles: the tax code. In the 1960s, it became easier to write off assets like furniture whose value depreciated over time. Office furniture no longer needed to last a lifetime to be worth buying, and companies quickly saw that it was cheaper to buy an Action Office II or a knockoff cubicle than to invest in sturdier equipment.

Meanwhile, Propst watched in horror as an invention intended to liberate American office workers was used to fence them in. “Not all organizations are intelligent and progressive,” he lamented. “Lots are run by crass people. They make little, bitty cubicles and stuff people in them. Barren, rathole places.”

And that was just the beginning. During the 1970s, the energy crisis prompted regulations that made buildings more energy-efficient and airtight. As a result, volatile organic compounds like the formaldehyde released by the cubicles’ materials lingered in the air, and workers became sick. More workers meant more infectious disease, too, and the noise and lack of light in cubicle-heavy offices made workers less productive. Meanwhile, the average amount of floor space given to each worker shrank.

Some of those issues—like formaldehyde—were addressed over the years with new materials. But even though the systems have been abandoned by some companies in favor of open floorplans, coworking and communal offices, Propst’s invention still boxes in about 30 percent of workers. And by the time he died in 2000, Propst had spent years apologizing for creating a corporate monster.


The history and evolution of the font Helvetica

Love it or hate it, Helvetica is one of the world’s most commonly used fonts, both in advertising and publishing and in urban signage. But to what does it owe its success and its widespread usage, and how has it changed over the years?

In this article we will start by looking at its invention way back in 1957, before going through the various milestones and restyles that have seen it become many international brands’ go-to typeface.

Helvetica’s origins

As its name suggests (based on ‘Helvetia’, the Latin word for ‘Switzerland’), Helvetica was created in Switzerland, when Eduard Hoffmann, director of the Haas foundry in Münchenstein, decided to commission freelance designer Max Alfons Miedinger to create a new font. His aim was to counter the success of Akzidenz Grotesk, the typeface launched by their competitors, the H. Berthold AG foundry.

In 1957, Miedinger came up with a new set of characters, which he named Neue Haas Grotesk. It was a sans serif font with a linear, simple and elegant design, and this no-frills look meant it was extremely legible.

Eduard Hoffmann’s notes documenting the creation of the font Neue Haas Grotesk. Copyright: http://www.fontbureau.com

Technically speaking, Neue Haas Grotesk had several interesting features: the negative (white) space surrounding the letters and the strokes comprising them were perfectly balanced, and the stroke endings were always horizontal or vertical, never diagonal, creating a visual effect that was simultaneously bold and neutral.


New York: The First Punk Rock Scene

The first concrete punk rock scene appeared in the mid-'70s in New York. Bands like the Ramones, Wayne County, Johnny Thunders and the Heartbreakers, Blondie and the Talking Heads were playing regularly in the Bowery District, most notably at the legendary club CBGB.

The bands were unified by their location, camaraderie, and shared musical influences. They would all go on to develop their own styles and many would shift away from punk rock.

While the New York scene was reaching its heyday, punk was undergoing a separate creation story in London.


In the Middle

More, later, about the hate. Next: why do scholars of England’s Jews—historians, literary scholars, and art historians, Jewish and non-Jewish alike—all use state and church archives? Is this a plot to replace Jewish historiography with English historiography, as Pearce claims?

Everyone who studies Anglo-Jewry knows the answer, but Suzanne Bartlet and Patricia Skinner say it best: “Almost all that we know derives from sources produced by non-Jews, and much of what we know comes specifically from the judicial and fiscal records generated by England’s precociously bureaucratized government” (Licoricia of Winchester, p. 5). This is a finely exact description of the available archive.

Medieval England simply doesn’t have a plethora of Jewish-authored archives. Again, Bartlet and Skinner put it succinctly: “Ultimately, the voices we hear of Jews in medieval England are filtered through non-Jewish, and sometimes overtly hostile, sources” (Licoricia of Winchester, p. 10). In fact, England’s governmentality, surveillance, and control of its Jewish population, as demonstrated in those sources, are so intensive and totalizing that it’s not difficult to argue for England’s characterization as a racial state.[3]

Does Pearce know any of this? Apparently not. She’s not a scholar of Anglo-Jewry. Her specialization, and area of interest, is “the Hebrew and Arabic literature of Iberia.”

Invention of Race has won four book prizes, sold thousands of copies, and has been reviewed a dozen times, with virtually every reviewer possessing clear reviewing credentials for addressing the book—by virtue of their familiarity with critical theories of race, or previous scholarship on medieval race, or specialization in the countries, histories, and literatures treated in the book.

By contrast, Pearce, lacking credentials in critical theory, or scholarship on race, or knowledge of the histories, literatures, and cultures of medieval England (cf. Iberia), substitutes a long teaching vignette (six printed pages) to fill the lacuna, as her credential for reviewing a chapter on critical race in England.

Pearce’s vignette claims she teaches students about compassion. Yet, despite the vaunt about teaching compassion, the review is devoid of any of the usual professional respect common among academic colleagues, let alone compassion.

Instead, the tone of Pearce’s review is riddled with condescension, ridicule, and name-calling. My chapter is “a master class in how not to write” (p.154)—an insult most of us would not inflict on an 18-year-old attempting her first freshman composition, let alone address to a colleague. I am jeered at as an inquisitor (p.145, ff.) and likened to a magpie (p.177)—a thieving bird, we note, that steals shiny junk, a mean bird notorious for its loud, idle chatter.

Loftily, she stands as accuser, testifier, and judge: to bear “witness,” she says, to my magpie behavior with her “academic mesirah” (p.181) that will “short-stop” the “neo-colonial, neo-Orientalist discourse” (p.182) that is my book.

So, Why the Hate? Critical Race Theory Today, and the European Middle Ages

Why would anyone produce a 46-page screed of such vitriol?

In addition to my not-whiteness, another difference separates me from the white scholars who have used the same medieval archive. The conceptual scaffold of my book, and its interpretive practices, are informed by a background in critical race theory (CRT)—or more accurately, critical race theories, since there is a spectrum of theories, and more than a single genealogy of critical scholarship on race.

In the past, critical race theories have maintained that race and racisms began only in the modern era—in tandem with, or resulting from, the rise of capitalism, or chattel slavery, or imperialism and colonialism, or class struggle and social war or bourgeois hegemony, the rise of nations and nationalisms, modern state apparatuses, globalization and transnationalism, or any number of other constitutive factors.

By contrast, Invention of Race (and my earlier publications) make/s a sustained argument for the existence of race and racisms in the deep European past, before the modern eras, and before there was a vocabulary of race to name racial phenomena, institutions, laws, and practices for what they were.[4]

The book is thus not only an intervention in scholarship on the European Middle Ages; it is also an intervention in critical race scholarship. This is why Invention of Race is used today not only by medievalists, but also by modernists teaching critical theories of race.

For medievalists today, the subject of premodern race is sometimes confusing. Some are eager to enter the new conversations on medieval race. Others are genuinely puzzled about how the scholarship today differs from earlier scholarship. I list below some useful books to consult, but there is one simple, primary thing to remember.

The word critical, here, in the study of premodern race marks an important watershed—it marks the difference between the premodern race studies of the past, and the premodern critical race studies undertaken today. Critique is involved in the latter, but was often missing from the former.

Premodern critical race studies doesn’t just concern itself with marshaling descriptions of race, or compiling taxonomies of race, or producing summations of race (of the kind Pearce might approve, for instance), but sustains the critical analysis of race in the European Middle Ages.[5]

Critical race scholarship on premodernity analyzes the sources, institutions, infrastructures, practices, technologies, and dynamics of race and racialization, in order critically to assess their ethical, political, and epistemological consequences and impacts.

My old friend Margo Hendricks puts it in her own way, when she distinguishes premodern race studies (PRS) from premodern critical race studies (PCRS), in her keynote for the RaceB4Race conference at the Folger Institute last year, and in a forthcoming article in New Literary History:

Premodern race studies, in my opinion, is fundamentally written by and for white academics….scholars whose publication history shows no attention to “race” have suddenly become experts….PRS assumes no foundational work to the study of race exists….If these scholars recognize the pre-existence of a cohort of Black, Brown, and Indigenous scholars working on the subject, this pre-existence is most often relegated to a footnote entry surrounded by whiteness. Or worse, this body of scholarship is ignored.

Or they might call for a scholar-of-color’s book to be canceled, as Pearce the new “expert” does. Thanks to Margo, and other colleagues of color, I’ve come to understand that this is what white privilege means—the right to bury, ignore, or cancel the work of scholars of color with impunity.[6]

No scholar of color, however senior, is immune from such treatment—especially if they perform “foundational work” in the critical study of race, as Margo Hendricks notes.

After all, we’ve seen critical race theory that is trained on our own time attacked by those occupying the highest rungs of political power: President Donald Trump in the US, and Ministers in the House of Commons in the UK. Given such attacks, why wouldn’t intellectual conservatives in the academy also seize the opportunity to attack the critical analysis of race in premodernity?

Of course they would, and they have. For instance: the people who run the Mediterranean Seminar made Pearce’s book review their October 2020 article of the month, claiming that the race analysis in my book is a “flattening” of religion and ethnicity; praised her review; offered Pearce more space for further words of attack; and distributed it to their membership list.[7]

Early-career medievalists of color have also seen their work censored. An anonymous reviewer for a university press told an untenured medievalist of color—an assistant professor—that if she wants her book for tenure to be published, she must take out every mention of race. Another junior medievalist of color had her article on race rejected out of hand at a journal by yet another anonymous reviewer.

Such hostility to critical race analysis in premodern studies is vicious, but not surprising. Vicious, because nobody is demanding that every medievalist must work on critical race. The right to perform critical race scholarship doesn’t force every medievalist to undertake critical race analysis. All that’s asked for is the freedom to choose one’s own scholarship.

For medievalists who believe that to avoid critical race analysis is to sanitize what they see in their archive, surely the right to perform scholarship of their choice should not, ethically and politically, be denied them?

Must new forms of scholarship that many consider valuable and want to undertake be prevented, attacked, or censored, just to appease those who want to conserve some old ways— their old ways—of doing things? What gives them the right to place a stranglehold on the future?

What the Past Teaches Us about the Future: Feminism, Queer Theory, Critical Sexuality Studies

For those who want to undertake the new work on premodern race, here’s a reminder: feminism, queer theory, and critical sexuality studies also met with resistance in their early days. The hostility of those who want to conserve is thus not surprising: some may remember the harsh critiques levelled at John Boswell, when his Christianity, Social Tolerance, and Homosexuality: Gay People in Western Europe from the Beginning of the Christian Era to the Fourteenth Century first appeared in 1980, four decades ago. Four decades from now, critical race scholarship on medieval Europe may be as commonplace as queer theory, critical sexuality studies, and feminisms are today.

Because it really isn’t possible to turn the clock back. There are now six full-length monographs on race in the European Middle Ages, not to mention PhD dissertations.

More than enough in number for someone to write a real review essay on the landscape of critical medieval race studies today.[8]

Not to mention anthologies, articles, and essays, special issues of journals, and books in the pipeline at the new University of Pennsylvania Press series, RaceB4Race: Critical Studies of the Premodern.[9] Conference panels, workshops, symposia, and whole conferences on medieval and premodern race are increasing, not decreasing.[10]

For those who consider studying religion as a matrix of race-making to be a “flattening” of religion, there are excellent studies by scholars of color who serve up trenchant responses, such as Terence Keel’s Divine Variations: How Christian Thought Became Racial Science and Willie James Jennings’ The Christian Imagination: Theology and the Origins of Race.

Nor is Christianity the only matrix for race-making. Michael Gomez offered richly layered arguments in his lecture on how Islamic sources on the Hamitic Curse, along with climate and zonal theories, enabled Arab and Persian authors of the 10th to the 17th centuries to racialize Black Saharan Africans and slavery in West Africa, in the Race in the Archives series organized for the Center for Medieval and Early Modern Studies at Stanford by CMEMS director Ali Yaycioglu.[11]

Critical work on medieval race is moving beyond Europe and Christendom, as more and more scholars, from distinguished senior academics of color like Mike Gomez to graduate students and early career researchers excited about new work, choose to be part of the collaborative process of co-building, and co-creating, new knowledges, new methods, and new ways of looking and thinking.[12]

It often takes a couple of decades in medieval studies to entrench paradigm-shifting work, but perhaps the paradigms will make way more quickly this time.

Therefore, to the early-career scholars—and others not-so-early in their careers—who are anxiously wondering if they, too, will be savaged if they undertake critical work on race, if they will be prohibited from publishing in journals and by university presses, and who fear a cancel culture initiated by hostile and powerful gate-keepers who are tenured, senior faculty, I say to you: there are some of us working hard to bend the arc of the intellectual universe slowly, but incrementally, toward greater freedom in academic publishing and academic intellectual life, so that you will have shelter and support.

The journey may be long—though I predict, this time, it won’t take a generation to entrench the new work—but you’ll see that the company is good.


Chindōgu: The Japanese Art of Unuseless Inventions

You have definitely seen a chindōgu. They are those ridiculous Japanese inventions designed to solve a particular problem but are, in fact, so clumsy and inelegant that they are an inconvenience to use and generate a whole lot of new problems. A few examples of chindōgu: chopsticks with a miniature electric fan to cool noodles on the way to the mouth; glasses with attached funnels that allow the wearer to apply eye drops with accuracy; tiny umbrellas attached to cameras for taking pictures in the rain; a toilet plunger with a ring at one end that attaches to train-car ceilings and functions as a handrail in crowded carriages; and so on.

“Basically, chindogu is the same as the Industrial Revolution in Britain,” says Kenji Kawakami, who coined the term chindōgu, which means “weird tool” in Japanese. “The one big difference is that while most inventions are aimed at making life more convenient, chindogu have greater disadvantages than precursor products, so people can’t sell them. They’re invention dropouts.”

A 360-degree camera hat for taking panoramic pictures.

Chindōgu are not exactly useful, but somehow not altogether useless either, says Kawakami. He has another word for these silly little contraptions—“unuseless.”

Kawakami started inventing some 30 years ago when he was working as an editor for a popular home shopping magazine called Tsuhan Seikatsu, aimed at countryside-dwelling housewives who liked to shop but found it too inconvenient to get to the cities where the stores were. In one of the issues, Kenji had a few extra pages at the end of the magazine and decided to fill them with some of his bizarre prototypes that the readers couldn’t buy. The Eye Drop Funnel Glasses were one of the first products to feature in the magazine. Kawakami claims that he actually uses this tool to hydrate his eyes without the medicine rolling down his cheek. Another early chindōgu was the Solar-Powered Flashlight, with a huge solar panel that Kawakami built himself. Unlike those available in stores today, Kawakami’s flashlight didn’t come with rechargeable batteries that could be charged during the day and used at night. Instead, it needed full sunshine to function, which rendered the flashlight useless.

His chindōgu were an instant hit, and as readers demanded more, Kawakami was forced to come up with new ideas for the entertainment of his readers. Over the years, he developed a set of rules—the 10 tenets—for proper chindōgu creation.

These ten commandments of chindōgu are as follows:

  1. A Chindōgu cannot be for real use — They must be, from a practical point of view, (almost) completely useless. “If you invent something which turns out to be so handy that you use it all the time, then you have failed to make a Chindogu,” it says.
  2. A Chindōgu must exist — A Chindōgu must be something that you can actually hold, even if you aren’t going to use it.
  3. There must be the spirit of anarchy — A chindōgu must be an object that has broken free from the chains of usefulness. Chindōgu represent freedom of thought and action.
  4. Chindōgu are tools for everyday life — Chindōgu must be useful (or useless) to everyone around the world for everyday life.
  5. Chindōgu are not for sale — Chindōgu cannot be sold. “If you accept money for one, you surrender your purity,” it says.
  6. Humor must be the sole reason for creating a chindōgu — The creation of Chindogu is fundamentally a problem-solving activity. Humor is simply the by-product of finding an elaborate or unconventional solution to a problem.
  7. Chindōgu is not propaganda — Chindōgu should be innocent. They should not be created as a perverse or ironic comment on the sorry state of mankind.
  8. Chindōgu are never taboo — Chindōgu must adhere to society’s basic standards.
  9. Chindōgu cannot be patented — Chindōgu cannot be copyrighted, patented, collected and owned.
  10. Chindōgu are without prejudice — Everyone should have an equal chance to enjoy every Chindōgu.

A baby romper that also functions as a mop.

According to a 2001 article in The Japan Times, Kawakami has made over 600 chindōgu since he began inventing. Yet he doesn’t own any patents and has never made a single yen by selling his creations (see tenets #5 and #9).

“I despise materialism and how everything is turned into a commodity,” the 70-year-old inventor once said. “Things that should belong to everyone are patented and turned into private property. I’ve never registered a patent and I never will because the world of patents is dirty, full of greed and competition.”

However, this has not stopped others from stealing his ideas. One of his inventions, a pair of two-sided slippers, can be bought at a well-known Japanese chain store. “Some people have no principles,” he says in disgust. “They’ll do anything for money.”

What started as a joke is now an art form with over 10,000 practitioners all around the world.

Despite the seemingly universal appeal of his inventions and their purpose of amusement, Kawakami laments that he is sometimes not taken seriously.

“In Europe they treat me as an artist. In Australia and Canada, I’m called a scientist. In China and Hong Kong they wonder why I don’t try to make money from my inventions. But in Japan and the US, they consider me a maker of party goods,” Kawakami bemoans.

This back-scratch guide t-shirt makes scratching your friend’s back easier.

This one is interesting: a selfie stick, published in one of Kawakami’s books as an apparently useless chindōgu. The book came out in the mid-90s, years before selfie sticks became popular.


Why the Inventor of the Cubicle Came to Despise His Own Creation - HISTORY

NEW YORK (FORTUNE Magazine) - Robert Oppenheimer agonized over building the A-bomb. Alfred Nobel got queasy about creating dynamite. Robert Propst invented nothing so destructive. Yet before he died in 2000, he lamented his unwitting contribution to what he called "monolithic insanity."

Propst is the father of the cubicle. More than 30 years after he unleashed it on the world, we are still trying to get out of the box. The cubicle has been called many things in its long and terrible reign. But what it has lacked in beauty and amenity, it has made up for in crabgrass-like persistence.

Reviled by workers, demonized by designers, disowned by its very creator, it still claims the largest share of office furniture sales--$3 billion or so a year--and has outlived every "office of the future" meant to replace it. It is the Fidel Castro of office furniture.

So will the cubicle always be with us? Probably yes, though in recent years individuals and organizations have finally started to chart productive and economical ways to escape its tyranny.

The cubicle was not born evil, or even square. It began, in fact, as a beautiful vision. The year was 1968. Nixon won the presidency. The Beatles released The White Album. And home-furnishings company Herman Miller in Zeeland, Mich., launched the Action Office. It was the brainchild of Bob Propst, a Coloradan who had joined the company as director of research.

After years of prototyping and studying how people work, and vowing to improve on the open-bullpen office that dominated much of the 20th century, Propst designed a system he thought would increase productivity (hence the name Action Office). The young designer, who also worked on projects as varied as heart pumps and tree harvesters, theorized that productivity would rise if people could see more of their work spread out in front of them, not just stacked in an in-box.

The new system included plenty of work surfaces and display shelves; partitions were a part of it, intended to provide privacy and places to pin up works in process. The Action Office even included varying desk levels to enable employees to work part of the time standing up, thereby encouraging blood flow and staving off exhaustion.

But inventions seldom obey the creator's intent. "The Action Office wasn't conceived to cram a lot of people into little space," says Joe Schwartz, Herman Miller's former marketing chief, who helped launch the system in 1968. "It was driven that way by economics."

Economics was the one thing Propst had failed to take into account. But it was also what triggered the cubicle's runaway success. Around the time the Action Office was born, a growing breed of white-collar workers, whose job titles fell between secretary and boss, was swelling the workforce. Also, real estate prices were rising, as was the cost of reconfiguring office buildings, making the physical office a drag on the corporate budget. Cubicles, or "systems furniture," as they are euphemistically called, offered a cheaper alternative for redoing the floorplan.

Another critical factor in the cubicle's rapid ascent was Uncle Sam. During the 1960s, to stimulate business spending, the Treasury created new rules for depreciating assets. The changes specified clearer ranges for depreciation and established a shorter life for furniture and equipment, vs. longer ranges assigned to buildings or leasehold improvements. (Today companies can depreciate office furniture in seven years, whereas permanent structures--that is, offices with walls--are assigned a 39.5-year rate.)
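To make that math concrete, here is a minimal back-of-envelope sketch in Python. It is only an illustration, assuming a hypothetical $100,000 furniture purchase and simple straight-line depreciation (real tax schedules such as MACRS are more involved); only the seven-year and 39.5-year recovery periods come from the article.

```python
# Back-of-envelope comparison of annual depreciation write-offs.
# Assumptions (illustrative only): a hypothetical $100,000 outlay and simple
# straight-line depreciation; the 7-year and 39.5-year recovery periods are
# the figures cited in the article.

def annual_straight_line_deduction(cost: float, recovery_years: float) -> float:
    """Annual write-off under simple straight-line depreciation."""
    return cost / recovery_years

cost = 100_000.0
cubicles = annual_straight_line_deduction(cost, 7.0)       # ~$14,286 per year
built_walls = annual_straight_line_deduction(cost, 39.5)   # ~$2,532 per year

print(f"Cubicles:    ${cubicles:,.0f} written off per year")
print(f"Built walls: ${built_walls:,.0f} written off per year")
```

Under those assumptions, roughly $14,000 a year in write-offs versus roughly $2,500 is the gap behind the upshot described next.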

The upshot: A company could recover its costs quicker if it purchased cubes. When clients told Herman Miller of that unexpected benefit, it became a new selling point for the Action Office. After only two years on the market, sales soared. Competitors took notice.

That's when Propst's original vision began to fade. "They kept shrinking the Action Office until it became a cubicle," says Schwartz, now 80. As Steelcase, Knoll, and Haworth brought their versions to market, they figured out that what businesses wanted wasn't to give employees a holistic experience. The customers wanted a cheap way to pack workers in.

Propst's workstations were designed to be flexible, but in practice they were seldom altered or moved at all. Lined up in identical rows, they became the dystopian world that three academics described as "bright satanic offices" in a 1998 book, Workplaces of the Future.

Designer Douglas Ball, for instance, remembers the first installation of cubicles he created for a Canadian company in 1972. "I thought I'd be excited, but I came out depressed," says Ball, now 70. "It was Dilbertville. I'd failed to visualize what it would look like when there were so many of them."

Having taken over the world, the cubicle defeated several attempts to dethrone it. One of the most ambitious assaults came in 1993, when Jay Chiat, chairman of ad agency Chiat/Day, declared a sort of Bolshevik revolution when he moved his employees into newly renovated space in Venice, Calif. The design "was loungy, like Starbucks," remembers Stevan Alburty, then head of technology. "It was 20 years ahead of its time."

But it had a fatal flaw: No one had a fixed place to work. Employees were expected to park their belongings in lockers and check out laptops every morning as if renting a movie at Blockbuster. It quickly sparked a counter-rebellion--many employees simply stopped coming to the office, preferring to work at home. After the firm was acquired by an advertising conglomerate, employees got workspaces again.

Designers since then have mostly limited themselves to trying to offset the cubicle's most glaring defects. A recent Steelcase offering, the Personal Harbor, can be fitted with its own lighting system, fan, door, and window. Knoll offers the A3 (or anticube), a colony of rounded, podlike structures with translucent mesh coverings for privacy.

Herman Miller, now a $1.5-billion-a-year company, will launch two concepts in June that are the work of designer Ball. He says the new designs are the culmination of more than 30 years of trying to undo his early mistakes. The company won't release many details, but the systems will emphasize color and privacy; Ball says the workstations will be "more capsule-like, or cockpit-like." In all, more than 100 cubicle-variant office systems have come to market over the past three decades.

When openly challenged, the cubicle still gets the last laugh. In California, state-employed attorneys obtained relief from the cube through Title 13.3 of their union contract: "The State agrees to make a reasonable effort to provide private enclosed office space to each permanent full-time attorney who has confidentiality needs." Should an attorney be assigned to "other than enclosed private offices," the union must be notified. Rather than violate the rule, says union president Holly Wilkens, the state sticks some young attorneys in airless closets.

Is that really where we're headed? No, says Stewart Brand, co-founder of the Global Business Network, an Emeryville, Calif., consulting firm that helps companies make long-range plans. Back in the '60s, right around the time Propst unveiled the cubicle, Brand created The Whole Earth Catalog, which became the bible of environmentally aware living and arguably had a much more benevolent effect on American culture.

He says that the most productive people he knows have developed ways to work outside offices, not in them. Brand himself worked out of a converted shipping container in Sausalito for seven years and now commutes to a beached fishing boat a few yards from his house. He sees two workspaces rising up to compete with the modern office: homes and what might be called the third space--i.e., Starbucks.

A living example of work in the third space is Diego Guevara, a 23-year-old appraiser for a large mortgage and insurance company. He camped out on a recent winter afternoon in a Manhattan Starbucks; before him on the counter were his ruggedized computer, a calculator, a Wi-Fi card, a yellow pad, and a grande.

As he used a PDA-cellphone to check on a property, Guevara typed details of the morning's appraisals into the laptop, which would sync the records with a database back at the office in West Orange, N.J. He usually goes to the office only when he needs supplies. And the last time he saw his boss? "Before Christmas," said Guevara, adding that his boss mostly works at home.

If working at home is now part of the zeitgeist, one very large employer that seems increasingly tapped in is the U.S. government. Congressman Frank Wolf, a Republican whose Virginia district is home to many federal worker bees, has made telecommuting his pet project. "There is nothing magic in strapping ourselves into a metal box every day only to drive to an office where we sit behind a desk working on a computer," he told a congressional committee.

Wolf sees telecommuting as a way to decrease traffic, reduce air pollution, increase productivity, and frustrate terrorists. In 2004 he launched a campaign to penalize government agencies by docking funds if they fail to support telecommuting. Now the SEC, the State Department, the Department of Justice, and four other big agencies are required to offer every eligible worker the opportunity to telecommute.

A 2005 survey by Milwaukee's Dieringer Research Group reported that 26 million Americans use broadband to do work from home. Sales reps and consultants have always worked remotely; now finance people, lawyers, administrators, researchers, and creative types can too. Just as infotech has enabled companies to offshore white-collar functions, it also untethers Stateside employees from their cubes.

Coming to the office for meetings and in-person collaboration is still important, of course, but as Brand points out, "People are realizing they don't need face-to-face time all the time."

Remember how economics helped turn the cube into a plague? Now giants like Cisco Systems see "workforce mobility" as a way to cut real estate costs. Thanks to heavy use of mobile technology by employees, says real estate VP Mark Golan, "we discovered that Cisco offices and cubicles went vacant 35% of the time."

By switching to what it calls the Connected Workspace--employees set up work areas wherever they are needed in the building--Cisco says it has raised satisfaction while boosting density. Now 140 employees are able to work comfortably where 88 would work in a traditional workspace.

Hewlett-Packard, which has introduced a similar scheme, expects to cut $230 million out of annual occupancy expenses by mid 2007. The new economics of the office won't actually kill the cube. In fact, U.S. sales of office systems rose 11% in 2005. But as the office occupies a smaller part of companies' budgets, cubes will claim a smaller share of employees' lives.

REPORTER ASSOCIATES Doris Burke and Abrahm Lustgarten contributed to this article.



The Surprising Reason Why Dr. John Harvey Kellogg Invented Corn Flakes

In the nineteenth century, an Englishman writing home remarked on the huge breakfasts available at American hotels. He could choose between breads, pastries, pancakes, fritters, boiled chickens, cold cuts, and beef steaks.

Not every American could eat extravagantly, but those that could chose big, meat-heavy fare. “Hot beefsteak,” Abigail Carroll writes in Three Square Meals, was “a dish without which a proper nineteenth-century middle-class breakfast was increasingly considered incomplete.”

As Americans binged on breakfast, it induced a national case of indigestion and an interest in lighter fare that led to the rise of America’s original health food: cereal.

Cereal would create fortunes and build multinational companies that we still know today. But Dr. John Harvey Kellogg, the inventor of corn flakes, did not care about profits. For him, cereal was not just a health food because it would improve Americans’ digestion. He believed a diet centered on bland foods like cereal would lead Americans away from sin. One very specific sin: masturbation.

For Dr. John Harvey Kellogg, the invention of corn flakes was part of a health movement that he called “biologic living.”

The prompt for Dr. Kellogg’s health movement was a national case of indigestion. “Americans wanted meat, meat, meat. And potatoes. And cake and pie,” Lowell Dyson writes of food preferences in 19th-century America. This was as true of breakfast as it was of dinner. Among the wealthy, steak and pie could be dinner or breakfast.

The results for the nation’s health were not good. Indigestion was endemic. As Abigail Carroll, author of Three Square Meals, has explained, Americans called this indigestion “dyspepsia.” Discussion of dyspepsia was like today’s obesity debates, endlessly written about in magazines and newspapers.

For a number of health reformers, the solution was to create simpler foods. The graham cracker was invented by a dietary reformer named Sylvester Graham in 1827. In 1863, James Caleb Jackson, who ran a health resort, invented the first cereal, which he called “granula.”

Dr. John Harvey Kellogg also ran a health resort, where he treated diseases and ailments with novel ideas like “hydrotherapy” (essentially baths at different temperatures). Dr. Kellogg was a vegetarian, and with the assistance of his brother William Kellogg, he created or invented foods like peanut butter and meatless meats for his patients.

Corn flakes, which he first designed in the 1890s, were his most enduring legacy.

Few people today would eat Kellogg’s corn flakes or Jackson’s granula. They had no sugar or added flavors, and they were so tough that they often cracked people’s teeth.

But in the 1900s, people desperately wanted cereal, and they bought as much cereal as Dr. Kellogg’s health facility could produce. It was an opportunity for Dr. Kellogg to spread his gospel of biologic living.

In dense books and popular lectures, John Harvey Kellogg explained the merits of bland foods like cereal. Writing of Americans’ tendency to eat “with the feeble stomach of a primate” seemingly every kind of food, including new, “artificial foods,” he concluded that "it is no wonder that the human gastric machine has broken down, and that dyspepsia, constipation, and peristaltic woes of various description have become universal in civilized lands.”

Dr. Kellogg’s “biologic living” called for more exercise, more bathing, and eating whole grains and less meat. Like today’s paleo or organic food trends, he portrayed this as a scientific return to natural principles. “To eat biologically,” he wrote, “is simply to eat scientifically, to eat normally.”

Unlike today’s food trends, he also believed that modern diets led people to carnal sins. “Highly seasoned [meats], stimulating sauces, and dainty tidbits in endless variety,” Kellogg wrote, “irritate [the] nerves and… react upon the sexual organs.” Dr. Kellogg wrote as much about the dangers of sex and masturbation as he did about healthy living. Cereal was the dietetic remedy meant to keep Americans’ diets from leading them to sin.

Despite creating a product, corn flakes, that launched a food craze, Dr. Kellogg cared more about this cause than profits. In his lectures, he explained how people could make cereal at home.

“I am not after the business,” he told people. “I am after the reform."

The cereal business quickly got away from Dr. John Harvey Kellogg.

Although Dr. Kellogg attempted to protect his invention with a patent, businessmen quickly realized that they could produce cereal without infringing upon it. Hundreds of companies sprang up near Kellogg’s Michigan health facility—a fact that Dr. Kellogg took personally. After all, of the two most successful cereal companies, one was created by a former patient and the other was founded by Dr. Kellogg’s brother William.

William Kellogg founded the Kellogg Company, and the former patient, C.W. Post, created and sold Grape Nuts.

They succeeded by doing something that Dr. Kellogg despised: adding sugar. The idea had long been a point of contention between Dr. Kellogg and William Kellogg. William believed they needed to make corn flakes taste better, while Dr. Kellogg saw sugar as corrupting his health food. But by the 1940s, all the major cereal companies pre-coated their cereals with sugar.

The other reason cereal succeeded had nothing to do with health. It was the ultimate convenience food, and as Abigail Carroll, author of Three Square Meals, notes, this made it especially appealing across the world as the Industrial Revolution led more and more people to leave farms and take work as employees. They had less time and less access to a kitchen, which made cereal and “ready to eat” breakfasts appealing.

Cereal left a huge mark on the food industry. William Kellogg and C.W. Post were advertising pioneers, spending unheard of sums to advertise their cereal brands and creating some of the first cartoon mascots for their cereals. When C.W. Post died, he had a net worth (in 2016 dollars) of around $800 million.

The motivations of Dr. Kellogg have not gone away. You can see his idea of biologic living mirrored in health trends like the organic movement and paleo dieting, which are in many ways a backlash to the processed food industry that cereal helped create.

Thankfully, Dr. Kellogg’s views on how diet influences our sex lives haven’t seen the same revival.

This article was written by Alex Mayyasi, a Priceonomics Staff Writer. You can read more about the invention of cereal and the history of breakfast here.


Summer Series - The Frankenstein Factor: Inventors Who Regret Their Inventions

This week, we analyze inventors who later came to regret their inventions. Sometimes it's because the product ended up being harmful. Other times it's because of the way their product was used. And in most cases, the creators simply lost control of their creations. We'll look at why the inventor of the K-Cup doesn't own a Keurig machine, why the creator of Mother's Day later tried to have it rescinded and how the Wright Brothers lost control of the airplane. It's one of the most unwieldy aspects of marketing - you create a product, you inform the public, you put it into the marketplace, and it's out of your hands.

This little known movie, released in 1969, changed a fundamental aspect of Hollywood.

Star Richard Widmark wasn't getting along with director Robert Totten and arranged to have him replaced with director Don Siegel.

Both directors claimed Widmark had overruled their decisions, and neither director was happy with the final film.

More important, neither director wanted to take credit for it.

A Director's Guild meeting overseeing the dispute agreed that the film did not represent the vision of either director.

So a proposal was tabled: The directing credit was to be changed to protect the reputations of the filmmakers. Instead of using their real names, a fictional name would be used. The name Al Smith was suggested. But it was discovered there was actually a director named Al Smith, so the Director's Guild settled on "Alan Smithee."

From that point on, whenever a director had lost creative control of a finished film, he could file a grievance, take his name off the film, and "Alan Smithee" would be credited instead.

So within Hollywood, whenever a director's credit said Alan Smithee, it was instantly understood the original director had disavowed the film.

If you search the Internet Movie Database, you'll find over 20 very bad Alan Smithee films.

In 1998, director Arthur Hiller shot a mockumentary on this very subject called An Alan Smithee Film: Burn Hollywood Burn.

The plot revolves around a director named Alan Smithee who directs a film starring Sylvester Stallone. The studio eventually takes control of the film away from Smithee and re-edits it.

Smithee wants to disown the film, and when he tries to take his name off the movie, he discovers that his name is the same pseudonym the Director's Guild uses when a director wants to take his name off a movie. So he has no option but to steal the film and burn it.

But get a load of this: The director of this mockumentary, Arthur Hiller, didn't get along with the producer on the film. The film was taken away from Hiller and re-edited. It was art imitating life imitating art.

So guess what Arthur Hiller did? He took his name off the film.

Which meant that An Alan Smithee Film – in the end - was directed by Alan Smithee.

Believe it or not, there are quite a few Alan Smithees in the world of business, too.

Inventors and business people who created products they later tried to distance themselves from.

Sometimes it's because the product ended up being harmful. Other times it was because of the way their product was used, and in most cases, the creators simply lost control of their creations.

And it just may surprise you to learn what those inventions are…

Architect Victor Gruen had an interesting idea.

As cities started expanding to the suburbs, he wanted to create a place where shoppers could run errands without the drawbacks of driving downtown.

He wanted to model these communal areas on the old town squares of yesteryear, with promenades, green spaces, fountains, supermarkets, schools and post offices. He prioritized pedestrians over cars.

Gruen's creation became known as… the Shopping Mall.

The first one Gruen designed was in suburban Detroit in 1954.

It caught on, and Gruen quickly became one of the busiest architects in the country.

But other cities took Gruen's idea and began twisting it into something he hated and opposed. They took out the green spaces, enclosed the malls, packed them with stores and surrounded them with seas of asphalt parking.

Over time, Gruen went from being the shopping mall's inventor to its most vocal critic.

He called them harmful, hideous, soulless shopping machines that alienated people instead of bringing them together.

The "father of the shopping mall" refused to claim paternity.

To his dying day, Victor Gruen despised what became of his invention.

He wouldn't be the first inventor to feel that way.

One day back in 1995, John Sylvan was sitting in his car outside an ATM when he started feeling ill.

He began experiencing tunnel vision.

He suspected he was having a heart attack.

So he rushed to the nearest hospital.

In the Emergency Room, doctors did a number of tests on Sylvan and determined he wasn't having a heart attack.

So they began asking him questions.

Are you sleeping well? Are you eating properly? Are you exercising?

Then they casually asked him how many cups of coffee he drank a day.

Sylvan answered, "Around 30 or 40."

The doctors just stared at him.

37-year-old John Sylvan was suffering from caffeine poisoning.

But you have to understand something.

Caffeine poisoning was an occupational hazard.

For the three years leading up to that hospital visit, John Sylvan had been trying to revolutionize coffee making.

Previously, Sylvan had been working a low-level job at a tech firm in Massachusetts. Part of the job entailed going around collecting money from his co-workers for the office coffee fund.

More than that, he hated the office coffee. Everyone did.

And the coffee vendors not only delivered bad coffee every week, they had a monopoly on the office market.

Every day, bad coffee would sit in the pot, growing stale and cold.

As coffee companies will tell you, the biggest consumer of coffee… is the kitchen sink.

So Sylvan had an idea to create single-serve coffee pods. That way, people could brew one cup of coffee of their choosing. Coffee and water wouldn't be wasted. All he needed to do was invent a machine that could brew single cups.

First, Sylvan created single coffee pods, then tried prototype after prototype of coffee machines to brew them. Many of them exploded, plastering his kitchen with coffee grounds. Sylvan was also the official coffee taster – hence the 40 cups per day.

When he finally managed to create a semi-reliable brewing machine, Sylvan christened the company Keurig - which was a Dutch word for "excellence."

The coffee pods were to be called K-Cups.

When he started looking for investors, no one was interested. As a matter of fact, major coffee companies told him his invention would never catch on.

But Sylvan believed in the potential of single-serve coffee pods. Even if he just managed to capture a fraction of the $40 billion coffee market, it would mean untold millions.

And Sylvan had his eye on the office market.

Eventually, Keurig found investors.

The plan was to make inexpensive coffee brewing machines.

The real money was in the K-Cups.

Early Keurig machines kept breaking down. But an interesting thing happened – when the machines broke, office workers would beg for a replacement.

The convenience was catching on – and catching on in a big way.

Sales started to explode. But so did the relationship between Sylvan and his investors. It got so bad that Sylvan left the company in 1997, selling his shares for just $50,000.

By 2010, Keurig was on track to sell 3 million K-Cups.

By 2014, that number jumped to 9.8 billion.

The reason: The company had cracked the home market.

But even though his idea became a multi-billion dollar operation, Sylvan doesn't look back with pride.

The problem: Those 9 billion K-Cups aren't biodegradable and can't be recycled.

Founder John Sylvan never imagined K-Cups would be used outside offices.

But today, 40% of Canadian homes and 25% of American ones have single-serve coffee makers in their kitchens.

Recent estimates say the amount of non-recyclable K-Cups currently in landfills could circle the Earth more than 12 times.
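As a rough sanity check on that estimate, here is a minimal sketch in Python. The cup width of about 5 cm is an assumption of mine, not a figure from the source; the 9.8 billion cups comes from the sales figure above, and Earth's equatorial circumference is roughly 40,075 km.

```python
# Back-of-envelope check: can ~9.8 billion K-Cups circle the Earth 12+ times?
# Assumptions (illustrative only): cups laid end to end, each about 5 cm wide.

k_cups = 9.8e9                   # annual K-Cup figure cited above
cup_width_m = 0.05               # assumed width of one K-Cup, in meters
earth_circumference_km = 40_075  # Earth's equatorial circumference

total_length_km = k_cups * cup_width_m / 1000    # ~490,000 km end to end
laps = total_length_km / earth_circumference_km  # ~12.2 trips around the Earth

print(f"About {total_length_km:,.0f} km of K-Cups, or roughly {laps:.1f} laps around the Earth")
```

Under those assumptions, the arithmetic lands a little over 12 laps, which is consistent with the estimate quoted above.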

And that's why John Sylvan regrets his invention.

While Keurig says it is working on a sustainable K-Cup, Sylvan doesn't believe the product will ever be fully recyclable.

He says he feels bad sometimes that he ever invented it.

And today, John Sylvan doesn't even own a Keurig coffee maker.

Back in the late 1800s, Milton Wright was a travelling preacher.

He would often bring home toys for his children from his travels.

One day, he brought home a toy whirlybird. Made of cork, bamboo and paper, the whirlybird was powered by a rubber band, which twirled its blades and made it airborne.

It fascinated his two sons… Orville and Wilbur.

When they grew up, Orville and Wilbur started a bicycle repair shop and began coming up with their own designs.

The Wright brothers were tinkerers.

But they never lost their fascination with flight.

Around the world, some other inventors were having moderate success with gliders.

That's when the Wright brothers decided to experiment with motorized flight.

Through many prototypes and designs, the Wright brothers continued to refine their idea.

Then, on December 17th, 1903, Orville and Wilbur Wright made history with the first powered, sustained and controlled airplane flight, remaining airborne for 59 seconds and covering a distance of 852 feet.

It was an extraordinary achievement.

Surprisingly, their invention didn't find a receptive audience in the U.S. Many people didn't believe the accomplishment. The press said the flights were too short to be important. One headline said, "Flyers or Liars?"

So Wilbur travelled to Europe and found a much more receptive audience there. Almost immediately, they started selling planes in Europe.

Eventually, the Wright brothers sold their first airplane to the U.S. army in 1909. Even though Wilbur died in 1912, Orville continued with the company. He sold 14 more planes to the army for "observation" missions.

Orville truly believed airplanes would prevent wars. He felt with aerial observation, it would be impossible to have surprise attacks. And because both sides would know what the other was doing at all times, the desire for war would wane.

However, the military had other ideas.

In 1911, Italy became the first country to use airplanes in warfare. It was in a war with Turkey and dropped hand grenades on enemy troops from the sky.

While it's kind of shocking to imagine, early aerial dogfights were really pistol duels.

Pilots actually carried handguns and rifles to try and shoot other pilots.

In one noted encounter in 1914, a British airman ran out of ammo and simply threw the handgun at a German pilot.

By the end of WW1, there were observation planes, fighter planes and multi-engine bombers that could carry thousands of pounds of bombs.

Orville Wright was mortified at the destruction his beloved planes were inflicting.

During World War II, over 300,000 warplanes were built.

On his 74th birthday in 1945, Orville Wright's life-long optimism about the role of the airplane as an instrument of peace had faded.

While he loved his invention, he deplored the destruction it had caused.

He said: "We dared to hope we had invented something that would bring lasting peace to the Earth. But we were wrong. We underestimated man's capacity to hate and to corrupt good means for an evil end."

It would be a sentiment shared by quite a few other inventors in history…

Even as a kid, Philo Farnsworth was fascinated by electricity.

At 13 years of age, he figured out how to use electricity to operate his farm's washing machine, sewing machine and barn lights.

One day, he found a stash of Popular Science magazines in the attic of his family's Idaho farmhouse, and read about the possibility of television for the first time. The thought of sending pictures through the air enthralled him.

By age 14, he had theorized the principles of electronic television.

Everyone in his family had farm chores, and young Philo's was to plow the family potato field - which gave him a lot of time to think.

One day, he stopped to survey the parallel rows of crops behind him. In that moment, he realized that a large image could be composed from smaller repeating lines if they were viewed from a distance.

It was a profound insight.

He noodled that insight for the next few years, and in 1927 – at the age of 21 - he generated the first electronic television image through the air – from one room to another – by scanning the image in a series of lines going back and forth. A breakthrough inspired by his potato plowing.

Television as we know it was born that day.

Philo Farnsworth had a hope for his invention.

He saw television as a marvellous teaching tool that could help eliminate illiteracy. He wanted it to allow people to see and learn about each other. That way, differences could be solved around conference tables without going to war.

But it didn't turn out that way.

When Philo Farnsworth looked back at his invention many years later, he wasn't a happy man.

He felt he had created a monster.

He believed very few people were being educated.

That the world's problems had not been solved.

He believed people wasted their lives spending so much time watching television because there was nothing worthwhile on it. He regretted his wonderful invention.

Philo Farnsworth lived until 1971. When he died, the average TV set still contained over 100 components he had patented.

By that time, almost every house in the nation had a television set.

Except one. Philo Farnsworth never allowed a TV set into his home.

When WW2 ended, industry boomed in North America.

With that came expanding workforces.

Most office spaces at that time were open bullpens, with only executives enjoying offices with doors.

Industrial designer Robert Propst felt the open-concept office was a wasteland. He believed it sapped vitality, blocked talent and undermined effectiveness, health and motivation.

So in 1968, he offered a better solution.

He came up with a flexible, three-walled design that could be re-shaped to any given need. It included multiple work surfaces, and the moveable partitions provided a degree of privacy with a place to pin up works in progress. It let companies react to change quickly and inexpensively.

He called his new design "Action Office."

The world called it cubicles.

Initially, cubicles launched to great reviews. People who had worked in noisy, open areas welcomed the change.

But that applause didn't last long.

Soon companies looking to save money began cramming a lot of people into small spaces.

The cubicles got smaller and smaller… and smaller.

Robert Propst didn't like what he saw. First, cubicles were never designed to be square. They were meant to be fluid and interesting.

Secondly, his movable walls were designed to be raw material to be built on. But office managers saw them as finished furniture.

Where the Action Office was meant to be shapeshifting, motivating and inspiring, cubicles ended up being boxy, boring and soulless.

Propst was outraged. He said the "cubicle-izing" of people in modern corporations was "monolithic insanity."

He said the egg-carton geometry created "barren hellholes" and a "rat maze of boxes."

Even though it is hated by workers and cursed by interior designers, the cubicle still claims the largest share of office furniture sales to this day.

By the time Robert Propst died in 2000, over 40 million people were working in cubicles.

It would be the biggest regret of his career.

One day at Sunday School, Anna Jarvis's mother told stories about notable mothers in the Bible, ending the lesson with a prayer that someday, someone would create a day to celebrate all that mothers have done for humanity.

That lesson had a profound impact on Anna.

When her mother passed away years later, Anna Jarvis was devastated, and decided to work to promote a day that would honour all mothers.

In 1908, Anna celebrated the first Mother's Day with a speech in the church where her mother had taught. She designated white carnations as a symbol of a mother's love, as carnations were her mother's favourite flower.

The concept of Mother's Day caught on quickly because Jarvis was a zealous letter writer. She wrote to the President, she wrote to politicians, she wrote to dignitaries.

She was soon assisted by deep-pocketed backers like John Wanamaker of Wanamaker's Department Store, and H.J. Heinz of ketchup fame.

The floral industry fully supported the movement, and Anna Jarvis accepted their donations and spoke at their conventions.

In 1914, President Woodrow Wilson signed legislation officially designating the second Sunday in May as Mother's Day.

Anna Jarvis had finally realized her dream.

But that dream started becoming a cash cow for corporations.

In the beginning, carnations cost half a penny each. Four years later, florists were charging 15 cents apiece.

Greeting card companies started issuing Mother's Day cards. The confectionary industry began creating Mother's Day chocolates.

Soon, Anna Jarvis quit her job as the first female advertising editor at an insurance company to campaign full time against the commercialisation of Mother's Day.

To her, Mother's Day was to be a day of sentiment.

She encouraged people to spend the day with their mothers or write them loving letters.

Now all she saw was profiteering.

Beginning in 1920, she urged people to stop buying flowers. She couldn't stand those who sold or used greeting cards.

She turned against her commercial supporters. One day while dining in Wanamaker's department store, she saw they were offering a "Mother's Day Salad." She ordered it, dumped it on the floor…

…left the money for it and marched out.

She threatened lawsuits. She tried to trademark a carnation with the words "Mother's Day" but was denied.

Jarvis referred to florists, greeting card companies and candy makers as "charlatans, bandits, pirates, racketeers, kidnappers and termites that would undermine with their greed one of the finest and noblest of celebrations."

FTD, the floral company, offered her a lucrative commission on the sale of all Mother's Day carnations as a peace offering – which only infuriated her further.

She spent the following years going door-to-door collecting signatures to rescind Mother's Day.

Older, worn and frail from the long fight, Anna Jarvis spent her last days deeply in debt, living in a sanatorium.

She regretted the commercialisation until the day she died in 1948.

Anna Jarvis was the mother of Mother's Day, but never married and never became a mother.

And she was never told one interesting fact:

The bill for her time in the sanatorium was paid for by a group of grateful florists.

When directors lost control of their films, they were able to take their name off the credits, use the Alan Smithee pseudonym, and walk away anonymously.

But inventors rarely get that option.

Victor Gruen's shopping mall became a suburban cliché. Orville and Wilbur Wright's invention has become a big chapter in military history. Philo Farnsworth's invention was often referred to as an idiot box. Robert Propst's cubicles have been called "satanic offices." And Anna Jarvis's beloved Mother's Day has turned into a $21 billion sales frenzy.

That was the consistent theme today: each inventor lost control of their creation. And the way their inventions went on to be used and misused broke their hearts.

That's one of the most uncontrollable aspects of marketing. You create a product, you inform the public, you put it into the marketplace, and then it's out of your hands.

The world will do with it what the world wants. As they say, "The herd will be heard."

It makes you wonder what kind of world it might have been if only we had listened to those inventors. But it's hard to hear them…



The original showers were neither indoor structures nor man-made but were common natural formations: waterfalls. [3] The falling water rinsed bathers completely clean and was more efficient than bathing in a traditional basin, which required the manual transport of both fresh and waste water. Ancient people began to reproduce these natural phenomena by pouring jugs of water, often very cold, over themselves after washing. There is evidence of upper-class Egyptians and Mesopotamians having indoor shower rooms where servants would bathe them in the privacy of their own homes. [4] However, these were rudimentary by modern standards, with basic drainage systems, and water was carried, not pumped, into the room. The ancient Greeks were the first people to have showers. Their aqueducts and sewage systems made of lead pipes allowed water to be pumped both into and out of large communal shower rooms used by elites and common citizens alike. [5] These rooms have been discovered at the site of the city of Pergamum and are also represented in pottery of the era. The depictions are very similar to modern locker-room showers, and even included bars for hanging up clothing. [6] The ancient Romans followed this convention; their famous bathhouses (thermae) can be found all around the Mediterranean and as far afield as modern-day England. The Romans not only had these showers but also believed in bathing multiple times a week, if not every day. The water and sewage systems developed by the Greeks and Romans broke down and fell out of use after the fall of the Roman Empire.

Modern showers

The first mechanical shower, operated by a hand pump, was patented in England in 1767 by William Feetham, a stove maker from Ludgate Hill in London. His contraption used a pump to force water into a vessel above the user's head; a chain was then pulled to release the water from the vessel. Although the system dispensed with the servant labour of filling and pouring buckets of water, it failed to catch on with the rich, as there was no way to pipe hot water through it. The system also recycled the same dirty water through every cycle.

This early start was greatly improved upon in the anonymously invented English Regency shower design of circa 1810 (there is some ambiguity among the sources). [3] The original design was over 10 feet (3 m) tall and was made of several metal pipes painted to look like bamboo. A basin suspended above the pipes fed water into a nozzle that distributed it over the user's shoulders. The water on the ground was drained and pumped back through the pipes into the basin, where the cycle would repeat. The original prototype was steadily improved upon in the following decades until it began to approximate the modern shower in its mode of operation. Hand-pumped models became fashionable at one point, as did adjustable sprayers offering different water flows. The reinvention of reliable indoor plumbing around 1850 [7] allowed free-standing showers to be connected to a running water source, supplying a renewable flow of water. Modern showers were installed in the barracks of the French army in the 1870s as an economic hygiene measure, under the guidance of François Merry Delabost, a French doctor and inventor. [8] As surgeon-general at Bonne Nouvelle prison in Rouen, Delabost had previously replaced individual baths with mandatory communal showers for use by prisoners, arguing that they were more economical and hygienic. [9] First six, then eight shower stalls were installed. The water was heated by a steam engine, and in less than five minutes up to eight prisoners could wash simultaneously with only twenty liters of water. The French system of communal showers was adopted by other armies, the first being that of Prussia in 1879, and by prisons in other jurisdictions. They were also adopted by boarding schools before being installed in public bathhouses. The first shower in a public bathhouse was installed in 1887 in Vienna, Austria. In France, public bathhouses and showers were established by Charles Cazalet, first in Bordeaux in 1893 and then in Paris in 1899. [10]

Domestic

Domestic showers are most commonly stall showers or showers over a bathtub. A stall shower is a dedicated shower area which uses a door or curtain to contain water spray. The shower over a bathtub saves bathroom space and enables the area to be used for either a bath or a shower and commonly uses a sliding shower curtain to contain the water spray. Showers may also be in a wet room, in which there is no contained shower area, or in a dedicated shower room, which does not require containment of water spray. Most domestic showers have a single overhead shower head, which may be adjustable.

Public

Many modern athletic and aquatic facilities provide showers for use by patrons, commonly in gender segregated changing rooms. These can be in the form of individual stalls shielded by curtains or a door or communal shower rooms. The latter are generally large open rooms with any number of shower heads installed either directly into the walls or on posts throughout the shower area. Open showers are often provided at public swimming pools and at popular beaches. Military forces around the world set up field showers to enable the washing away of dangerous residue from modern weapons such as caustic chemicals, deadly biological agents, and radioactive materials, which can harm forces on both sides of a conflict. [11]


The Woman Who Invented the Bag That Keeps Your Pizza Warm

Ingrid Kosar always dreamed about running her own business. She didn't know what kind of company it would be, but she liked to picture herself carrying a little briefcase. As it turns out, a very different kind of bag would define her career. It's a bag that appears on doorsteps millions of times a week for Friday family movie nights and college study sessions.

It's the insulated pizza delivery bag, and Ingrid Kosar invented it.

"There is a need in certain fields for soft-sided thermally insulated carrying cases," reads Kosar's patent from 1984, one of three that she earned. "One particular application is the ready-to-eat pizza field, where hot pies are packaged in rectangular cardboard cartons and delivered several cartons at a time to customers' homes or offices."

Being the first to patent an insulated bag for pizza secured Kosar's place in the history of a beloved American food, but it didn't shield her from all the challenges to come, including a wave of cheaper imported bags and a protracted slowdown in sales during the latest recession. Today, Kosar, 65, is hunkered down in a small office 45 miles northwest of Chicago with a small group of longtime employees and her dog, Duke, planning the comeback of Thermal Bags by Ingrid. Her story is one of survival, adaptation and a business owner making what's probably her last stand before retirement.

" I've made all the mistakes, but somehow we've survived them. I don't know how exactly, but I think it's just sheer willpower," says Kosar. "Bill [her late business partner] and I used to say — and this is something terrible to say, but we used to say we're too stupid to quit. We just can't. But it's never been an easy thing because when somebody buys something and they don't buy it for another four or five years, it's easy to get disheartened or to have a very difficult cash flow….You get a big order and then you're like, 'Okay, where's the next one?' And you just have to keep really beating the bushes."

Kosar's signature bags, which are sewn in the company's workshop by a husband and wife team, sandwich a layer of food-safe polyester insulation between 1,000-denier Cordura nylon on the outside and a slightly thinner nylon interior lining. She has added other features over the years, like a see-through pocket on the outside to hold a restaurant ticket, a reflective safety strip, and an underside wrist strap that allows the delivery person to hold the bag like a tray. The bags are built to withstand several years of spills, leaks, and getting bounced around in cars.

The thermal bag's three layers – exterior nylon, polyester insulation and interior nylon – are glued together before being cut to size.

"She was domestic and her bags were always 25 percent higher than anywhere else, but she was quality," says "Big Dave" Ostrander, a pizza industry consultant who owned a pizzeria in Oscoda, Mich. for 25 years. "She was like Craftsman tools… guaranteed but expensive. Weɽ buy one or two Ingrid bags and weɽ buy six or seven cheapies for the price of three of hers."

The expiration of Kosar's three patents in the early 2000s eased the way for lower-priced products and Chinese imports. Recognizing that price-conscious restaurants didn't want to spend top dollar on bags that get left behind on porches or repurposed as beer caddies by drivers, Kosar started importing Chinese-made products about 15 years ago. They're sold alongside her own bags at a substantially lower price. An imported sleeve that can fit two 16-inch pizzas starts at $16.99, while one with the same capacity from Kosar's shop is $58.99.

Some of Kosar's most loyal customers are happy to pay the premium. Bob Beyer, the general manager of RiverView Restaurant & Tavern in Algonquin, Ill., discovered Thermal Bags by Ingrid at a pizza trade show and estimates that he's used Kosar's bags for all but two of his restaurant's 25 years. In his experience, cheaper fabrics crack in cold weather and don't retain heat.

"We run eight to 12 drivers on a Friday or Saturday night, and sometimes they leave them on top of the car and go around the block," says Beyer, whose restaurant delivers 500 to 600 pizzas a week in addition to steak, ribs, chicken and other items. "They find them! You have to clean them off, but they're still good."

Bags to Boxes to Boxes and Bags

Growing up in the Chicago suburb of Des Plaines, Kosar looked up to her parents, who were both small business owners. Her father was a certified public accountant who ran his own firm before passing away when she was young; her mother kept the building and turned it into a beauty salon. Kosar's own entrepreneurial journey started at a craft fair in the early 1980s, when she spotted a "little bag made out of cotton with some padding and stuff for a lunch bag." At that time, she was a buyer for a company that made steel equipment like shafts for oil rigs, and she had no experience with the pizza or restaurant industry, other than as a consumer.

"I just thought, well, I hate getting it cold," Kosar recalls. "I don't know, I was at that age when I ate a lot of pizza."

At the time, pizzerias were relying on corrugated cardboard boxes to keep their delivery pies warm. Those boxes were already an improvement over the industry's earliest mode of transport, which was a cardboard cake circle slid into a paper bag. Restaurants later adopted paperboard bakery boxes, but these containers were designed to hold room-temperature pastries rather than steaming hot pizzas, and they couldn't be stacked without collapsing.

Domino's Pizza, which was founded in Michigan in 1960 and last year commanded a 24 percent share of the U.S. pizza delivery market, is credited for standardizing the modern pizza box. The new cartons, while sturdier and stackable, still weren't great at keeping pizzas hot and crispy during deliveries.

"Pizza is bread, but it's also moist things like tomatoes and toppings and cheese," says Scott Wiener, a pizza box collector who authored the book "Viva La Pizza! The Art of the Pizza Box" and leads pizza tours in New York City. "You have this issue of extra moisture falling onto crust, which is usually pretty crunchy, but it will soften the crust. And that's the big issue that happens with boxes….Since it's made of paper, there's only so much heat you can trap."

Pizzerias tried boosting heat retention in a variety of ways, including wrapping the boxes in blankets. Ostrander, who delivered pizzas as a high school student, remembers using a metal heater box the size of a mini refrigerator that was outfitted with shelves and a lit can of Sterno, and placed somewhat precariously in the back seat.

"You've got an open flame, yeah," Ostrander says. "We were told, 'If you hit the brakes real hard, it could get really ugly. If you hit a telephone pole, bail.' That was about the extent of our training."