The average American child eats about 1,500 peanut butter and jelly sandwiches before graduating from high school. That works out to roughly one sandwich every four or five days. Americans eat a lot of peanut butter. Besides being popular and delicious, peanut butter has also had a tremendous impact on how foods are made and labeled today. Thanks to the “Peanut Butter Hearings,” we can now be reasonably sure that what we think we are eating is actually what we are eating.
Contrary to popular belief, peanut butter was not invented by George Washington Carver. In fact, around the 14th and 15th centuries, the Aztecs of Mexico made peanut paste by mashing up roasted peanuts, and it’s possible peanut butter pre-dates even this. More recently, while Carver developed innovative ways to cultivate and use peanuts (among numerous other innovations in various fields), it was actually Canadian Marcellus Gilmore Edson, a Montreal native, who applied for US patent 306727 in 1884, when Carver was about 20 years old. The patent described a process of milling roasted peanuts until the peanuts reached “a fluid or semi-fluid state” to form a “flavoring paste from said peanuts.” In other words, peanut butter.
In 1898, John Kellogg (yes, of cereal fame) received a patent for a “process of producing alimentary products” in which he improved on this food item by using boiled peanuts (instead of roasted), giving the paste the consistency of “hard butter or soft cheese.” Kellogg thought so highly of his new product that he served it to the residents of his somewhat infamous religious healthy-living retreat, the Battle Creek Sanitarium.
It was at C.H. Sumner’s concession stand at the 1904 World’s Fair in St. Louis that peanut butter was first introduced to a mass audience. He apparently sold over $700 worth of peanut butter there (about $18,000 today), while also selling other recently introduced (at least to a worldwide audience) foods like hot dogs in buns and ice cream in cones. Krema Products in Columbus, Ohio began mass producing peanut butter in 1908 and, to this day, is the oldest peanut butter manufacturing company still in operation. By the time peanut butter and jelly sandwiches became a mainstay of the American soldier’s diet during World War II (see: The Surprisingly Short History of the Peanut Butter and Jelly Sandwich), peanut butter had already become a staple in the American kitchen cabinet.
The United States government first became (at least officially) concerned about what was put into its citizens’ food way back in 1862, when President Abraham Lincoln appointed Charles M. Wetherill to oversee the chemical division of the newly formed Department of Agriculture. A noted chemist, Wetherill made his first project “a chemical study of grape juice for winemaking,” to decide whether adding sugar to increase alcohol content should be considered “adulteration.” He determined it should not. The study also alluded to “problems of food preservation and uses of chemical preservatives.”
The Pure Food and Drug Act of 1906 (or, as it was more widely known at the time, the Wiley Act) was the next big step in protecting American consumers from mislabeled (or unlabeled) food products. Pioneered by Harvey W. Wiley, chief chemist of the Department of Agriculture and crusader against food adulteration, the act fought false advertising, mislabeling, and adulteration of foods and medicines. It also prohibited interstate commerce in items that were not properly labeled. The opening line of the act reads,
“For preventing the manufacture, sale, or transportation of adulterated or misbranded or poisonous or deleterious foods, drugs, medicines, and liquors, and for regulating traffic therein, and for other purposes.”
Soon, the FDA – the Food and Drug Administration (which operated under several different names until 1927) – was formed to regulate and enforce the law.
While the FDA did an admirable job trying to regulate labeling and prevent food “adulteration” (the addition of a non-food item to increase the weight or quantity of a food item, often at the expense of its actual quality), manufacturers found loopholes. As the FDA points out, the frozen foods industry that cropped up and prospered after World War II consumed much of the FDA’s energy and manpower. With new products like frozen TV dinners (see: The Origin of the TV Dinner), freeze-dried coffee, and “instant chocolate drink” being introduced to the market, the FDA had to figure out what constituted food adulteration and what a label on these types of foods should and should not say. Stretched thin, the FDA at this time allowed manufacturers of foods that already existed to tweak their recipes without needing the FDA’s approval. This is how we get to the infamous peanut butter hearings.
Since the 1940s, the peanut butter industry had been asking the FDA whether the addition of glycerin (a sugar alcohol that can act as a sweetener and food preservative) constituted food adulteration. The FDA responded that peanut butter “is generally understood … to mean a product consisting solely of ground roasted peanuts, with or without a small quantity of added salt.” So if glycerin was added, it had to be on the label.
With Jif-brand peanut butter entering the market in 1958 and quickly becoming a major competitor to the other main peanut butter brands, Skippy and Peter Pan (all three still exist today), manufacturers found other ways to grow their bottom line while still putting “peanut butter” on the jar. For instance, prior to the late 1950s, the hydrogenated oil used to give the product its consistency was peanut oil. In 1958, manufacturers began using other, cheaper hydrogenated oils, like cottonseed, rapeseed, canola, and soy, instead.
Jif, in an effort to overtake Skippy and Peter Pan, added sweeteners and reduced the actual peanut content to improve the flavor and increase the profit margin. According to a lab study (granted, one run by Skippy’s parent company, Best Foods), Jif peanut butter contained 25 percent hydrogenated oil and only 75 percent actual peanuts. This greatly concerned the FDA and other consumer groups.
In 1958, the Food Additives Amendment was passed, which established that chemicals or substances “generally recognized as safe” could be used in food without further testing. Its Delaney Clause stipulated that any additive found to cause cancer in man or animal could not be used at all. Of course, this would lead to big issues down the road, but it allowed the FDA to set standards on the amount of hydrogenated oil used in peanut butter.
A 1959 press release noted that mass-produced peanut butters had, on average, reduced their peanut content by about 20 percent, which the FDA deemed unacceptable. In response, the FDA set the standard at 95 percent peanuts and 5 percent “optional ingredients including salt, sugar, dextrose, honey, or hydrogenated or partially hydrogenated peanut oil” in order for a product to be called “peanut butter.” This did not sit well with the peanut butter industry; as Consumer Reports put it, “The Peanut Butter Manufacturers Association, whose members did not want to miss out any cost-cutting opportunities, opposed this standard.”
For the next 12 years (yes, years), the peanut butter case and the subsequent hearings (the Peanut Butter Hearings) would embroil the FDA and peanut butter manufacturers in a heated courtroom feud. Back and forth they went, negotiating peanut percentages to determine when something stopped being “peanut butter” and started just being a “peanut spread.”
In 1961, the FDA agreed to lower the threshold to 90 percent to hurry along a compromise, but the manufacturers still disagreed. So the FDA instead announced that the “issue warranted further study.” Negotiations continued for another 10 years. The FDA’s own history of the case notes, “A prominent attorney on the case wryly observed that the peanut butter standards put many lawyers’ children through college.”
In 1965, dramatic public hearings were held, with the high-powered lawyers of the peanut butter manufacturers on one side and consumer activist groups (very much encouraged by the FDA) on the other, the latter led by Ruth Desmond, who had become known as the “Peanut Butter Lady.” The hearings were sensational and widely covered by the press, full of human-interest detail (like Desmond making dinner for her husband every day before going to court); they lasted five months and produced over 8,000 pages of transcript. Still, it would take another five years for the matter to be settled.
In 1968, the FDA announced that its findings placed the line between peanut butter and peanut spread at 90 percent peanuts, 10 percent additives. After a long appeals process, the new standard went into effect on May 3, 1971. So, after vast sums of taxpayer dollars and years of legal wrangling, from that point forward peanut butter officially had to be 90 percent peanuts. If it wasn’t, it could still be sold, but it had to be called “peanut spread” rather than “peanut butter.” The same went for certain other foods, like jellies and jams, which were also required to meet similar standards. This standard remains in effect today.
In the book Creamy and Crunchy: An Informal History of Peanut Butter by Jon Krampner, Ben Gutterman, the official in charge of arguing the FDA’s case, commented: “If we had said eighty-three, they’d have gone to eighty. They were saying, ‘Nutritionally, it’s the same. Price-wise, it’s the same.’ We were asking, ‘But when does it stop being peanut butter?’”
In the end, the lengthy and extremely expensive battle over what constitutes peanut butter vs. peanut spread shifted views on the food standards program in the United States, which in turn spurred new regulations for food labeling, as well as a General Counsel review to ensure food regulation practices would not interfere with the creation of new types of food products. As law professor Richard Merrill noted, “We conclude[d] that regulation should shift away from controlling food composition and focus on providing consumers with more complete information about foods.”
- Pringles were originally called “Pringles Newfangled Potato Chips.” However, Pringles contain only about 42% potato-based content, with most of the rest coming from wheat starch and various types of flour, including corn and rice flour. Thus, the U.S. Food and Drug Administration made the company change the name because the product didn’t technically meet the definition of a potato chip, and allowed the word “chip” only in very restrictive ways. Specifically, if they wanted to continue using “chip,” they could only say “Pringles Potato Chips Made From Dried Potatoes.” Not being too fond of this requirement, the company changed the name slightly, to “potato crisps” rather than “potato chips.” Today, of course, most people just know them as “Pringles.”
- While Procter & Gamble initially argued that Pringles were in fact “chips” in the U.S., it took a different tack in the U.K. To avoid a 17.5% Value Added Tax (VAT) there, Procter & Gamble argued that Pringles should be considered a cake rather than a “crisp”: since only 42% of the product was made from potato and it is fashioned from dough, it should not be subject to the tax on chips. After all, that’s why the U.S. Food and Drug Administration had previously made the company stop calling them “chips.” The company initially won in the High Court, and Pringles were briefly considered a cake in the U.K. However, Her Majesty’s Revenue & Customs appealed the decision and, in 2009, the ruling was reversed and the company had to start paying the VAT.