In 1856, an amateur chemist named William Henry Perkin mixed a batch of chemicals that he hoped, in vain, would yield the malaria drug quinine. When Perkin’s failed experiment turned purple, a hue so vivid that it could stain silks without fading, he realized he’d stumbled upon a different marvel of modernity: a commercially viable synthetic dye, the first of a new generation of chemicals that would revolutionize the way humans colored their clothes and, soon after, their food.
The edible versions of the chemicals, in particular, were a revelation, offering food manufacturers “cheap and convenient” alternatives to pigments squeezed painstakingly from natural sources such as plants, says Ai Hisano, a historian and the author of Visualizing Taste: How Business Changed the Look of What You Eat. Dyes could keep peas verdant after canning and sausages pink after cooking; they could turn too-green oranges more orange and light up corner-shop candy displays. By the Second World War, synthetic dyes had become, as one grocer put it, “one of the greatest forces in the world” in the sale of foods. And the more foods the chemicals were introduced to, the more the chemicals came to define how those foods should look: the yellow of butter, the crimson of strawberry Jell-O.
But after hitting a mid-20th-century peak, the roster of synthetic dyes used in Western foods began to shrink. In recent years, European countries have added warning labels to the products that contain them; the United States has whittled down its once-long list of approved artificial food dyes to just nine. The FDA is now reviewing a petition to delist Red No. 3, which colors candy corn, conversation hearts, and certain chewing gums and cake icings; California and New York are mulling legislation that could ban the additive, along with several others, by 2025.
The concern is that the dyes add not just colors but a substantial…