Blog

How to Shop in Today’s Grocery Store

Have you recently found yourself at a grocery store, staring at a shelf of ten different jars of strawberry jelly or bottles of Italian dressing, only to find several minutes have passed and you’re still no closer to making a decision? According to the Food Marketing Institute, the average number of items carried in a supermarket in 2017 was 30,098. The ability to choose from so many options allows customers the freedom to make purchasing decisions that best fit their priorities, such as price, taste, or sustainability; but it can also make grocery shopping quite overwhelming. With so many choices, how is one supposed to turn a complex environment into a simple one? Consider the following tips for navigating the 21st century grocery store like a pro.

Remove the virtual caution tape from the packaged food aisles.

A frequently heard tip for healthy grocery shopping is to stick to the perimeter of the store. The original rationale behind this advice was that the products located around the outer edges of stores are healthier (think produce, lean meats and fish, etc.). However, this advice is becoming less applicable today, as manufacturers innovate and reformulate packaged goods in response to consumer demand and evolving regulations pushing for improved nutritional profiles and clarity around ingredients used in foods and beverages. Don’t hesitate to enter the packaged food aisles next time. Instead of shopping in a circle, consider making your shopping path “E” or “Z” shaped!

Shop smart, waste less.

According to the Food and Agriculture Organization, one-third of food produced for human consumption is lost or wasted globally, which amounts to about 1.3 billion tons per year. To avoid wasting food and money, understand the various methods used by industry to support food waste reduction efforts. One approach is to mark products with voluntary “Best By,” “Sell By,” and “Use By” dates. Many consumers consider these dates to have the same meaning – okay to eat before the date, not okay to eat after the date. This is false and can lead to wasted food. Learn what each date means here. In addition, click here to learn why certain food additives show up on ingredient lists, and how they are used to help prevent food waste.

Look up, look down, look all around.

For a food or beverage company, getting a product placed at eye level on the shelves of a major grocery store is like running a commercial during the Big Game. It is extremely costly and highly competitive. Knowing this, be sure to scan the top and bottom shelves as well for products that may be slightly less popular, but could possibly be more nutritious, delicious or affordable.

Beware of mobile apps spreading misinformation.

Companies are taking steps to help consumers identify healthier or more nutritious options more easily. These include mobile apps, many of which provide nutrition and safety analyses of products or ingredients. However, some of these apps draw unsubstantiated, radical conclusions about products and ingredients that are not based on credible science and safety studies. For example, some go as far as to claim certain ingredients that are approved as safe to consume are “very dangerous and cause side effects, allergic reactions, or hyperactivity of your child.” It is important to question the quality of the evidence on which app developers are basing these claims, and to note that these apps do not represent actual recommendations by scientific authorities or food safety officials.

If you’re not eating it today or tomorrow, buy it frozen.

Especially in the case of fresh fruits and vegetables, nutrients start to decline immediately after harvesting and continue to do so during storage. For example, a study published in the Journal of the Science of Food and Agriculture showed green peas lose up to 51 percent of their vitamin C during the first 24 to 48 hours post-harvest. Fruits and vegetables found in the frozen section of your grocery store are generally picked at peak ripeness when they contain the most nutrients, and there are usually no additional ingredients added to them before freezing.

Grab meat and dairy items last.

Thanks to stabilizing ingredients such as guar gum, your favorite ice cream can withstand the inevitable melting and re-freezing that occurs from the time the ice cream is packaged to the time you scoop it into your bowl at home. However, while additives help maintain the quality and safety of foods and beverages, it’s always a good rule of thumb to hit the packaged foods and produce aisles first. Leaving the highly perishable items on your list to the end of your shopping trip will minimize the critical time those products spend out of the refrigerator or freezer.

The Ingredient All Gluten-Free Bakers Should Know About

While the gluten-free diet was first introduced in 1941, the prevalence, availability and demand for gluten-free foods have increased tremendously over the past 20 years, with the gluten-free food and beverage industry growing to over $10 billion in 2013. There are now many gluten-free options on restaurant menus and grocery shelves, but it can be difficult to make your own gluten-free baked goods at home. However, thanks to xanthan gum, those looking to eliminate or reduce gluten in their diets can still enjoy homemade breads, cakes and pastries.

But first, what is gluten?

Ever wonder what’s behind that rustic, satisfying crust of artisan breads? Or why pizza can be so chewy? Or how cake batter can transform into a tall, fluffy dessert? The answer to all of these questions is gluten.

A protein found in grains such as wheat, rye and barley, gluten provides structure and elasticity to finished products. It does so after being activated by the addition of liquid and heat. For instance, when a pizza maker adds water to flour, it triggers changes in the flour’s gluten that create an elastic dough. Then, when the dough is placed in the hot oven, the gluten acts to trap air bubbles in the crust, creating structure and allowing the crust to rise. As you can imagine, making gluten-free pizza crust isn’t as easy as simply removing gluten from the equation. As with all baked goods, creating gluten-free alternatives involves not only choosing a gluten-free flour, but also ensuring the technical function of gluten is appropriately replicated.

A Natural Replacement  

One common ingredient that can reproduce the elasticity and binding power of gluten is xanthan gum. Xanthan gum is a product of fermentation that stabilizes and thickens foods – providing desired texture and even dispersion of flavors. It is made with the help of a microbe found on the leaf surfaces of green vegetables. This microbe carries out a fermentation – much like the fermentation behind wine and cheese – and the resulting gum is then dried and ground. While xanthan gum is used widely across the commercial food space for everything from salad dressings to ice cream, it is sold separately in stores and can also be used at home for gluten-free baking and cooking.

Using Xanthan Gum

Xanthan gum is known for being extremely versatile, and works well in recipes that use highly acidic ingredients (such as buttermilk, lemon juice or cream of tartar) and are exposed to high oven temperatures. When added to a gluten-free recipe, it works to lock in structure and moisture, making sure the final product won’t crumble and fall apart. When combined with leavening agents (such as yeast, baking powder or baking soda), xanthan gum can trap air, helping gluten-free goods rise while maintaining shape and consistency, just like their gluten-containing counterparts. It is such an effective replacement for gluten that many store-bought gluten-free flour mixes already contain xanthan gum.

The best way to use xanthan gum is to find a flour blend or a recipe that already includes it. However, for those who want to create their own recipe using a gluten-free flour that doesn’t already contain xanthan gum, below are some recommended measurements for using xanthan gum with gluten-free flour.

  • For cookies, start with a ratio of ¼ teaspoon for every cup of flour.
  • For cakes and pancakes, substitute ½ teaspoon per cup of flour.
  • For muffins and quick breads, use ¾ teaspoon per cup of flour.
  • For breads, try 1 to 1 ½ teaspoon per cup of flour.
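
To see how these ratios scale, here is a minimal Python sketch that turns the list above into a quick calculator. The function name and category labels are illustrative assumptions rather than part of any official guidance; the point is simply that the amount of xanthan gum grows in proportion to the flour.

    # A quick calculator based on the ratios listed above. The dictionary keys and
    # function name are illustrative assumptions, not official guidance.
    XANTHAN_TSP_PER_CUP = {
        "cookies": 0.25,   # about 1/4 teaspoon per cup of gluten-free flour
        "cakes": 0.5,      # also pancakes
        "muffins": 0.75,   # also quick breads
        "breads": 1.0,     # start at 1 teaspoon; some bakers go up to 1 1/2
    }

    def xanthan_gum_teaspoons(recipe_type: str, cups_of_flour: float) -> float:
        """Estimate teaspoons of xanthan gum for a gluten-free recipe."""
        return XANTHAN_TSP_PER_CUP[recipe_type] * cups_of_flour

    # Example: a muffin recipe calling for 2 cups of gluten-free flour
    print(xanthan_gum_teaspoons("muffins", 2))  # 1.5 teaspoons

For example, a bread recipe calling for 3 cups of gluten-free flour would start at about 3 teaspoons of xanthan gum, which you could then nudge toward 1 ½ teaspoons per cup if the dough needs more structure.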

For gluten-free bakers looking to have their cake and eat it too, xanthan gum is here to help!

Have more questions about xanthan gum? Learn more here.

How to Interpret Headline-Breaking Science

We have all had the experience of standing in the checkout line and glancing at the tabloids advertising fad diets and miracle foods, telling readers that if they eat this or that, they will be able to lose 10 pounds in a week. While most of the time it is easy to identify headlines that likely lack much truth or reliability, what about news and headlines regarding scientific studies? Even highly regarded news and media outlets often promote false or misleading claims about food and health. While doing so may be an advertising tactic, it is also likely an unintended consequence of condensing the highly complicated, ever-changing science of nutrition into a one-size-fits-all assertion.

While it’s easier to trust publications promoting scientific research than the media, both can be sources of inaccurate information and can spread misleading claims. Below are several tactics to employ the next time you come across what might be a major headline-breaking study.

Step 1 – Consider the story covering the study.

Source: Ask yourself where you learned about the study. Was it posted by a friend on social media? Highlighted on the nightly news? Published in a major newspaper? Consider whether or not the study and its findings are being interpreted and promoted to create a catchy tagline that optimizes viewership. Is it too good to be true? Or maybe it’s the opposite – is it generating fear?

Who is the author? Is the article written by a trained scientist, a food blogger, or a journalist? Do they have a history of writing balanced, scientifically sound articles, or have they historically written more emotionally charged content? Consider both the credentials as well as the potential motives of the author.

What’s the full story? It is impossible to capture all of the findings, strengths, and weaknesses of a study in a single headline, so it’s important to dig for details. In addition to bypassing the headline for the full article, you should also check out any source material to ensure it supports the headline and article. Not all writers are scientists who have the expertise to both adequately evaluate a study and put its findings into the broader context of related evidence on the topic. Further, whether the writer does so intentionally or not, they may look for parts of studies that best fit within the context of their own story. Make sure to seek out and evaluate the study, not just the writer’s take on it.

Step 2 – Evaluate the study.

Consider if the study was peer-reviewed and published in a reputable journal. The peer-review process ensures that independent experts evaluate the quality of the study’s design and raise questions as needed. Further, peer-reviewed studies that are published in reputable journals, along with reviews or guidelines published by international health agencies and policy-making bodies, are considered to have more reliable findings.

Check the timing. Sometimes stories reference studies that have been published for a year or longer in order to draw attention to an issue. Make sure to check the study’s publication date to determine if the findings are current and relevant.

Identify the study type. Studies vary greatly in their quality and design. It is important to remember that association does not equal causation. Some study designs can identify associations between a behavior or an exposure and a health outcome, but cannot show that the outcome is caused by that behavior or exposure. Further, as with all study types, it is essential to consider confounding variables (i.e., something that wasn’t controlled for by the researcher, but may impact the outcome of the study) and potential sources of bias, which could skew study findings.

Reputable research takes steps to reduce the risk that the researchers’ or participants’ preferences or biases – or even chance – could affect the study’s results. It is important to be aware that not all research is created equal, and thus neither are the results. This guide from FoodInsight may help you navigate the different types of studies and what the implications of the results may be, when the studies are carried out appropriately.

By following these steps to properly evaluate headline-breaking studies, you can be part of the solution to the growing problem of bad science and deceptive headlines. Time to put your white hat (and reading glasses) on.

Bold Food Concepts and Abstract Ingredient Combinations Shine at IFT 2018

Every summer, the Institute of Food Technologists (IFT) hosts its annual conference to bring together passionate people working to innovate within the food industry space. The focal point of the multi-day event is food ingredients, and the exciting opportunities to use them in new products. This year’s conference was held in Chicago from July 15-18, and featured presentations that explored trends in food business and innovation, along with an exhibition hall that featured food and food ingredient companies of all sizes, from startups to large multinationals. As with previous years, a few key trends stood out, ultimately forecasting what’s to come by way of new products headed to store shelves. Below are the top trends to look forward to this coming year:

  1. New Technology, Futuristic Solutions:  While the food industry has been shifting over the past several years towards products that promote wellness and sustainability, IFT2018 highlighted the first ever IFTNEXT Food Disruption Challenge, a competition that allows emerging food companies and entrepreneurs to pitch new products or processes leveraging modern technology to enhance the global food supply. The finalists chosen to share their innovations represented a diverse set of breakthrough solutions in the ingredient, packaging and sustainable agriculture space. The people’s choice award for Future Food Disruptor of the Year went to a processor of insect ingredients as a more environmentally-sound alternative to livestock production. The company’s protein concentrate may be used for sports nutrition products and certain beverages, while their textured insect protein may be used as a meat replacement for burgers or nuggets, or as an alternative to eggs or butter. However, the judges’ pick for the competition’s grand prize went to a company transforming an otherwise wasted by-product of soy milk production called okara into a gluten-free flour.
  2. Focus on Coffee: Beans have left the cup and are headed for the snack aisle. Producers are using new extraction technologies to bring dynamic coffee flavors to a range of products. A wide variety of confections, from cookies to cakes, featured classic coffee house flavors such as ‘latte,’ ‘espresso,’ and ‘cappuccino.’
  3. Color & Texture: As novelty and variety continue to entice buyers, many brands featured unusual textures and colors in everything from teas to jerky. Products with bright and enticing colors, such as turmeric yellow, abounded and sparkling beverages prevailed. Additionally, new textures such as kelp jerky were featured as consumers seek out “unique textural experiences.”
  4. Florals: Regardless of the season, botanicals are in bloom. Companies are adding fresh, bright and seasonal floral flavors to new products. Blooms such as hibiscus, violet, honeysuckle, rose and elderflower were increasingly popular in the exhibition hall, contributing new color, taste and aroma to packaged foods. However, as this trend is still in its infancy, most of these florals are being paired with other more familiar flavors to ease consumers into the trend.
  5. Salt Reduction Strategies: Companies specializing in savory items debuted products that work to deliver great taste while reducing the amount of sodium listed on a label. One booth presented new flavor enhancers that offered prominent umami and kokumi notes, allowing products to use less salt but still deliver satisfying, rich flavors. Hydrocolloids such as carrageenan are also being used to help reduce the salt content of foods such as lunch meats.

Find a full recap of this year’s show at iftevent.org and learn more about many of the food ingredients used in these new products here.

New York Hosts Summer Fancy Foods Show

The Specialty Food Association (SFA) hosted its biannual Fancy Food Show in New York City from June 30 to July 2, 2018. The international event included over 2,400 exhibitors at a huge three-story trade show known as a main attraction for retail food distributors and food media editors looking to scout out new food and beverage products their customers and readers will love. The focus is mainly on new packaged foods hoping to be the next best superfood snack or unique cooking ingredient to gain traction with modern-day consumers. While the Fancy Food Show is one of the best opportunities to debut and promote new products, the success of these foods depends mostly on the combinations of food ingredients used to create new and exciting consumer experiences. Below are the top food trends and new products to look forward to this coming year:

1: Functionality: While prepared and packaged foods have always provided convenience, not all are recognized for their positive nutritional qualities. However, health and wellness prevailed at this year’s show. Many products are now balancing flavor and wellness, packing vitamins, protein, probiotics and more into foods and beverages that promise both great taste and good health.

2: Cauliflower: It looks like cauliflower is the new kale. While cauliflower-rice, “steaks” and purees now show up on restaurant menus, cauliflower is continuing to make its way to the packaged food aisle in increasingly creative ways. This year’s show featured cauliflower pretzels, pizza crusts and even a cauliflower-based baking mix.

3: Ice Cream: A whole new class of exciting ice cream flavors debuted at this summer’s show. Standouts included those with unexpected flavors and ingredients such as black sesame, toasted rice and even puréed vegetables, with flavors like vanilla with zucchini, mint chip with spinach, and strawberry with carrot. Most of these products rely on emulsifiers and thickening ingredients, such as cellulose gum and gellan gum, to provide the creaminess and texture that customers love and expect with ice cream.

Find a full recap of this year’s show at specialtyfood.com and learn more about the ingredients that make all of these new products possible here.

Irish Moss: The History of Carrageenan’s Roots

If you have ever checked the list of ingredients on your favorite ice cream, yogurt, chocolate milk or frozen pizza, you’ve probably seen carrageenan listed. Whether you have noticed it before or not, carrageenan has been used in packaged foods for over 50 years, and its history in the world’s food supply dates back even further.

Chondrus crispus

Carrageenan is made from a type of red seaweed known as Chondrus crispus. Archaeologists estimate humans have been harvesting seaweed, like Chondrus crispus, for nearly 14,000 years. Evidence of red seaweed’s medicinal benefits in China can be traced back to 600 BC, and it was originally used as a food source in the British Isles around 400 BC.

Often referred to as Irish moss, the thick seaweed used for carrageenan grows abundantly along the rocky coastline of the Atlantic, including the shores of the British Isles, North America and Europe. This seaweed is especially abundant along Ireland’s rocky coastline, where it has been cultivated for hundreds of years for both its gelling properties in foods as well as purported medicinal purposes. In fact, carrageenan’s name comes from Carrigan Head, a cape near Northern Ireland, the title of which was inspired by the Irish word “carraigín,” which translates to “little rock.” In the 19th century, the Irish believed carrageenan could cure sick calves along with human colds, flu and congestion. First, the seaweed was harvested and laid out to dry. Then it was washed and boiled before being added to flans, tonics and even beer. Used similarly to gelatin, carrageenan became a key ingredient in the classic Irish pudding, Blancmange, a delicately-set cream dessert. Blancmange is still made in Ireland, where whole pieces of dried red seaweed can be purchased in local markets.

The Irish Potato Famine

Carrageenan was also used to combat nutritional deficiencies in the 1800s during the Irish Potato Famine. The red seaweed was added to warmed milk with sugar and spices to create a fortified beverage. This drink is still consumed today in both Ireland and the Caribbean. As Irish immigrants fled famine and came to the United States, the first American seaweed farming production was established off the coast of Massachusetts. However, it wasn’t until World War II, when a similar ingredient called agar was no longer available, that carrageenan soared in popularity in the US food supply.

Carrageenan Today

Since the mid-20th century, carrageenan has been and continues to be used in many products such as chocolate milk, ice cream, frozen foods and many organic items. It is now consumed in nearly every region of the world, including the US, Europe, China, Japan and Brazil. For more information on carrageenan, please review our Sources of Food Ingredients or visit Marinalg.org.

Changes Headed to a Food Label Near You

In the spring of 2016, the U.S. Food and Drug Administration (FDA) announced new changes to the Nutrition Facts Label for packaged foods. The changes were made to allow consumers to make more informed and healthful decisions in their diets. While you may have already seen this new format on food products, the FDA has extended the compliance deadline to 2020, although manufacturers with less than $10 million in annual food sales have until Jan. 1, 2021 to comply.

So, what’s different?

  1. New Look, Bigger Font
    The type size for the total calories, serving size and number of servings has been increased and bolded. Along with making the serving size more visible, the actual size of each serving has been updated to reflect a typical serving size. However, the serving sizes listed on food products are not recommendations from the FDA but rather measurements intended to reflect realistic intake. For example, the serving size for ice cream was previously 1/2 cup, and is now 2/3 cup.
  2.  “Added Sugars” make it to the label
    A line for “Added Sugars” has been added to the label beneath the listing for “Total Sugars” to help consumers understand the amount of sugar that is being added to a product. This means that the number does not include the naturally-occurring sugar found in fruits and vegetables. Naturally-occurring sugars are accounted for in “Total Sugars” on the label. These new designations are meant to help consumers understand the source of sugar.
  3. “Total Fat” to replace “Calories from Total Fat”
    Research has shown the type of fat (e.g., polyunsaturated fat) is more important to consider than the total calories from fat alone. Therefore, the FDA has chosen to remove “Calories from Fat,” but will continue to require listing of “Total Fat,” “Saturated Fat” and “Trans Fat.”
  4. Vitamin D and potassium now required to be listed
    This change is based on research from the Institute of Medicine (IOM) which shows that Americans do not always get the recommended amounts of vitamin D and potassium. These nutrients will now be required on the label in order to raise awareness of their recommended intakes. Similar information for vitamins A and C may still be included, but their inclusion is now voluntary as deficiencies of these vitamins are rare today.
  5. New footnote on “Percent Daily Value” (% DV)
    The footnote at the bottom of the label has been updated to read as: “The % Daily Value tells you how much a nutrient in a serving of food contributes to a daily diet. 2,000 calories a day is used for general nutrition advice.” This change has been made to better explain what daily value means.

Facts Up Front

While the above changes will be required and regulated by the FDA, manufacturers can opt to include an additional ‘Facts Up Front’ label on the front of packaging. Introduced by First Lady Michelle Obama in 2010, this nutrition labeling system places the amount of calories, saturated fat, sodium and sugars per serving side by side in a simple format on the front display area of a food product. Small packages that cannot fit all four nutrients may display only one icon, for example, calories per serving. If the package size permits, manufacturers may also include up to two “nutrients to encourage” if the product has more than 10 percent of the daily value per serving of potassium, fiber, protein, vitamin A, vitamin C, vitamin D, calcium or iron. This optional label is designed to act as a convenient tool to help consumers understand the nutrient quality of foods at first glance.
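
To make the “more than 10 percent of the daily value” threshold concrete, here is a minimal Python sketch of the eligibility rule described above. The function name and the idea of passing percent-daily-value figures as a dictionary are illustrative assumptions, not the official Facts Up Front specification; which icons a manufacturer actually displays also depends on package size and design choices.

    # An illustrative sketch of the optional "nutrients to encourage" rule; the
    # function name and data format are assumptions, not the official scheme.
    ENCOURAGED_NUTRIENTS = [
        "potassium", "fiber", "protein", "vitamin A",
        "vitamin C", "vitamin D", "calcium", "iron",
    ]

    def nutrients_to_encourage(percent_dv_per_serving: dict) -> list:
        """Return up to two nutrients eligible for optional front-of-pack icons."""
        eligible = [
            nutrient for nutrient in ENCOURAGED_NUTRIENTS
            if percent_dv_per_serving.get(nutrient, 0) > 10  # more than 10% DV per serving
        ]
        return eligible[:2]  # at most two of these optional icons may be shown

    # Example: a hypothetical yogurt with 25% DV calcium, 15% DV protein, 8% DV potassium
    print(nutrients_to_encourage({"calcium": 25, "protein": 15, "potassium": 8}))
    # ['protein', 'calcium']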

History in the Frozen Food Aisle

Frozen foods have long been a staple in the Western diet, but they have evolved considerably in terms of safety, quality and packaging compared to the first commercially-sold frozen food.

Early Days & Fish Fillets

The founder of frozen food as we know it today (à la a bag of frozen peas in our freezer) was Clarence Birdseye. Birdseye was inspired to formulate a mechanized fast-freezing method after watching Inuit tribes of Labrador, Canada, preserve freshly-caught fish with the wind, ice and cold weather. As the climate in his hometown of Brooklyn didn’t allow for such fast freezing, Birdseye invented the multi-plate freezing method, in which food is pressed between two chilled metal plates. The method then grew to involve two chilled conveyor belts for faster freezing. The first machine was designed to only freeze haddock and “seal in every bit of just-from-the-ocean flavor,” as noted by Birdseye himself (check out a drawing here). With this invention, the foundation was laid for the entire frozen food industry.

Conveyor Belts to Home Kitchens

Birdseye established the General Seafood Corporation in 1924, offering only frozen fish fillets. As production increased to include meats and produce, Birdseye joined forces with the Postum Company in 1929 to create General Foods Corporation. With this expansion, Birdseye launched a marketing campaign to familiarize Americans with the new frozen food category. However, Birdseye’s frozen food didn’t reach mass popularity until the 1940s, when most American households purchased their first freezer. In addition to updating the frozen boxcar to transport his foods, Birdseye was also involved in developing grocery store freezer display cases, which led to the sale of frozen television dinners and fish sticks.

Did you know? Swanson executives came up with the first frozen dinner trays after a surplus left them with too much turkey following Thanksgiving.

Today, food technologists and scientists continue to improve methods and ingredients for freezing. While Birdseye’s method of flash freezing remains popular, there are also air-blast, spiral-belt, and cryogenic freezers, to name a few. Additionally, foods that were once hard to freeze can now be flash frozen with the addition of ingredients, for example, to prevent enzymatic reactions in fresh produce and preserve texture in frozen desserts. Modern innovations in “smart packaging” also maintain freshness and keep products from thawing (think the plastic film over your favorite frozen mac and cheese).

Considering frozen food? Here are the top 5 reasons to go frozen!

  1. Cut down on meal prep
    Pre-cut frozen vegetables are usually faster and simpler to cook than whole, unwashed produce.
  2. Get more nutrients
    Since frozen foods are packaged at peak freshness, they often contain more nutrients than their fresh counterparts.
  3. Reduce food waste
    Frozen food reduces the amount of food thrown away due to spoilage.
  4. Easy portion control
    Many frozen foods are sold in single-serve packages which allow for easy portion control.
  5. Cut food costs
    Frozen produce is typically much less expensive than fresh.

The best way to prevent freezer burn? Make sure your item is in an air-tight container! When frozen food is exposed to air, moisture escapes from its surface and refreezes as ice crystals, drying out the food and altering its texture.

From Appert to the Ball Brothers: A History of Canning

It’s hard to imagine a world without a jar of strawberry jam in the cabinet, beans from the tin, a can of tuna salad for a quick lunch or a trusty can opener. And while preservation methods such as drying, curing, freezing, pickling and fermenting have deep roots in ancient food cultures, the process of canning is fairly new. In 2013, the Can Manufacturers Institute estimated the US and Europe go through 40 billion cans a year – a far cry from when just one can could take over six hours to make and weighed around seven pounds.

The Great Seal

The process of preserving food in a hermetically-sealed jar or tin was the answer to a problem posed by the French and English governments: with their armies subsisting on salted meat and hardtack, the need for more nutritious and non-perishable food was great. In France, Napoleon saw the toll poor nutrition took on his men and launched the Preservation Prize in 1795, offering 12,000 francs to anyone who could improve the process of preserving food. In 1810, French chef Nicolas François Appert offered a solution – canning.

As a chef, confectioner and scientist, Appert made many contributions – the invention of bouillon cubes, nonacidic gelatin extraction and improvements in the autoclave – however, food preservation is what earned him the greatest praise (and 12,000 francs). Appert created a method of hermetically sealing glass jars with cork, wire, wax and boiling water. Appert believed the key to non-perishable foods was to heat and seal jars to keep decay out. Bacteria’s role in spoilage would not be fully understood until Louis Pasteur discovered the process of pasteurization in 1863.

Appert published his work in L’Art de conserver, pendant plusieurs années, toutes les substances animales et végétales (The Art of Preserving All Kinds of Animal and Vegetable Substances for Several Years) in 1810. For those who purchased the book, a small note attached to the cover included Appert’s address so that skeptics could come to his home and purchase preserved goods.

While Appert’s method was effective in preventing spoilage, the glass jars were cumbersome and had the tendency to explode. The answer to these issues came from England, where the government was also struggling with supplying sustainable rations to its navy and arctic explorers. In June of 1813, Bryan Donkin served King George III and Queen Charlotte canned beef… from a tin. British merchant Peter Durand patented the method of storing food in cans made of tin on behalf of French national Philippe de Girard (who invented the method) in 1811. Durand sold the patent to Donkin, who was able to deliver canned food to the royal table and produce cans on a larger scale. Following approval from the Royal Family, Donkin’s cans were immediately placed on British ships. One surgeon aboard a naval vessel in 1814 noted that the tinned food offered “a most excellent restorative to convalescents, and would often, on long voyages, save the lives of many men who run into consumption [tuberculosis] at sea for want of nourishment after acute diseases; my opinion, therefore, is that its adoption generally at sea would be a most desirable and laudable act.”

Across the Pond

The first cans arrived in America in 1825, when Thomas Kensett and Ezra Daggett sold their patented cans filled with oysters, fruits, meats and vegetables to New Yorkers. However, canned food didn’t achieve commercial success in the USA until Gail Borden’s 1856 invention, condensed milk. Milk was hard to keep fresh and was costly to source in urban areas, such as New York. Borden’s condensed milk addressed a growing problem. When the Civil War broke out, the demand for canned food and milk increased exponentially.

The one remaining problem with canned food at the time was how to open it. Early cans were often reinforced with stronger metals, and a hammer and chisel or knife were the only ways to open them. The first incarnation of the can opener wasn’t invented until 1860, by the American Ezra J. Warner. Warner’s design was still slightly crude and cumbersome (used mostly through the war and by shop clerks), and a more consumer-friendly opener didn’t arrive in home kitchens until the 1920s.

As can consumption increased, so did the science and methodology behind safer canning. In 1895, a team at the Massachusetts Institute of Technology (MIT) tried to solve the problem of smelly canned clams that swelled with gas released by bacterial metabolism. Researchers Samuel Cate Prescott and William Lyman Underwood found the bacteria that caused the cans to swell were not affected by the boiling of the cans but instead by “applying pressurized steam at 120 ˚C [which] killed the bacteria in 10 minutes.” This finding disrupted the industry, changing the ways cans were created and adding pressure to the process.

The Home Front

Home canning was slower to take off than tin. The USDA made its first reference to the canning process in Farmers’ Bulletin 359 from May 1909, entitled “Canning Vegetables in the Home,” followed by “Canning Peaches on the Farm” in 1910. These guidelines outlined the safest method for home canning, known as fractional sterilization, a multi-day process where jars are boiled three times for an hour each. Additionally, home canners no longer relied on Appert’s method of corking jars, following John L. Mason’s creation of the metal screw-top in 1858 and Alexander H. Kerr’s two-part canning lid developed in 1915 (the lid most canners use today).

Tin can production increased to feed soldiers through World War I and World War II, and home canning also saw a large increase during this time. Communal canning centers were established in WWI with the help of the Ball Brothers Company, and ‘pressure canners,’ placed on top of a stove in home kitchens, became available. Canning reached its peak during WWII, as food rations for both the front line and home front were cut. As sugar was highly prized and highly rationed, households that canned would receive extra pounds of sugar, which increased the popularity of canning tremendously. However, as food rations were lifted, the incentive to can decreased and so did home canning.

Canning Today

Home cooks around the world continue to can, but it’s far from the amount in the 1930s and 40s. Canning is still an excellent way to capture the taste of a season – from peaches and tomatoes in summer to apples in the fall. If you’re interested in taking up canning, there are a few helpful additives and food ingredients that will help you produce better results in your kitchen. If you’re starting with something savory, canning/pickling salt or salt substitutes (which offer the same salty taste without the increase in sodium) create excellent pickled products. Acidulants, or acids, are a key component in canned produce. Sources of acid include vinegar, lemon juice, citric acid or even ground aspirin. To add a touch of brightness to your mason jar, there are color enhancers and colorants. These include citric acid for preserving the color of just-cut fruit, ascorbic acid to prevent browning and sulfites to prevent both spoilage and color changes. Finally, when canning items with a high proportion of liquid, there are texture enhancers and thickening agents, such as food-grade calcium chloride or a variety of starches. Pickling lime can improve your pickles and pectin will yield better-canned fruits.

For official guidelines on home canning, consult the USDA’s Complete Guide to Canning or National Center for Home Food Preservation’s safe canning guidelines.

Don’t fear ingredients in your food!

The term “chemophobia” is defined as an aversion to or prejudice against chemicals or chemistry. It also refers to an exaggerated or irrational distrust of certain foods, including food ingredients or food additives. Over the past several years, food companies and the media have perpetuated chemophobia amongst consumers by announcing the removal of certain ingredients or additives from their products. These announcements are typically not based on safety concerns, but rather on the fact that the scientific names of the ingredients are unfamiliar or sound intimidating. The fact is, while consumers have every right to avoid certain foods or ingredients based on personal preference, there’s no reason to be fearful of them. Ingredients permitted in foods are evaluated and regulated by government authorities responsible for protecting public health. Let’s take a look at the truth behind claims that certain ingredients are “scary” or “unclean” and conquer chemophobia once and for all.

  • The term “clean” is appropriate after washing dirt off your produce, not when interpreting labels. Regardless of the length of ingredient lists or the way ingredients sound, foods that contain unfamiliar ingredients or additives are not “dirty.” In fact, in many cases they help ensure that foods are safe to eat and free of pathogens that could cause foodborne illness. The trend of using the term “clean” to describe a diet that is free of additives has not only created a misconception that it is a safer way of eating, but it is also now falsely associated with positive health outcomes such as weight loss. The truth is that reducing the amount of calories consumed, not the amount of ingredients or additives, is what helps produce weight loss.
  • Food science is beneficial, and shouldn’t scare you. One common tactic used by groups to paint food ingredients and additives in a negative light is to suggest their names should scare us. Ingredients like xanthan gum, titanium dioxide and sodium phosphate may sound odd, but oftentimes additives are named based on their original sources, such as minerals, salts, or other naturally-occurring substances. What’s more, these additives play important technical roles in foods, such as enhancing their nutritional value, improving texture or consistency, making foods more convenient to prepare, extending shelf-life, and contributing to a more sustainable food supply. A quick online search can help you identify where an ingredient’s name originates and what purpose it serves in a food.
  • All foods are complex, meaning they contain many naturally-occurring ingredients. In 2013, James Kennedy, a renowned chemistry teacher and blogger, published a poster series called the “All-Natural Banana.” This series showed the abundance of chemicals and ingredients that occur naturally in foods and fruits enjoyed by consumers every day. A banana, for instance, naturally contains over 50 ingredients that include maltose, proline, tyrosine and myristic acid. In this series, Kennedy addressed the fact that natural foods are typically more chemically complicated than foods considered to be manufactured or processed. Some of these naturally-occurring ingredients may be potentially harmful if consumed at extremely high levels, but the government prevents this by regulating the levels of ingredients food companies are allowed to use. Further, these ingredients are not consumed alone in concentrated forms, but instead in the context of a total diet.
  • Only “food-grade” ingredients can appear in foods. A common fear promoted by self-described health “experts” is that if an ingredient appears in a nonfood item, it has no business in foods. What these individuals fail to recognize is that some ingredients have different grades depending on the application. For example, an “industrial-grade” phosphate is produced under different conditions than a “food-grade” phosphate and is not allowed to be used in foods. Phosphate food ingredients must be made under strict manufacturing conditions directed by the laws enforced by the US Food and Drug Administration (FDA) and other regulatory agencies.
  • A long ingredient list doesn’t determine the healthfulness of foods. Governing agencies not only regulate the safety and content of foods, but also how their ingredients appear on labels. For example, the FDA requires that baking powder be listed with all of its sub-ingredients, which looks like this: baking powder (sodium bicarbonate, sodium aluminum sulfate, cornstarch). Imagine how long an ingredient list for a whole grain baked item would be! By law, even nutrient-dense products with a combination of ingredients and flavors must carry long, scientific-sounding ingredient lists.

The bottom line: Not recognizing or knowing the origin of an ingredient name should prompt curiosity, not fear, since the safety of food ingredients is determined and monitored by qualified scientists and government agencies. Do your own research about ingredients you may not be familiar with and base your opinions on high-quality, peer-reviewed scientific studies, not marketing campaigns. Don’t let chemophobia determine what foods you should or shouldn’t eat, or hold you back from eating the foods you enjoy!

To learn more about the different food ingredients in your food and where they come from, check out the Facts on Food Ingredients webpages on this website.