I always dread the moment at restaurants when I have to tell the waiter what I can’t eat. As much as I try to avoid ordering anything I’ll have to make major changes to, there’s always the inevitable “Is this dish cooked in butter?” or “Do you have any bread without egg in it?” Often, I ask questions the waiter doesn’t even know the answer to, and when they head back to the kitchen to check, I remember that most people eat without total awareness of every single ingredient in whatever they’re consuming. I worry sometimes that my consciousness of what I’m eating makes me seem hyper-cautious, obsessive, unable to let loose, even though deep down I know that every precaution I’m taking is necessary for my health.

The problem is that dietary restrictions are often viewed as if they’re a choice, or even something people do just to be difficult. I’ve had people who knew about my allergies call me a picky eater, comment that I “don’t do dairy,” or refer to what I can’t eat as food I “don’t like.” I’ve even caught myself chastising my own eating habits—calling myself inflexible for eating the same sandwich from the college dining hall despite the genuine lack of other options.
Some of this relates to the recent popularity of fad diets, like gluten-free or plant-based, which, because they’re only medically necessary in a small fraction of cases, have created stereotypes that say dietary restrictions are just a trend for wealthy or high-maintenance people. Though these stereotypes may seem minor or harmless, they cause real issues when people start to assume that anyone abstaining from a certain food is doing so by choice.
For example, vegan, plant-based, and other labels for dietary restrictions tend to be marketed towards people following a diet by choice rather than those who need to for health reasons. This means they’re not always safe for people who actually have food allergies (Note: while I’m focusing on food allergies in this piece for convenience, this also applies to other diet-related conditions like Celiac disease)—sometimes because of cross-contamination and other times because they’re flat-out misleading. I’ve made the mistake of assuming that the option to order a drink with oat milk meant the drink was dairy-free, only to find out the hard way that other ingredients in it contained dairy; the alternate milk options simply weren’t geared towards those who actually need them.
Even more importantly, the assumption that having dietary restrictions is for rich people obscures the fact that food allergy rates (in the US, at least) may actually be lowest for those in the highest income bracket, with the cost of managing allergies causing a serious financial burden for many people.
Associating allergies with wealth isn’t just a judgment people make in social situations. A 2015 study that spoke to those best positioned to help combat the challenges of managing allergies with limited resources—including allergists, dietitians, food bank employees, and social assistance caseworkers—found that many of these individuals also believed in this link between allergies and income. In this case, they cited a theory called the hygiene hypothesis, which suggests that allergies result from a lack of early exposure to the germs and infections needed to strengthen the immune system in childhood. While this theory has little evidence to support it, participants in the study used it to reason that higher-income individuals are more likely to have allergies, applying classist assumptions about the link between hygiene and socioeconomic class.
This study also found that participants described low-income patients as less vigilant about their allergies and less cooperative with clinicians’ treatment guidelines. However, research on lower-income allergy patients themselves doesn’t align with this at all—instead, it shows that allergies present a constant stressor that most individuals are hyperaware of. While these individuals may be less likely to carry an Epi-Pen or to effectively avoid allergens when eating, this is more likely related to the fact that managing allergies safely requires resources that are difficult to access.
Allergen-free foods, for example, tend to be marked up—treated almost like luxury items, an expense you could cut if you were trying to rein in your spending. The problem, of course, is that allergies are a health issue, something that can affect anyone, against their will and regardless of their income. For someone who genuinely cannot eat anything else, that marked-up food is a medical necessity.
For people with allergies, accessing food not only creates a financial burden but can also exacerbate existing food insecurity, a term for limited or uncertain access to food. The US has a particularly high rate of food insecurity, in part because it lacks a social safety net to address it, and the resources that do exist to help increase food access often present challenges for people with allergies. For example, organizations like food banks often lack procedures to keep food allergy-safe and tend to provide prepackaged food that prevents people from selecting options they’re actually able to eat. Similarly, SNAP benefits, a form of government assistance that can be used to purchase food, are only usable at certain authorized retailers, which typically carry little allergen-free food.
Combined with the high price of Epi-Pens, which are crucial for treating severe allergic reactions, the barriers to accessing allergen-free food make it difficult to prevent reactions without paying a high cost. Because of this, those who already have fewer resources end up having more allergic reactions and spending more money to treat them. For example, it’s estimated that families with a food-allergic child who are in the lowest US income stratum spend about 2.5 times more on emergency room costs. This creates a vicious cycle—fewer resources mean less protection from reactions, and more reactions inflict greater costs.
Having limited financial resources also influences how a person’s allergies are understood by the people around them—something that can seriously affect the risk of reaction. Carrying an Epi-Pen, for example, doesn’t just keep someone safe in the aftermath of a reaction; it can also signal to others that your allergy isn’t something to be trifled with, leading them to take more care. For someone who can’t afford one, that signal is lost along with the protection. This may be especially important for children, who tend to rely on the adults around them for help managing their allergies.
Similarly, having time and resources to advocate for allergy accommodations can affect what policies end up getting put in place. For example, there’s no universal standard for allergy protocols in US schools, meaning allergy regulations are often left up to what parents lobby for. This makes schools where parents have more resources far more likely to put structures in place that protect kids with allergies.
Still, common narratives about food allergy don’t tend to acknowledge how socioeconomic class and access to resources alter the way the allergy is experienced. Instead, the focus tends to be on getting allergy accommodations in places like restaurants, airplanes, and college campuses—reforms that are extremely important but don’t recognize that not everyone has basic access to food in the first place.
In his book Allergic Intimacies, Michael Gill examines the way children’s books about food allergies frame the condition for their readers, noting that most focus on encouraging children not to let their allergies disrupt their childhoods. While this message may seem harmless, it also means these books only really apply to children whose allergy-related needs are already being met. For example, Gill explains that these books portray families as if they all have unlimited time and resources to advocate for their children’s needs, while also implying that allergies will be taken seriously in school or social settings without any resistance from others.
Gill also notes that every single children’s book he found about food allergies featured a white protagonist, most often a boy. The pattern holds in other media, such as advocacy campaigns or news coverage of allergy deaths, which tend to center on or use images of white individuals even though this doesn’t reflect the actual distribution of allergies across the population. White Americans are the most likely to receive an allergy diagnosis and to have access to the resources needed for allergy management, but Black Americans are the group actually experiencing the greatest burden of allergic disease in the US, including increased mortality, hospitalization, and allergy severity. Similarly, white children in the US are less likely to have food allergies than any other racial group.
Even framing food allergies as something that mainly affects children promotes false assumptions. This practice is common in allergy research and advocacy—when researching the topic for a class, I found it difficult to find studies not centered on children with food allergies or their parents. It may seem like children’s allergies are emphasized because of the unique risk posed by their dependence on adults, but teens are actually the age group most at risk of severe allergic reactions. After all, this is the age when parents start handing over responsibility for managing the allergy, while peer pressure can make it extra challenging to stay safe without missing out socially.
And the focus on children definitely isn’t because everyone’s growing out of their allergies—in fact, food allergies are actually more common in adults. However, studies also show that adults with food allergies experience more allergy-related social exclusion than children—likely because their allergies are less expected and therefore less understood by the people around them.
After I presented a project on food allergies for a sociology class last fall, one girl raised her hand and made an interesting observation. She described how, when eating at restaurants with her family, waiters barely batted an eye at her shellfish allergy, while her cousin, who was severely allergic to chicken, received much stronger reactions—often involving uncertainty about whether an allergy-safe meal would even be possible. Overall, she explained, these two allergies had pretty much nothing in common in terms of management.
This probably isn’t because chicken is just harder to avoid. After all, I’m also allergic to shellfish, but I’ve had no problem eating in restaurants where 95% of the menu contained shellfish. That’s because shellfish allergies are common, meaning most waiters are aware of them and most restaurants have protocols in place to handle them. Meanwhile, chicken allergies are extremely rare—many people may not even believe they’re real.
While it’s possible to be allergic to basically any food, there are more structures in place to help prevent exposure to more common allergens, and there’s also much greater public awareness. Most people reading this have probably met at least a handful of people with a nut allergy, but how many people do you know who are allergic to lettuce? If you’re wondering what that’s like, just check out this piece by Cindy Kaplan.
The allergens that are most common, and therefore most protected, vary by country. In the US, the allergens labeled on packages are known as the Top 9; they cause about 90 percent of allergic reactions and are regulated more strictly by allergy-related law. In some ways, having a Top 9 is useful, allowing for “Allergen-Free” foods that leave out the majority of ingredients people are avoiding. However, these foods aren’t “Allergen-Free” for everyone, and if you’re allergic to something that falls under the other 10 percent, it can be significantly harder to get taken seriously. Plus, there are racial discrepancies in how likely someone’s allergies are to be in the Top 9—Black Americans, for example, are more likely than the general population to have allergies outside the Top 9, meaning they are also more likely to lack the protection and understanding given to those with Top 9 allergies.
So the stereotypes of the childhood peanut allergy and the elitist gluten-free diet do more than just fail to reflect food allergies in the real world—they also impact who is taken seriously for their allergy in both social and medical situations, influencing the risk of reaction.
Many allergy advocates have attempted to reframe the idea that allergy management is an individual responsibility by emphasizing the role of the external environment in increasing or decreasing the risk of reaction. Even so, this still tends to focus on increasing allergen labeling in restaurants or adding accommodations to prevent airborne reactions on planes—reforms that don’t take into account the fact that food allergy is not only a medical issue but also a food justice issue. For example, making a school cafeteria a safe environment for children with severe allergies to eat in isn’t enough if this still requires that these children bring their own food from home. Since many children rely on the school cafeteria for access to food, cafeterias without allergy-safe options actually affect whether they’re able to eat at all.
Therefore, allergy advocacy needs to expand further, considering needs like subsidized allergen-free food, better food allergy protocols in food banks, and solutions that reduce economic inequality and increase access to food and healthcare in general. For this to happen, we have to stop defining food allergies based on stereotypes and instead recognize the ways common assumptions are making it harder for many people’s basic needs to be met.