Triage is one of the most important concepts in modern medicine, though its origins lie far outside a hospital corridor. In simple terms, it refers to the process of prioritizing patients by the urgency of their condition. A cardiac arrest victim receives immediate treatment over someone with an ankle sprain, and a road accident victim with uncontrolled bleeding is seen before a patient waiting for a routine gastroenterology checkup. The goal is to direct critical care to those who need it most, even if that means keeping other patients waiting. This principle guides emergency rooms, disaster response teams, battlefield medics, and even pandemic planning.
The word comes from the French verb trier, meaning to “sort” or “select.” It first appeared in the 18th century in the context of agriculture and trade, where merchants used triage to describe sorting goods such as coffee beans, tobacco, or wool by quality, separating the best from the mediocre and the worthless. The term entered military medicine during the Napoleonic Wars, when battlefield surgeons had to make rapid decisions about which wounded soldiers could be saved and where scarce supplies should go. That practice became formalized as triage.
Today, hospitals use structured triage systems, such as the five-level Emergency Severity Index, that rely on vital signs, pain levels, and visible symptoms to sort patients into priority categories; a minimal sketch of the idea appears below. The approach has expanded beyond healthcare: cybersecurity teams triage incoming threat alerts, customer support centers triage urgent tickets, and even artificial intelligence platforms apply triage-like prioritization when allocating scarce computing resources.
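To make the sorting logic concrete, here is a minimal sketch in Python of a rule-based triage categorizer. Everything in it is an illustrative assumption: the Patient fields, the numeric thresholds, and the five-level mapping are invented for demonstration and only loosely echo five-level scales like the Emergency Severity Index; no real clinical protocol is reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    """Hypothetical intake record; fields are illustrative, not a clinical standard."""
    label: str
    heart_rate: int        # beats per minute
    systolic_bp: int       # mm Hg
    pain_level: int        # self-reported, 0-10
    breathing_normal: bool

def triage_level(p: Patient) -> int:
    """Assign a priority from 1 (most urgent) to 5 (least urgent).

    The thresholds below are made up for illustration; real systems
    use validated clinical criteria.
    """
    if not p.breathing_normal or p.systolic_bp < 90:
        return 1  # immediate: airway or circulation compromised
    if p.heart_rate > 120 or p.pain_level >= 8:
        return 2  # emergent: unstable vitals or severe pain
    if p.pain_level >= 5:
        return 3  # urgent
    if p.pain_level >= 2:
        return 4  # less urgent
    return 5      # non-urgent: routine care

# Patients are seen in order of assigned level, not arrival time.
waiting_room = [
    Patient("ankle sprain", heart_rate=80, systolic_bp=125,
            pain_level=4, breathing_normal=True),
    Patient("uncontrolled bleeding", heart_rate=135, systolic_bp=85,
            pain_level=9, breathing_normal=True),
]
for p in sorted(waiting_room, key=triage_level):
    print(triage_level(p), p.label)
```

The design point is the same one the article makes: the queue is ordered by assigned urgency rather than by who arrived first, so the bleeding patient is printed ahead of the ankle sprain even though both are waiting.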
The enduring relevance of the word lies in its core idea: when resources are limited and needs are many, deliberate sorting is the only way to save the most lives and deliver the most effective care.