Is Organic Food Worth the Premium?
By Kalyn Weber, November 24, 2014
What is “organic”? We all have our own ideas about what it means to be organic. In food speak, however, the term “organic” refers to the way agricultural products are grown, raised, and processed. In the US, manufacturers must meet and maintain specific requirements, set and regulated by the United States Department of Agriculture (USDA), in order for their products to be labeled “organic.”