Organicwashing – Companies Greenwashing Our Health Food
Organicwashing: a form of greenwashing in which companies market a product so as to convince consumers it is produced with people's health and animal welfare in mind, while in truth not giving a damn about anything but profit.