(herbalism) the use of plants or plant extracts for medicinal purposes, with the aim of improving the body’s natural functions and restoring balance. Herbal medicines are administered in many forms (liquids, infusions, tablets, topical preparations, etc.) and are part of a growing number of complementary medical therapies. See phytotherapy.