Apr 6, 2014

TROPICAL MEDICINE

In simple terms, tropical medicine is the medicine practised in the tropics. It arose as a discipline in the 19th century, when physicians responsible for the health of colonists and soldiers from the dominant European powers were confronted with diseases not encountered in temperate climates. With extensive worldwide travel possible today, tropical diseases are now widely seen in returning travellers and expatriates.
