Read the dictionary entry for canker, covering all senses of the word.
1. (noun) a pernicious and malign influence that is hard to get rid of: "racism is a pestilence at the heart of the nation"; "according to him, I was the canker in their midst"
2. (noun) an ulceration (especially of the lips or lining of the mouth)
3. (noun) a fungal disease of woody plants that causes localized damage to the bark
4. (verb) infect with a canker
5. (verb) become infected with a canker