It's important to treat your heartburn because chronic acid reflux can lead to more serious problems, including esophageal cancer. Doctors often prescribe medications to relieve heartburn. Yet, like many other drugs, heartburn medications can have serious side effects.