Many Americans take multivitamins because media coverage suggests the supplements will improve their health and help them get sick less often. Yet three studies conducted by physicians have all reached the same conclusion: there is no evidence that taking these vitamins benefits the adults who take them. Some doctors even warn that the vitamins may actually be harming their users. Consumers are now being urged to “stop wasting their money” on products that will not protect them in the long run. If these vitamins did reduce health risks, the benefit to the public would be considerable.