Why do people think that if you don't talk about the possibility of cancer, it will somehow go away? If you were diagnosed with a tumor in your head and the doctor said he believed it was growing, shouldn't you be concerned? Why do some people think that if you just don't talk about it, it will miraculously disappear on its own?