This is a trend happening all around the US. Cases of diseases we had virtually eradicated are coming back because people are being convinced that the disease isn't so bad, or that vaccines are some money-extracting scheme forced upon us by the government.
I keep thinking I would stop writing on this, but it keeps getting the better of me.
Get your damn kids vaccinated. If you think you somehow know better because a baby shot out from between your legs and that gave you magical knowledge exceeding your pediatrician's, go visit a hospital. Go look at babies suffering because their parents made decisions like yours. You risk your own child's health, but worse, you risk the health of the children who come in contact with your child.
That’s simply immoral.