Here’s a thought: if all the doctors in the land started jumping up and down warning the government against a new drug it wanted to give children, what would people do? Would they just ignore the doctors? Would they even notice?
Why don’t people want to hear what teachers have to say about educating children? What motive could teachers possibly have for not wanting children to succeed?
It’s time to trust teachers and take education out of the hands of politicians. I don’t know a single teacher who would put politics before what’s right for the children in their care. We need to be trusted.