I feel like conservatives need a reminder that colonialism and "changing culture" meant kidnapping babies from Native families and torturing them into Catholicism. The most your whiteness has ever been threatened by immigrants is their use of spice.
Was it a bad thing when whites forced themselves into Africa to colonize its people and change their culture? If so, then why is it considered a good thing for all these immigrants to come into the USA and change our culture?