It is, and there are some interesting recently published techniques that help mitigate problems like this. But if you don't have good ground truth, you're at best flying blind and at worst feeding garbage in and getting garbage out; your models will learn whatever you tell them to learn.
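Here's a toy illustration of that garbage-in-garbage-out effect (my own sketch, not from any particular paper): flip a fraction of the training labels on a synthetic scikit-learn dataset and watch held-out accuracy degrade as the "ground truth" gets noisier.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical demo: corrupt a fraction of training labels and
# measure the damage on a clean test set.
rng = np.random.default_rng(0)
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for noise in (0.0, 0.1, 0.3, 0.45):
    y_noisy = y_tr.copy()
    flip = rng.random(len(y_noisy)) < noise
    y_noisy[flip] = 1 - y_noisy[flip]  # corrupt the "ground truth"
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_noisy)
    print(f"label noise {noise:.0%}: test accuracy {clf.score(X_te, y_te):.3f}")
```

The model faithfully fits whatever labels you hand it; without a clean held-out set you wouldn't even see the drop, which is the "flying blind" part.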