Everyone knows Fox News is slanted towards a right-wing stance, but the little I've seen of American news is always biased to either a left-wing or right-wing position.
Is there any major news source in the States that isn't slanted one way or the other? We have the same deal over here (and everywhere else), but it's never as blatant as in the States (apart from in the tabloids).
The reason I ask is that I hate being told what I should think. I just want facts, not an anchor spouting rhetoric at me. It all just seems a bit...sinister.