In the United States, journalists are brought up with a very left point of view. It is very much the case that almost all professors in the U.S. are leftists, and it is especially true in the arts. Still not sure if I can link, but this can easily be checked by googling. I went to college in a red state and most professors were leftist to the extreme. I got extra credit one time for attending a Bernie Sanders rally. For the record, he was supposed to be talking about history, but it was 90% political rally.
Now, in the case of Europeans, I think it is nationalism. They can say all they want that nationalism is no longer in Europe's DNA, but I think pride in a people/group is fairly ingrained in humans to an extent. While Europeans were generally friendly the times I have visited (except almost all Germans), they made snide offhand comments left and right about how violent/backward/poor/dumb the U.S. was compared to them. Except in Germany, where the comments were not offhand but just a straightforward "you are stupid because you are American."
As a small aside, it was odd: I had heard French people were rude, but of all the European countries I have been to, they were generally the nicest. Not sure where that stereotype comes from.
So it wasn't just healthcare. They are convinced they are more educated, wealthier, freer, and healthier. I don't have a problem with them believing any of it. I just don't like it when their media and American media resort to falsehoods to make that point, when there are definitely many points they could actually use to back it up without having to go with "America sucked at Covid cause Orange man bad."