I'm more convinced than ever that the left needs to frame its arguments around the right to health and medical care in order to counter the culture wars waged by the ruling classes.
This is especially true in the USA, where it's become blatantly obvious over the past few weeks that health transcends political divides.
Healthcare is a fundamental issue.
Besides, we in the West are getting older... We're going to need it.