I've always been left-leaning, politically speaking that is, or as some of you Americans might say, 'liberal'.
I believe the state should look out for those members of society who can't look out for themselves, I believe in a progressive taxation system (i.e. the more you earn, the more you pay), I believe in universal free health care for all, and I believe that education should be a right and not a privilege (including higher education). I guess all of that defines me as left wing.
But there's a catch. I also fully support the US-led war in Iraq and the international fight against terrorism, two things which today are clearly associated with the less than 'liberal' end of the political spectrum.
So my question to you all is this: how much have the events of the past three years been responsible for redefining the political map? Can politics simply be defined as left/right wing anymore?