01-31-2016, 06:08 PM
Not to derail the thread, but I think the conversation here is more interesting than the article, to be honest.
I think there is also a lot of value, in terms of understanding present situations in many parts of the world, in thinking about the lasting effects of European colonization. Like [MENTION=23097]Insertnamehere[/MENTION] brought up, the law against homosexuality that is mentioned in the article is a colonial law. Europe can pat itself on the back thinking about how well Western European societies treat people, but that is by no means a historical legacy.
Of course it wasn't just Western Europeans who conquered and colonized other peoples, but the scale and influence of that specific example, as well as the wealth and power it brought to those countries, is pretty clear. It's kind of crazy to try to imagine what England, for example, would be like if it had not been a colonial power, as if it had only had its own resources and whatever came from trade. What a different society it might be. I don't think the history of colonization only affects the modern situation of the formerly colonized countries.
But Western Europe isn't the same as it was 100 years ago, and I find that really interesting. I know there are a few users here who not only have a lot of knowledge about European history, but also a lot of passion for it. I'm thinking of [MENTION=21405]meridannight[/MENTION] and [MENTION=23123]Alto[/MENTION], because you two have certainly given me the impression that you both have that knowledge and passion, and that you really care about and love the histories of your countries. But my question is open to anyone who wants to answer it.
What I'm curious about is, in your opinion, what factors changed Western Europe in general, or any country there specifically, into the kind of place it is today? It is pretty obvious that Western European countries have high standards of living, that they are very forward-thinking in terms of how people are treated and the freedoms people have, and that they enjoy relative peace and stability.
In other words, if we were to look to Europe, or a specific country there, as an example of how a society can change for the better, what would be the most important things to do or focus on?
(if this is too far off topic let me know and I'll make a different thread)