Hi everyone, I'm a French student in economics and sociology, and I'm very curious about what sociology (or social studies?) is like in the US. From what I've heard, it seems to be a left-leaning field. Is that true? Has anyone here studied it who could give me a quick insight into the programs? Also, idk if this is the right chatroom for this question, but I feel school programs are pretty much a political matter.