The Golden Globes took over mainstream news media this week. All but four celebrities wore black to the HUGE award show, standing in solidarity with the movement for gender equality and speaking out against sexual assault. You can read more about the movement here.
I loved seeing everyone in Hollywood, both men and women, standing up...speaking out. I thought this was HUGE and IMPORTANT. However, in talking with family, friends, and even strangers, I've found that while a lot of people loved it, many thought it was just "Hollywood being Hollywood" and "celebrities feeding their own egos." What did you think? Do you think this was done the right way and for the right reasons, or was it an egotistical way of staying relevant?