The fences are up, the banners are down. That can only mean one thing — that’s right, more campus construction.
The College seems addicted to a never-ending string of new projects, all meant to improve our surroundings. However, there comes a point where enough is enough.
So much money is being put into renovating locations that, quite frankly, don’t appear to need it, especially when there are other projects on which the money should be spent.
Many housing locations on campus are in desperate need of an update, including the freshmen Towers and Ely, Allen and Brewster. Not only do those locations lack air-conditioning for the unbearable first few months of school, but housing in general is barely enough for half of the school.
For those students who live farther from campus and endure unnecessary stress trying to find a place to live, more money could be put toward adding dorms.
Safety concerns are also raised by anyone who has set foot in Forcina Hall, especially those who have tried to use the elevators there. While some renovations are supposedly in the works, the building is simply too old.
As far as actual schoolwork goes, it is almost impossible to go one day without running into difficulty with the internet connection or lack thereof.
With many professors constantly making assignments due on Canvas and requiring papers which need research from online sources, it is beyond necessary to have reliable Wi-Fi — something the College desperately lacks.
The inevitable loss of the Brower Student Center — a central location for all student activities — is beyond inconvenient, as well. Many clubs and organizations use the space to hold meetings and practices on a daily basis, something which they will struggle to do during its piecemeal renovation.
Without proper space, many are left to wonder where they will hold these events while the construction is ongoing.
Students are also losing two of the most popular meal equivalency locations — the Lions Den and the Rathskeller, the latter of which will soon be gone permanently.
Instead of renovating the Student Center — construction not set to complete until the fall of 2017 — the millions of dollars being spent should be used on projects that are in more immediate need of attention.
In light of several states recently passing “religious freedom” laws, the questions of discrimination and individual belief have become increasingly entangled.
Just last week, Indiana became the 20th state to adopt a “religious freedom restoration” law, allowing businesses to refuse service to people deemed to be acting against the owner’s religious beliefs.
These laws are unjust and specifically target the LGBTQ community.
Supporters of the law have pointed to bakers who refuse to make a cake for a gay couple’s wedding and florists who refuse to sell them flowers. It is abundantly clear that the law aims to give those bakers and florists a way to fend off inevitable lawsuits.
According to CNN, the law came after an “outcry from social conservative circles over incidents where business owners found themselves in hot water after refusing services to gay couples planning to get married.”
But what about the couple who simply wanted a cake?
It is 2015. How does making a cake for two people who love each other — regardless of their gender — go against religious views? How is that thought so much to bear that an individual can’t stand to make a simple cake?
Though Indiana Governor Mike Pence, who signed the law, has assured the public that it is not about discrimination, many are still up in arms, and some have even threatened to boycott businesses in the state.
“This bill is not about discrimination,” he said. “And if I thought it was about discrimination I would have vetoed it.”
However, according to CNN, “civil liberties and gay rights groups assert that the law could be used by businesses to deny service to people based on their sexual orientation and justify that discrimination on their religious belief.”
It is ridiculous how some people are so narrow-minded as to feel the need to deny others basic liberties because they feel personally offended. Religion should not be used as an excuse to refight a battle settled years ago — the battle for equality.
In modern society, it is mind-numbing that individuals still can’t simply accept one another for who they are. It is even more alarming that state legislators allow such prejudicial bills to pass with flying colors.
While Indiana is not the first state to pass such a law, Adam Talbot, a spokesman with the Human Rights Campaign, a gay rights group, stated how those other states’ laws are “dramatically different in their scope and effect … Indiana is the broadest and most dangerous law of its kind in the country.”
The law states that governments can’t “substantially burden a person’s exercise of religion,” and those who feel that their religious beliefs are being impinged upon can use the law to fend off lawsuits.
But what exactly a “substantial burden” is remains undefined by the law.
It is, presumably, up to the judge to decide when a case arises — another major hole in this law. Each case could be decided differently depending upon who is assigned to hear it.
It is atrocious that laws are now being used to defend such childish behavior. People are people, no matter to whom they are attracted. Whom someone loves does not define who they are, and laws which give others the right to refuse them service are, in themselves, an injustice.
College fraternities all across the country are making headlines, from Penn State to the University of Oklahoma to Dartmouth College. Their misdeeds are nothing short of deplorable, and the media storm swirling around them is more than warranted.
Still, the question remains: Are the repercussions strong enough?
The public seems to think so.
Members of the University of Oklahoma’s Sigma Alpha Epsilon fraternity appeared in a viral video in the beginning of March, joining in a racist chant with references to lynching. The response was immediate: Within hours of the video leaking online, the University closed the chapter and forced members to leave the fraternity house. Later, two of the students leading the chant were expelled and have since made public apologies for their actions, according to the New York Times.
University of Oklahoma was lauded for its swift handling of the scandal. It denounced SAE and expelled the chief offenders from the University, much to the satisfaction of both the black student union on campus and the general public.
Admirable, yes. But it leaves something to be desired.
It recently came to light that the Alpha Delta fraternity at Dartmouth College, which has been suspended since March 2014 for hazing violations and alcohol-related charges, would be suspended until 2018 for branding its new members, according to ABC News. The suspension was intended to be lifted on March 29, 2015, but this new piece of information prompted the college to extend the ban for another three years.
And at Penn State, the already-suspended Kappa Delta Rho chapter landed in hot water after their “secret” websites containing photos of naked, unconscious women in seriously compromising positions were discovered, according to the New York Times. Some of the women in the photos are contemplating pressing criminal charges.
Again, these universities took action and did what could be done. But it still isn’t enough.
Universities treat these as isolated instances of hazing, racism and misconduct, and they can naturally only do so much when handling each case. They can expel students, shut down fraternities and even press criminal charges, but they can’t get to the source of the problem. That has been proven countless times.
The Greek life culture needs to change for that to happen.
If the national organizations made it clear that this kind of behavior was inexcusable; if the members themselves enforced more stringent codes of morality; if pledges weren’t harassed and hazed just because it’s part of a long-standing tradition, maybe this wouldn’t happen.
It’s 2015. This kind of behavior can’t be tolerated anymore. The perpetrators of these sickening acts shouldn’t have to be expelled to understand that it’s wrong to belittle other human beings or to take advantage of them simply because that’s how these things have always been done.
However, not all fraternities behave like this. Some haze, but not to the extremes the media reports on. Some are strictly just having fun with the new pledges, and there is nothing wrong with that.
It becomes a problem when the thin line between “just having fun” and terrorizing the new brothers is crossed.
Despite the philanthropic work fraternities participate in, it becomes difficult for many to support them when horrifying cases of injustice constantly surface. After all, extreme cases are the ones that make headlines.
Then again, not all fraternities get caught, thus raising even more questions.
It is going to take serious time for national organizations to implement changes that will truly take effect and stop extreme acts of hazing once and for all.
No matter what the answer is, one thing is abundantly clear — the Greek life culture is in desperate need of change.
By now, I’m sure everyone has heard about the dress. That’s right, the infamous picture causing friendships to be ruined and people to question their own eyesight. But how has one picture of a hideous dress managed to spread around the world?
On Thursday, Feb. 26, the world was suddenly divided. Tumblr, Twitter, Facebook, Instagram, Yik Yak and YouTube became plastered with debates about the true color of a dress — some saw white and gold while others saw blue and black.
Walking into Eickhoff Hall was like stepping into a war zone — the heated arguments from each table rang with screams of “white and gold” vs. “blue and black.” In the past week alone, reports have surfaced regarding the picture’s origin, simply causing more debate.
The picture was first posted on Tumblr by Caitlin McNeill, a 21-year-old from Scotland, according to businessinsider.com. She explained that the dress, which is blue and black, was worn to a friend’s wedding by the mother of the bride. Debates about the color first began when the now famous photo was sent to the bride.
“When my friend showed the dress to her fiancé, they disagreed on the color,” McNeill said in the same article. “All of our friends disagreed.”
The picture was then posted on Facebook and Tumblr, where others began to comment about the color. It didn’t take long before it took off virally.
People the world over are now obsessed with the dress, prompting psychological analyses and coverage in top-tier publications like the New York Times. But the dress obsession has highlighted an underlying issue — the outsized influence of social media.
In the past few years, we have seen a drastic rise in the number of people tweeting and posting to other media outlets. All it takes is one picture getting a few retweets, and suddenly it’s a viral hit. However, there is a thin line between virtual obsessions and reality.
According to Forbes.com, Internet Use Disorder may soon be listed as an actual mental health disorder. Psychologists at the University at Albany have recently found that social media itself is not only addictive, but those who use it excessively may be at a higher risk for substance abuse, according to a Huffington Post article from December 2014.
Now, that’s not saying that a picture of a lousy dress is going to cause a rise in hard-drug use. It’s simply the principle of the matter.
As a society, we are addicted to social media and often latch onto new ‘stories,’ yet we rarely see the harmful side effects of overusing technology. Of course, there are benefits to news being delivered at a moment’s notice via Twitter, but people must also realize that constantly having their faces locked onto screens is unhealthy, to say the least.
It is mind-numbing how a picture originally posted on Facebook not only spread within a week around the globe, but also caused such uproar and heated arguments between friends — all regarding what color each individual saw.
Scientists have even stepped forward offering their opinions on the apparent color differences seen by many. According to USA Today, those who see blue and black are seeing the photo as overexposed, while those who see white and gold view it as underexposed.
“Color is our perception — our interpretation of the light that’s in the world,” said Arthur Shapiro, a professor at American University who specializes in visual perception, in a USA Today article.
No matter what color the dress truly is, the fact that it spread so far so quickly and caused such fierce debates is proof of the power of social media.
While there are definite upsides to social media, the negative effects must be addressed so individuals can be wary of their own use.
Social media has rapidly become an obsession, an outlet highlighting the often blurred line between virtual worlds and reality.
Many often question whether boys or girls perform better academically, and some may even wonder who typically excels when it comes to extracurricular activities. But is the categorization of the two genders to blame for a recent study claiming that females outperform males in the three core subjects?
According to a US News and World Report article from Thursday, March 5, a study from the Organisation for Economic Co-Operation and Development shows that male students are more likely than female students to underperform academically and thus hurt the future economy.
This raises concern over the often cruel gender stereotypes.
For years, it was argued that girls were just meant to be at home, taking care of the kids and household chores, while men were allowed to further their education by attending college because of their gender and “mental superiority.” Now that females are also widely attending colleges and such absurd assumptions have been disproven, are women in fact better at certain subjects than men?
Of course not.
The article details how the study showed a 19-point score difference between girls and boys in mathematics. Girls were more likely than boys to have “lower self-confidence in their math skills and (were) more likely to feel anxious about math.” It was also noted how those tendencies extend into college, as well, with 14 percent of females who began college in 2012 choosing a science-related field compared to 39 percent of men.
It is as if girls are taught from a young age that doing well in science and math is a bad thing, thus affecting their career goals. Boys who choose to major in science or math, however, are viewed as “intellectual,” and oftentimes, more worthy than girls.
This has got to stop.
The study highlighted that boys spend less time and effort on their homework due to video games and other entertaining hobbies, negatively affecting how they perform in school. However, no mention was made of females being preoccupied with extracurriculars, as well.
Once again, boys are viewed differently than girls and are being categorized by their supposed hobbies. These norms are learned from a young age, whether it is realized or not. Ultimately, such views have an effect on how children grow up, and what they decide for their future.
Young girls play sports just as young boys do, and they often participate in the same games, especially at a young age. More and more, girls are seen breaking the barrier of sports often deemed for boys.
In Pennsylvania, for example, sixth grader Caroline Pla has been playing football with the boys since she was five, and as the only girl in her division, 11-year-old Sam Gordon has outperformed the boys on her youth football team, according to U-T San Diego and ESPN, respectively.
A video of Gordon went viral last year, showing her dominating other players. Since then, she has been a representation of girls breaking into male-centric sports.
A person’s gender no longer defines the achievements they can reach — neither academically nor in extracurriculars. It is important that, as a society, we teach the value of an equal education, free of gender stereotypes, from a young age.
Both women and men are equally smart and capable, and there is no accurate study that could possibly show one gender outperforming the other.
Even though the anticipated 2016 presidential election is still well over a year away, it already has a buzzword in heavy circulation — transparency.
“Transparency matters,” former Florida Governor Jeb Bush tweeted on Monday, March 2, in response to a crop of scandals connected to Hillary Clinton. Bush was adamant that Clinton, a frontrunner for the Democratic presidential nomination, release to the public the private emails she sent and received during her tenure as Secretary of State, for the sake of all-important political transparency.
While Clinton did ultimately cooperate and announce that she had asked the State Department to go ahead and release her emails, it didn’t exactly quell the public outcry for total honesty.
Beyond her truly controversial decision to use a private email account for government affairs, Clinton faced scrutiny for her family foundation’s acceptance of donations from Middle Eastern countries that suppress women’s rights. Both of these recent scandals require Clinton to be perfectly transparent about the ethics of her actions.
But is that ever enough?
It’s not just the blatant issues themselves that will naturally plague Clinton’s campaign — that is, if she does ultimately announce her candidacy. Her response to the issues will define her campaign and could make or break it.
Take a look at New Jersey Gov. Chris Christie, whose dreams of a lofty presidential nomination are fading fast. Some might say that Christie is too transparent, too brash and direct in his response to questions from opponents. An article from the New York Times on Thursday, Feb. 26, quoted Christie as saying, “Sometimes people need to be told to sit down and shut up.” Does Christie own up to his mistakes, such as his involvement in the Bridgegate scandal of 2014? Perhaps, but his blunt honesty isn’t appealing to many voters.
Political consultants, such as Patrick Davis from Colorado, believe that “Christie’s brashness may work for New Jersey voters, but he did not think it would play well in Iowa, site of the first presidential nominating contest,” according to the Times.
Similarly, claims of any type of transparency may turn out to be hypocritical. Jeb Bush’s call for Clinton to release her emails preceded the revelation that he took seven years to release his own private emails to the press, according to the Times.
It seems suspicious that Bush waited until long after leaving office to release his emails but has taken the last few weeks to publicly pride himself on his own political transparency. How he deals with this new revelation could seal his fate in terms of a nomination.
While Clinton held a press conference to discuss the email controversy, among other pressing issues, journalists did not note her willingness to address the problem up front. They did not laud her for her honesty, or for her decision to make her private emails public information. Instead, the Times noted that she held a “defensive” stance when taking questions from reporters, and that “It had taken eight days for Mrs. Clinton to make herself available for questions. And long before the questions ran out, she began packing up her binder.”
Transparency is, without a doubt, a bipartisan issue. No politician is immune to the occasional slip-up, as the 24-hour news cycle makes abundantly clear. But the universal cry for transparency makes worthy presidential candidates ultimately seem incompetent or evasive.
Isn’t it time that we accept that our politicians aren’t always forthcoming about their mistakes, like any human being? And isn’t it time that we focus on issues that are more important than email records and bridge lane closings?
The upcoming 2016 presidential campaign will likely hinge on the candidates’ transparency, but what it should focus on is their stances on the hot-button issues.
After all, their own competency in addressing those issues isn’t contingent on how they handled minor mistakes in the past.
By Melissa Carter, President of Vox: Voices for Planned Parenthood
Twenty years ago, in 1995 at the World Conference on Women in Beijing, governments made a promise to women and solidified the concept that so-called “women’s rights” are about much more than women.
The conference marked a truly significant and vital turning point for gender equality, as 189 governments each signed a progressive blueprint for advancing women’s rights — and justice for all. Since then, we’ve seen a number of global milestones marking progress in achieving justice for all. Ellen Johnson Sirleaf became the first female president of an African country. The United Nations now formally recognizes the human rights of LGBTI people. A safe abortion protocol was enacted for the first time ever in Peru. The Green Belt Movement in Kenya has now empowered thousands of women to conserve the environment. The list goes on and on.
Today, the fight for women’s rights looks nothing like it did when our mothers and grandmothers were fighting it. Feminists from Malala Yousafzai to Janet Mock, from Emma Watson to Planned Parenthood Youth Peer Providers across Africa and Latin America, are revolutionizing the fight for women’s rights. And all of them have one thing in common: They are all 30 years old or younger.
At the organization Planned Parenthood, young activists just like us believe in a world where health has no borders. From New York City to Guatemala and beyond, our young health educators, activists and providers are committed to working with many different communities to ensure that everyone, regardless of race, class, nationality or gender, has access to the healthcare they need. They provide information, clinic referrals and even condoms to thousands of people. They meet with government representatives and organize campaigns to make sure that leaders are accountable so that no one is left behind when it comes to access to sexual and reproductive health.
This International Women’s Day, we stand strong with our fierce allies in shaping the most diverse movement for women’s rights yet. We are now doubling down on our commitment to guarantee that our government does its part to completely fulfill its 20-year-old promise to women.
Want to join us? Spread the news about where things stand for global women’s rights. Tell your senator to support the Global Democracy Promotion Act, which would benefit women and families around the world by ending the global gag rule and is expected to be reintroduced in early March in honor of International Women’s Day.
Like any other job seeker, one researches the company to which he or she is applying and any possible interview questions the employer may ask. There are standard questions that can typically be expected, like “Tell me about yourself” or “Why do you think you are a good candidate for the job?”
I went on an interview the other day, and one of the questions that surprised me was, “Describe one person that was different from you that you met since coming to college. How did they change you?” I always replay interviews in my head to reflect on what I can improve on. I thought a lot about this question after the interview. It was not until almost a week and a half later that I found the answer from six amazing girls.
This week at CAPS Peer Educator’s National Eating Disorders Association monologues, not only did I hear women speak about their eating disorders, but I heard them speak about their personal power, showing that they are strong individuals who can now overcome anything. By looking at these women, you would never guess the difficult battles they overcame with not only their mind, but their body, as well.
I have to admit, some of the women that spoke surprised me because I knew them. I did not expect to see them there, let alone speaking. Even knowing them and being in organizations with a few of them, I would have never been able to tell what they were going through.
Many people, and society as a whole, treat seeking help for an addiction or disorder as a negative thing. However, in reality, it takes a lot of bravery and courage to take those steps. Even a best friend or family member, who you think knows your deepest, darkest secrets, may never know what you are going through. It may not even be on purpose; the person may be living in denial.
If I were to answer the question, “Describe one person that was different from you that you met since coming to college” again, I would answer it like this. I had the honor of hearing six incredible women speak about their eating disorders. Some of these powerful women are people I see on a daily basis and thought I knew, but as it turns out, I did not really know them at all or what they were going through. Everyone is taught, “Don’t judge a book by its cover,” but not everyone is able to connect with it. After attending the NEDA Monologues and hearing their stories, I learned that nothing is ever as it appears. Never make assumptions, and most importantly, always be kind.
“Every single empire in its official discourse has said that it is not like all the others, that its circumstances are special, that it has a mission to enlighten, civilize, bring order and democracy, and that it uses force only as a last resort.”
So wrote the late and deeply missed Edward W. Said in the 2003 preface of his renowned book, “Orientalism,” first published 25 years earlier in 1978. What concerned Said in “Orientalism” was the recurring imagery of the so-called “Near East” (so-called because Said never viewed the terms “West” and “East” as ontologically and epistemologically stable — the regions were rather “imagined geographies” based on a certain historical perspective).
For Said, there was nothing objective about our knowledge of the Middle East; it was created, maintained, financed and pursued for exercising power and control over a demarcated region and people.
“Orientalism,” then, was a corpus of knowledge, scholarly and now today part of the mass media, designed to craft a distinct image of Middle Eastern people — the strange, menacing and unreasonable Arabs, the Islamic suicide bomber, the sensual women of the East — all corresponding as threats posed to the “civilized” West, with its emphasis on democracy and freedom.
These social constructions were necessary fictions in order to create public support first for British and French colonialism; and second, at the conclusion of WWII and dawn of American Empire, for an expansive and hegemonic American foreign policy. Today, “Orientalism” still remains an integral part of U.S. international affairs rhetoric and delimits the conditions for possible statements, what French philosopher Michel Foucault called episteme, about the region as a whole.
In a recent speech to Congress, for example, Israeli Prime Minister Benjamin Netanyahu conflated Da’ish (ISIS) and Iran, as though they were two sides of the same coin.
“Don’t be fooled. The battle between Iran and ISIS doesn’t turn Iran into a friend of America. Iran and ISIS are competing for the crown of militant Islam. One calls itself the Islamic Republic. The other calls itself the Islamic State. Both want to impose a militant Islamic empire first on the region and then on the entire world. They just disagree among themselves who will be the ruler of that empire,” Netanyahu said.
Is there similarity between a small band of brutal militants who have seized control over incongruous sections of Iraq, routinely committing acts of heinous slaughter, and a nation-state that has signed the Nuclear Non-Proliferation Treaty and whose Supreme Leader, Ayatollah Ali Khamenei, issued a fatwa against nuclear weapons?
What Netanyahu has effectively done is essentialize two fundamentally different groups of people on the single commonality that they generally belong to the religion of Islam. Ignore the fact that Iran decrees Shia Islam as its official religion in direct opposition to Da’ish and its radical version of Sunni Salafism, a vision no country in the region sees as legitimate.
Disregard the fact that Iran is currently leading the fight to retake Tikrit with Iraqi troops and U.S. oversight. Blatantly ignore the reality that Iran is not brutally slaughtering innocent people, and instead, is the forefront opposition to Da’ish. Then, and only then, can the absurdity of comparing Iran and Da’ish emerge as a logical deduction.
Further, the commitment of Iran to a nuclear-free zone is not as suspect as Netanyahu alleges. Gareth Porter in Foreign Policy offers a vivid example.
When Saddam Hussein used chemical weapons, primarily mustard gas and the nerve agent tabun, to kill 20,000 Iranians during the Iran-Iraq War, Mohsen Rafighdoost, the minister of the Islamic Revolutionary Guard Corps (IRGC) throughout the eight-year war, offered plans to create chemical and nuclear weapons in order to retaliate against Iraq. But the Islamic Republic’s first Supreme Leader, Ayatollah Ruhollah Khomeini, firmly persisted in his opposition to nuclear weapons.
“It doesn’t matter whether it is on the battlefield or in cities; we are against this. It is haram [forbidden] to produce such weapons,” he said.
Rafighdoost later described the encounter: “Imam told me that, instead of producing chemical or biological weapons, we should produce defensive protection for our troops, like gas masks and atropine.”
The great irony, of course, is that while Iran remained steadfast against using chemical and nuclear weapons, the U.S. intelligence community provided imagery and maps of troop locations for Iraq, fully aware of the intent to use sarin and mustard gas against Iran.
“The Iraqis never told us they intended to use nerve gas,” retired Air Force Col. Rick Francona told Foreign Policy. “They didn’t have to. We already knew.”
Orientalism has a distinct way of producing identities that are knowable, timeless and predictable. Netanyahu seems to say, “Don’t trust the wily Arabs; they are just waiting to trick us. They’re like all the rest.”
Netanyahu can make these outrageous comparisons because he knows the dominant framework — what Thomas Kuhn called a paradigm — still remains Orientalism in America.
It is easy to dismiss the virulent racism of the early Western colonial project, proclaiming with a certain smugness that those men were “of their time” and that “we don’t have those problems anymore.” However, that rhetorical gesture conceals, rather than examines, the very real Orientalism that exists today — an Orientalism that garners standing ovations from the leading members of our Congress.
It should, of course, worry most individuals just how easily Netanyahu can come to America and speak with a historical Orientalist contempt against Iran and then be celebrated in American public discourse.
This is undoubtedly a strong act of warmongering, but it gains legitimacy through our own fears and prejudices rather than through the truth.
It used to be that, for someone to become famous, they had to go through the right channels — auditioning, hiring an agent and a publicist, slowly and steadily gaining recognition.
Now, all someone has to do to reach celebrity status is create their own YouTube channel.
The proliferation of social media has made fame — or rather, infamy — more accessible to the general public. Look at “Alex from Target,” for example. Before a teenage customer snapped a photo of the cashier and uploaded it to Twitter, he was just an average 16-year-old from Texas. Within a week of his photo going viral online, he was fielding interviews from major news outlets like the New York Times and appearing on “Ellen.”
It’s easier than ever for someone to find the spotlight, through media platforms like YouTube, Twitter and Vine. Conversely, it’s harder than ever for celebrities to avoid the spotlight for the same reasons.
There are infinite examples of celebrities trying to shirk their fame, or at least hide from public scrutiny. Sia, the “Chandelier” singer who performs with her back facing the audience; Shia LaBeouf, who once donned a paper bag over his head bearing the message “#I AM NOT FAMOUS ANYMORE”; Demi Lovato, whose struggles with an eating disorder and self-harm forced her to step back into the shadows for a period of rehabilitation; the list goes on and on. And yet their efforts to keep a low profile often backfire.
Celebrities are nothing new. But the idea of instant fame, or of someone’s private life being completely eclipsed by the demands of the public, is. And it’s extremely dangerous.
We live in a world where private information is freely accessible, where hackers can seize intimate photos from Jennifer Lawrence’s iPhone and post them online for everyone to scrutinize, where breaking news of Lovato’s eating disorder becomes a public forum and allows people to make cruel comments about her appearance.
“Just because I’m a public figure, just because I’m an actress, does not mean that I asked for this,” Lawrence said in a November 2014 interview with Vanity Fair. “It does not mean that it comes with the territory.”
No wonder Sia wants to shield herself from the public eye.
While celebrities are public figures and subject to a lot of media attention, there is a fine line between free expression and an invasion of privacy.
Alex Lee, better known as “Alex from Target,” didn’t ask for attention. Fame was thrust upon him when his photo hit the internet and began wildly circulating.
“I’m kind of scared to go in public,” Lee told the New York Times in November. His sudden fame was overwhelming, unexpected and, quite simply, not always desirable. He would sometimes receive “death threats” via Twitter, and his family’s personal information, “including Social Security numbers, bank accounts and phone records,” was leaked online as well.
It’s natural for the public to latch onto a celebrity and to have a vested interest in both their public and private lives. But fame, while fleeting, is often cruel and unfairly revealing. The obsession with fame often leads to a breach of privacy and a breach of ethical conduct.
Teenagers are often viewed by society as partying all the time and hanging out with their friends 24/7. A typical assumption about college students is that they are consuming illegal beverages and substances at all times without focusing on class work or studying. If you think this is how students live their lives, then think again.
According to a new report from U.S. News & World Report, college students are studying more and socializing less.
In 2014, the University of California, Los Angeles published The American Freshman Survey, with data from over 150,000 freshmen enrolled in over 200 colleges across the United States. Compared to students in 1987, current students spend less time socializing.
According to the survey, just 38 percent of students reported spending less than five hours with friends per week, while 18 percent said they spend more than 16 hours around others. In 1987, the majority of students said that they socialized more than 16 hours a week.
The report raises the question: why the drastic change?
Young adults are focusing more on getting good grades now as colleges become more demanding and selective compared to 1987. Today, colleges, including Ivy League schools, have all-time low admission rates. Stanford University, for example, has a 5 percent acceptance rate, and Princeton University has just a 7 percent acceptance rate.
Students in high school are conditioned to excel in everything, from acing the SATs to holding top positions in clubs and playing varsity sports, all to receive that precious acceptance letter from the best possible college. It is drilled into students from the instant they walk through those doors freshman year just how important and limited their time in high school truly is. With the pressures of doing well in school and extracurricular activities, students have less time to socialize and instead learn to put their education first at all times.
“It’s required to have higher education for jobs now,” freshman open-options humanities and social sciences major Emily Loevy said. “Where in the past you used to need a master’s degree, now you need a PhD. It’s more competitive in the world.”
By the time students start college, the instinct to do well and be involved on campus does not simply disappear. With limited time to socialize with friends, the routine of focusing more on schoolwork becomes normal. Many students do not want to drink or do drugs because they want to focus on doing well in their classes.
“It’s so much money to get here that it’s a waste otherwise,” freshman nursing major Madison Lacken said.
Thinking back to those first few days of freshman year, everyone was excited to finally be here at the College. Doors in the hallways were always propped open, and people were constantly in and out of each other’s rooms. Then, classes started and almost everything changed.
“When classes began, I learned to balance my time better,” freshman computer science major Giacomo Corcione said. “I am now able to hang out with my friends a lot less.”
Gradually, doors began to shut and many ran to the library to get their work done. The entire atmosphere seemed to shift within a day. Students love to have fun, but they know when it is the right time to hang out with friends and when it is appropriate to buckle down and get work done on time.
“(In this semester), now that we have classes that have to do with our major, you have to pull yourself away from your friends to focus on individual schoolwork,” freshman open-options business major Holly Billand said.
While classwork is, of course, vital, it is still healthy to interact with friends. By studying in groups, an individual can be surrounded by friends while also being productive.
Contrary to what many may believe, students are devoted to their work and understand the importance of studying hard. Small steps may just be the answer to a better, stronger balance of schoolwork and socialization.
In the aftermath of last weekend’s blizzard, I watched helplessly as another student slipped on the slush but was too focused on her phone to even notice. As a society, individuals have become so desensitized to embarrassing blunders like this that witnessing distracted students jaywalk in front of cars or walk into someone without apologizing is a part of daily routine.
And yet I still hope that they will notice how rude they are being and pull their attention away from their phone in time to see what is happening all around them.
According to a report published in October 2014 by Safe Kids Worldwide, a teen pedestrian in the United States is injured or killed after being hit by a car every hour. Of teens who have been hit or almost hit while crossing the street, 47 percent were listening to music, 18 percent were texting and 20 percent were talking on the phone, according to the same study.
These numbers are too high.
The statistics highlight an alarming number of people who are downright obsessed with their phones. Next time you are in Eickhoff Hall, look around at everyone eating. A majority of people who eat by themselves cannot stand being alone, so they turn to social media, games or texting, all of which suck them into a virtual world so the real one seems less lonely.
There are also those who eat in a group but cannot pull themselves away from their phone long enough to join in on the conversation. With their eyes glued to their screen, these people miss contributing to great conversations, developing social skills and bonding with new people.
This obsession can also hurt friendships. When an individual vents or needs consoling, they are often ignored and possibly convinced that their problems are not important enough to discuss in conversation. People become so absorbed in checking their Yik Yak that they unintentionally neglect their own friends.
Oftentimes, even when someone travels, they become so captivated by taking selfies or Snapchatting that they forget to appreciate the new environment they are in. Foreign countries are remembered by photos on a screen, not by the experience of having been there. Even at concerts and parties, the priority for many has become taking a ton of pictures and videos, not enjoying the new surroundings.
It is important for people to remember to take a step back from the phone and stop trying to document every little detail rather than live it. One should not deal with awkward moments or intimidating situations by turning to their phone to avoid learning how to cope. Facing real places, people and even emotions helps someone mature. Inhibiting that growth at this pivotal point in our lives will only make the transition into adulthood more difficult than it already is.
As Edward Norton’s character Mike Shiner said in the Oscar winner, “Birdman,” “Stop looking at the world through your cell phones. Have a real experience.” I could not agree more.
Unless you have lived under a rock for the past few weeks, you have probably heard about the measles outbreak. While the recent epidemic is loosely traced to Disneyland in California, a lack of vaccinations against the disease is strongly to blame.
The disease, which causes a fever, sore throat and rash, is one of the leading causes of death for children worldwide. With a reported 644 cases in 27 states, this is the largest outbreak in the U.S. since measles was declared eliminated in 2000 by the Centers for Disease Control and Prevention.
While what sparked the string of cases is unknown, if those kids had been vaccinated as doctors recommend, they would not be fighting for their lives. There is no reason in the 21st century for people not to vaccinate their kids against diseases that spread so easily and are known to cause serious harm.
In December 2014, it was first reported that cases of measles could be traced back to Disneyland, with 42 of the state’s 59 cases at the time having been linked to the park. Nine other cases in people living outside of California have also been traced back to Disneyland, according to CNN.com.
Many fear the vaccine because they believe it causes neurological disorders such as autism. Claims that supposedly link the measles vaccine to autism are not only absurd — they are unfounded.
Scientific evidence has found no link between the measles, mumps and rubella (MMR) vaccine and the “onset of developmental disorders such as autism,” according to healthmap.org. Studies show that the average age at which some children begin to display symptoms of autism happens to coincide with the age at which the vaccine is typically given. There is no direct link between the two, according to PolitiFact.com.
So, even when scientists have proven these claims to be false, why is the belief so strong? In 1998, the British medical journal The Lancet published a paper by Andrew Wakefield claiming a link between the MMR vaccine and autism. Upon further examination, it was revealed that his study included only about 12 children, that some of the work was faked and that he was “paid by lawyers for parents of children in the study,” according to PolitiFact.com.
But the damage was done.
Wakefield started a firestorm that spread around the globe and that many still hold to be true. There is no reason for anyone with an open mind to still believe the lies of a discredited medical researcher. Those who refuse vaccination for fear of other disorders are being ridiculous, especially when the supposed correlation has been proven false.
In 2010, model, actress, author and anti-vaccine activist Jenny McCarthy wrote an article for The Huffington Post stating her views on why vaccines do, in fact, have a correlation with autism. She noted that a Time magazine article on the autism debate claimed that experts are certain “vaccines don’t cause autism.” However, she refuted this by saying, “That’s a lie and we’re sick of it.” Notably, McCarthy has a 12-year-old son with autism.
When public figures begin spouting these unfounded claims, the general public is more likely to believe them. Product advertisement is pure proof of this. Society has an infatuation with celebrities, and when they start preaching about something or promote a specific product, others follow them.
The recent outbreak has been largely blamed on the anti-vaccination movement sweeping the nation. Since Thursday, Jan. 1, alone, there have been at least 121 cases reported in 17 states. Earlier, in 2014, there was an outbreak among unvaccinated Amish communities in Ohio, where 383 cases were reported, according to cdc.gov.
There is a certain understanding that those who do not believe in vaccinations for religious reasons have a right to their views, yet when those beliefs begin to negatively affect the larger community, it is a problem.
Unless a direct link is found showing that the MMR vaccine causes serious harm, there is no reason why every child should not be given the shot.
It’s the magical day of red roses, oversized stuffed teddy bears and everything chocolate. Most people clear their schedules for this romantic day in February, having spent days, or oftentimes weeks planning ahead for it. But with all the stress of Valentine’s Day, is it simply an overrated holiday, or does it actually hold significant meaning?
There are 365 days in a year, and instead of just waiting until Feb. 14 to shower significant others with love, it should be done all year long. Many people rush to the florist or the jewelers to prepare for this special day, but there is nothing wrong with buying someone heart-shaped chocolates in July or October. Love should be expressed year round, not just on this one day.
Now, I am not saying I hate Valentine’s Day. I love it, and I think the holiday has good intentions. It’s a day to show loved ones how much they mean to you. Whether you’re single or not, everyone has someone important in their life. I simply hate the hype of showing affection for a loved one like it is an unheard-of affair.
Valentine’s Day has become a commercialized holiday, almost taking away its significance. According to CNN.com, an estimated $18.6 billion is spent on the romantic holiday each year: $1.6 billion on candy, $1.9 billion on flowers and $4.4 billion on diamonds, gold and silver. All of this goes into the pockets of companies looking to make money, not toward the price of true romance.
Individuals dish out ridiculous sums of money to spoil their loved ones with beautiful lockets and the finest chocolate-covered strawberries, all in order to give them a magical day. Showing someone that you love them shouldn’t require reservations at the fanciest restaurant or buying the most elegant jewelry. All of your love for someone shouldn’t be squeezed into just one day, with one set of gifts. Love should be expressed all the time, and not just with material items.
Walk into any department store the week after Christmas, and already you’ll be overwhelmed by the amount of red, pink and heart-shaped items lurking on the shelves. Come the beginning of February, every other commercial on TV will be love-focused or Valentine’s Day related. This is all fine, but it puts such an emphasis on what you should buy for your loved ones instead of what you can do for them to show you’re thinking about them.
Most people would love a personalized song, poem or even a card. All are simple gestures that are different from a typical, generic store-bought one. Plan a day trip to their favorite spot for a unique adventure — do something they’ll remember. Flowers will die, chocolates will spoil and jewelry can easily get lost, but good memories won’t ever fade.
Material items are sweet to receive, and there’s nothing wrong with giving them; they just shouldn’t be the sole focus or overdone. There’s only so much a giant teddy bear or a charming bouquet can say as opposed to a caring action.
Don’t stress yourself out too much over Valentine’s Day. You have 364 other days in a year to show your loved ones just how deeply you care, and you don’t have to break the bank to pamper them.
The College of New Jersey Student Newspaper Since 1885