Opinion

For Better or Worse: Breaking Down the Barriers

As UCSD students begin yet another quarter at our fine university, they will endure a second onslaught of tests, midterms and finals. Amid all the drama and excitement that typify college, I begin to wonder how many students take time to truly reflect on the process by which many of them were granted admittance.

Because of the passage of Proposition 209 a few years back, many of today’s students have not had to worry about dealing first-hand with the effects of affirmative action. Since 1995, the debate over the legitimacy of affirmative action has raged, fueled mostly by conservatives such as Pete Wilson, Dinesh D’Souza and Ward Connerly, though many so-called liberals, too, have been vocal members of the opposition.

Many opponents of affirmative action in California bring up the familiar argument that such practices are racist and support a form of “reverse discrimination,” wherein certain more qualified students are denied admittance to UC schools in favor of less qualified minority students. They go on to argue that even though admissions boards may no longer maintain specific racial quotas when deciding whom to accept and whom to deny, they are still unfairly influenced by a racial agenda.

The 1978 case of Regents of the University of California v. Bakke is often cited by affirmative action critics to support their stance. In it, Allan Bakke sued the University of California for twice denying him admittance to the UC Davis School of Medicine. According to Bakke, although his test scores were higher than those of many of the minority applicants the school admitted, he was still rejected because he was white. Ultimately, the case went to the Supreme Court, where the justices ruled 5-4 that UC should admit Bakke. The decision further stated that schools cannot use racial quotas in determining whom they accept; however, it left open the practice of considering race in admissions to encourage diversity.

While I agree with the notion that students should be judged by their own merits, I also recognize that we do not live in a perfect world. We may wish to judge students on an even playing field, but the fact of the matter is that not everyone is on an even playing field. To assume otherwise would be a naive and misguided attempt to ignore all the inequality that exists in our society today. This is where the case against affirmative action is weakest.

Too often, affirmative action has been scrutinized solely for the moral and ethical issues it raises. What is often forgotten in the debate are the systems of discrimination that created a need for such a program in the first place. For opponents to suggest that admissions boards judge a person solely by his or her own merits, as if we were all equal, would be a dire mistake; it conveniently implies that our society is and has always been cured of the prevailing disease known as racism. Unfortunately, this simply is not the case. Historically, groups of people have been disenfranchised and marginalized in numerous institutions exclusively on the basis of their race.

Affirmative action should be seen as a necessary, albeit flawed, remedy for creating greater opportunities for groups who normally experience considerable inequalities. Without a program like affirmative action, we would have a system that only serves to further polarize and segregate our society. In many ways, this is precisely what has happened over the past few years at UC campuses, as well as at colleges across the nation.
Minorities in general are increasingly absent from most student populations, particularly at elite colleges. UC Berkeley and UCLA have suffered from a lack of diversity as a result of Prop. 209 and its abolishment of affirmative action in state programs. UCSD is definitely no exception in this regard.

At a recent Graduate Diversity Conference hosted by UCLA, Patricia Gurin, a professor at the University of Michigan and an expert witness in an affirmative action suit against that university, expressed the crucial role programs such as affirmative action play in society. “Knowledge is not just what you think or what you know, but how you think and how you discover new information. This is affected by diversity,” Gurin said.

Diversity should not necessarily be the narrow goal of affirmative action, but more of a positive byproduct. The true goal of affirmative action, and the reason it should be reinstated in the UC system, has always been social justice. We can no longer pretend to live in a “color-blind” society in which race and socio-economic status do not matter in today’s America. It is time we remind ourselves that race does matter, regardless of how much we buy into the lie that we are all equal, or at least treated as such. Affirmative action, or some form of it, will always be necessary so long as we live in a society that continues to create inequalities between a majority that is content to maintain the status quo and a minority that struggles to conform to the mainstream social and cultural order. ...

Author Offers Tips for UCSD Freshmen

Lawdamercy, 10th week is upon us like stink on a York bathroom, which is to say it’s already halfway down our throats before we have time to think about gagging. Those who have done their time at this fine university know whereof I speak, and those who have not are frosh. But worry not, oh freshly ones, for you too will taste the fetid joy that is a York Hall chemistry lecture. But are there other joys of UCSD living you may be missing? Not if you follow this handy-dandy, ultra-indispensable, stick-in-your-pocket-for-when-you-want-to-save-your-gum list of The Top Ten Things All Frosh Must Do By The End Of The Year! It may be too late in the quarter for some of these things, but don’t get discouraged; you still have winter and spring.

Numero uno: Get your athletic money’s worth. Go to a damn sports game. College isn’t college without painting yourself with zit-causing greasepaint and screaming at some poor out-of-town team that’s getting utterly crushed, and our fair Tritons do an amazing job of regularly providing us with a chance to do just that. But that’s not enough. You also have to join an intramural sports team. I mean you have to. Even if you join as a cheerleader or goal post, get out there for the informal sports fun. And you have to use RIMAC. Or RiMAC. Or whatever it is. At least because we’re all paying out the ass for it. And play a pick-up game of wall-ball with me next to the ATMs in the Student Center.

Numero dos: Get your groove on. You have to go to enough parties, no joke, and at least once get puked on, puke on someone else, watch your crush puke, or something along these lines. This is college, folks. It’s gross, but it’ll prepare you for dirty diapers. You also have to actually participate, at least once, in substance use. Don’t break any laws I can get blamed for. Be the sober babysitter for a bunch of drunks/trippers/what-have-you, if staying straight suits you. Personally, I love sobriety, but I never knew how much so until I used drugs.

Numero tres: Get educated. When you are sure you are sober, go to your professors’ office hours. You are peeing money into the gutter if you don’t. These academic idiots and geniuses are on your payroll (if you ignore the huge state subsidies). Bringing homework or questions about a test does not count. Go to talk about ideas and information, not grades and exams. Remember the word “learning?”

Numero cuatro: Get the hell off campus. TJ with your suitemates only counts if it’s in the daytime. Downtown San Diego counts if you talk to a street person for more than five minutes. If you go to Planet Hollywood, fugedaboutit. The point here is to experience real off-campus culture, not prepackaged consumer goods. Going somewhere in La Jolla doesn’t count, ever, unless it’s $1 beer night at Karl Strauss and you sneak in and give a big tip to my housemate (the buxom blonde with freckles whose name starts with “A”). Chula Vista counts all the time. So does Joshua Tree. I recommend the excellent services of the Outback Adventures Office, located right behind the notoriously hard-to-find Roosevelt College.

Numero cinco: Find Roosevelt College. Find all of them, you putz, but especially Roosevelt, ’cause those kids are lonely … and often get lost trying to find their way home.

Numero seis: Get to know your government. The A.S. Council takes around $20 every quarter from you, me and that hottie whose window you tried to peek in last week (you thought no one could see you?).
It ends up with a million or so of our money, yours and mine, and they spend it … on us? In theory, yes, and actually this year’s council seems to be doing a good job of it, but how would you know? Have you ever gone to a meeting?

Numero siete: Get religious. College exists for experimentation, not just in the lab or between the bed sheets, but deep within yourself. If you have any religious background, try the services offered for your faith, at least once. If you’ve got nothing in the way of religiosity, sample around! Make sure you’re not missing something! This campus is an all-you-can-eat spiritual buffet, and you can get up from the table any time.

Numero ocho: Get a job. Learn how to work for money while you study for grades. Feel lucky if you have the good fortune to be able to choose not to.

Numero nueve: Get exposed. This does not refer to getting naked on Black’s Beach. Challenge yourself here, by broadening your social horizons. Go to an LGBTA dance. Go to a Rush party. Try the Shinai stick fights, if you missed the Darkstar Halloween Orgy. Try your hand at radio at KSDT. Kiss a custodian. And try the $4 all-you-can-eat vegetarian food at the Che Cafe on Thursday nights.

Numero diez: Get exposed. This does refer to getting naked on Black’s Beach. ...

Columnist Fails to Be Hip After Much Effort

I’ve always known I would never be a part of the “in crowd.” My physique, my clothes, even my dirty, chipped nails suggest I would never quite fit in with the cool, hip students at UCSD. Until now, I was comfortable with the fact that I had no idea how much a pair of Gucci leather boots cost (although I can accurately hypothesize that they probably cost more than every hair removal treatment I’ve ever had). Yet, something — or should I say someone — ruined it all. A houseguest with the exotic name of Aida (after Verdi’s opera) entered my life and threatened to change it forever.

The problem I have with beautiful, energetic and intelligent women is that they are extremely annoying. The reason is not just that they usually get every single guy who happens to glance their way, but also that they unintentionally (sometimes, at least) make me realize how inadequate I am as a woman. All right, so I’m jealous. But wouldn’t you be if you had a houseguest who not only hung out with Axl Rose a few weeks ago but also manages to squeeze into tight jeans without one single ounce of fat bulging at the creases?

Everyone has an Aida in her life. In fact, everyone should have one. They’re happy, bubbly people who also happen to have tons of guys running after them at every opportunity. Yet, what happens when you’re not the Aida? What happens when you’re just a shadow of an Aida? One answer: Wallow in your misery.

Until this New York emigre entered my household, I thought I was extremely hip. Fine, so I’m not being totally honest. However, I still felt like I had the potential to be part of the “hip” crowd, the crowd everyone yearns to belong to but which requires too much time and energy for the average person. Never mind that I’d have to lose 15 pounds (although some may argue 30), have a flawless face (courtesy of a lot of too-expensive foundation) and also know the difference between a Fendi bag and a Prada bag. I still believed, one might argue vainly, that I had a chance. That was until a truly hip guy magnet who gets into every L.A. club like a hot celebrity entered my abode and created my identity crisis.

Perhaps I’m being immature. Aren’t college students supposed to see beyond the immature games high school students play to see who is the prettiest, skinniest, tallest, etc.? Still, I can’t help thinking that all of you readers have felt inferior to someone, either on this campus or off, at one time or another, so I excuse myself for making foolish comparisons to someone who is much “cooler” than me.

My mother may argue that I’m a jealous idiot, although she probably wouldn’t put it in such vulgar terms, but I feel like declaring war against this 24-year-old, whose charm managed to get her into a rock star’s home, to get her free coffee at Starbucks (while I suffer and get only what I pay for) and to snag guys’ phone numbers like a spider does flies.

I thought I could learn from her and become “hipper,” in hopes that I too could lure attractive young men (psychos need not apply) while still holding onto a shred of dignity. I listened as she told me how to play the game (“Act confident, girls, and don’t forget to be respectful!”) and watched in a mirror as she completed my makeover. Voila! I was transformed, temporarily at least. And, of course, I got compliments as my “teacher” smiled proudly. Yet something inside me felt superficial. Even when I went to the mall and listened to this potential model explain to me how to wear clothes that looked good on my hips, I felt like a poser.
This wasn’t me. I couldn’t understand the beauty of a pair of hand-stitched leather pants even if I tried. Worse, when I tried to bat my lashes at an unsuspecting gentleman, he looked at me with concern rather than desire. Sure, I’m tragically unhip compared to this girl. But maybe it’s “OK.” I just hope UCSD students will forgive me. ...

President Clinton's Legacy Is a Mixed One, Both Good and Bad

The excitement of Election Day has come and gone, and all the controversy surrounding the selection of the next president is finally coming to an end, so slow down and catch your breath. At last, the consequences of what transpired Nov. 7 and in the weeks that followed can be fully analyzed. We now know who won which elections and can debate what their victories will mean when they take office.

On the other hand, I have decided not to do that. I will not write about my thoughts on the new president, the “chad,” Florida or any of that. I even promise not to mention the new president’s name. Instead, I will focus on the outgoing president and what transpired over the past eight years, and perhaps give you a new take on his presidency.

Love him or hate him, William Jefferson Clinton, the 42nd president of the United States, is nearing the end of his second term. Affectionately considered a “lame duck” by students of political science, Clinton has spent the last few months of his presidency out of the limelight, not able to do much with Congress, as he no longer has any bargaining leverage. When the new president is inaugurated, exactly eight years will have passed from the time of Clinton’s inauguration. Looking back, we cannot help but wonder what mark Clinton will leave on America. What single event will a person first remember when the name “Clinton” is evoked? In other words, what is Clinton’s legacy for the American people?

Most presidents of the 20th century carry a legacy. Franklin Roosevelt led us through World War II and left us with the New Deal. Lyndon Johnson integrated the country with the Civil Rights Act and the Great Society, but tore it in half with the Vietnam War. The bitter legacy that Richard Nixon left continues to affect how Americans view their government. Ronald Reagan left Americans a legacy of unfulfilled possibilities, which were only realized later in the Clinton administration. That, and a huge debt. George Bush, after the fall of the Berlin Wall and the end of communism, left Americans with a new sense of national pride. What will be Clinton’s legacy?

When considering what Clinton has and has not accomplished over the past eight years, an interesting aspect should be pointed out: Clinton seems to epitomize a kind of self-juxtaposition, not only in policies but in character as well. To determine his legacy, both must be examined.

One option that Clinton can claim as his legacy is the tremendously strong economy. Clinton entered the White House while the economy was in a recession. When he leaves office, the country will be coming off one of its largest economic booms in history. The New Economy has made many Americans very wealthy and has brought about advancements in technology. Is it fair to say that we can thank Clinton for these wonderful times? Unfortunately for him and for Vice President Al Gore, both of whom continuously remind people that the great boom was of their doing, it is simply not true. Many economists agree that the economic boom started in the last year of Bush’s term, and perhaps even reaches back to Reagan’s substantial tax cut. Hence, Clinton’s legacy cannot and should not be equated with the New Economy. In reality, a president has very little control over how the economy is managed. Even if you believe the economy can be steered at all, the president has much less influence over it than most Americans believe he does. He cannot directly raise taxes to slow growth nor cut taxes to stimulate it.
Perhaps Clinton’s best move for the economy was the reappointment of the omniscient Alan Greenspan as chairman of the Federal Reserve. Greenspan, along with the genius of former Secretary of the Treasury Robert Rubin, kept the economy going strong.

Another option for Clinton’s legacy would be foreign affairs. For most of his first term, Clinton seemingly fumbled through many hotspots, failing in Haiti and Somalia. The violence in the Balkans, and the United States’ confused stance on it, only further proved Clinton’s inadequacies in foreign affairs. Into his second term, however, Clinton made some very important decisions, namely his appointments of Madeleine Albright as secretary of state and the hard-nosed William Cohen as secretary of defense. Thanks to them, Clinton’s foreign affairs record has been tremendous, with successes in helping create peace in Northern Ireland and Bosnia. His greatest foreign affairs accomplishment, believe it or not, was forging the peace between Israel and the Palestinians before this latest violence. Not since Carter had a president been so involved with the Middle East, forging peace accords between Yasser Arafat and Yitzhak Rabin and, after Rabin’s death, Benjamin Netanyahu.

Despite Clinton’s obvious and often criticized attempt to base his legacy on foreign affairs, he accomplished much more with the issues at home. Clinton again started off on the wrong foot with national issues. He forced a tax increase through Congress, which later defeated his highly touted health care reform bill. This was an embarrassment to the administration, one that spilled into the 1994 congressional elections, resulting in the Republican takeover of Congress. Near the end of his first term and into his second, there was a turnaround. Always the crafty politician, Clinton moved from the left to the center, referring to himself and his followers as “New Democrats.” He took the credit for the Welfare Reform Bill of 1996 away from Republicans and pushed a minimum-wage increase through Congress. Thanks to the economic boom, the massive national deficit dwindled; it is predicted to eventually turn into a trillion-dollar surplus. This surplus, however, is as shaky as a house of cards. It is based entirely on the capital gains taxes from the bull market: Where the market goes, the surplus will follow.

Lastly, one cannot ignore the complicated nature of Clinton’s character, nor his constant battle with scandal. As mentioned before, Clinton seems to epitomize a duality. Here is a man who was on top of everything, the leader of the most powerful nation in the world. Yet he fell in the eyes of Americans and the world with the emergence of the Monica Lewinsky fiasco. It all culminated in 1998 with Clinton’s impeachment by the House and his trial in the Senate. Will this be his legacy: to be forever scarred by scandal as only the second president to be impeached?

Looking at all of these events, from the economic boom to the peace accords in the Middle East and Bosnia, from the changes and attempted changes to the welfare state to his impeachment, what can we say will ultimately be Clinton’s legacy? What will Americans remember of Clinton after he leaves office? My answer: everything. All of these events will be part of Clinton’s legacy for America, the good and the bad. And, though many would argue with me, this is the sign of a good president. Clinton had a hand in almost every arena possible, like Franklin Roosevelt 60 years before him.
Everything that Clinton did, all the peace accords and all the scandals, is remembered because of its importance to the economy, the world and our society. Even the economic boom will be credited, if unfairly, to Clinton. As mentioned before, all of this culminates in one single aspect: his character. His dueling personalities, coupled with his accomplishments and defeats, reflect that character. He was a great president brought down by his human, carnal desires. This will be Clinton’s legacy: the duality of his accomplishments and defeats, which reflected the duality of his nature.

As promised, I did not mention the new president’s name. I cannot tell you what will happen in the next four years. What I can tell you is that our new president will be inheriting a nation from a great president, a great leader and, most importantly, a fallible human being. ...

Editorials

The UCSD Guardian is published twice a week at the University of California, San Diego. Contents © 2000. Views expressed herein represent the majority vote of the editorial board, and are not necessarily those of the UC Board of Regents, the ASUCSD, nor the entire Guardian staff.

Vincent Gragnani, Editor in Chief; Bill Burger, Managing Editor; Jeffrey White, Copy Editor; Tom Vu, Opinion Editor; Lauren I. Coartney, News Editor; Robert Fulton, Sports Editor; David Pilz, Photo Editor

With the announcement of the certified vote in Florida on Sunday afternoon, Texas Gov. George W. Bush has been unofficially named the president-elect of the United States. This, however, has not stopped Vice President Al Gore from contesting the results of the election on several counts in an attempt to have the decision turned in his favor. The Guardian believes that it would be in the best interest of the Democratic Party if Gore conceded the election now and looked toward the 2004 election.

Was Gore cheated? Possibly. Did more people vote for him than for Bush? Definitely. Does he, by all rights, deserve to be the next president of the United States? Perhaps. Despite all this, it would be better for Gore’s party if he conceded now.

First of all, by conceding, Gore would make the Democrats appear to have the best interests of the presidency and the country in mind. This would plant a seed in the minds of voters that the Democratic Party is attempting to do away with partisan politics, a practice against which the public claims to be rebelling. With this thought entrenched in their minds, voters would be much more likely to elect a Democratic president in four years.

Second, winning this election is not much of a prize anymore. Whoever does win will be labeled a phony or counterfeit president and will likely not be given the support and power that the office normally carries. After four years of a weak Republican president, the nation will likely vote in a Democrat in the next election, whereas if Gore did happen to win, he would stifle the candidacies of many qualified Democrats in 2004 and almost ensure that the next president is a Republican. At this point, it is almost certain that Democrats like Hillary Clinton and Dick Gephardt are holding out for a Gore concession.

A Gore loss would also put Joseph Lieberman back in Congress for six more years as a senator from Connecticut. In a Senate that will most likely be split 50-50, the loss of Lieberman to the vice presidency could be catastrophic for the Senate Democrats, giving the Republicans a slight majority.

Gore may have something to gain by contesting the results in Florida, but the Guardian feels he would better serve his party if he simply conceded the race and cut his losses. This election is obviously a very disappointing one for the Democratic Party, one it felt it should win because of the strength of the economy and President Bill Clinton’s two-term legacy. Despite how much it will hurt to lose the election, it is better to forfeit now than to go on contesting it and further alienate the American people. ...

Letters to the Editor

Editor:

In light of the recent developments in the Middle East, we strongly feel that this issue ought to be properly addressed. The events as described in the article “Peace Vigil Unites Students,” published on Nov. 16, give an unrealistic impression of peace in the Middle East. We have seen leaders such as Clinton, Barak and Arafat attempt to construct “peace” along illusory lines, which sadly but expectedly culminated in the latest Al-Aqsa Intifada. We can talk all we want about peace, but when the core human issues of justice and freedom are ignored, peace loses its meaning. We cannot achieve peace along false lines.

More importantly, the line in the article stating, “The vigil primarily concentrated on Christians, Jews and Muslims,” is questionable. If this was the case, then why was the event organized at the same time that the Muslim Student Association held its Islamic Awareness event, thus making it difficult for the group to attend? Secondly, the Arab Student Union and the American Arab Anti-Discrimination Committee were not contacted. If you ask us, the planning was shady and casts doubt on the intentions of the organizers.

Moreover, the quoted words of Rabbi Goldstein, “It is important to understand that what happens there, happens there, but here is our own world,” take away from the severity of the situation and from our moral responsibility, as citizens of the United States and human beings on this planet, to guard against the suffering of innocent populations. Let us not forget that our own president is at the forefront of the situation. Let us not forget that what happens here greatly affects what happens there, as the actions of the United States since 1948 have shaped the developments in the region.

In conclusion, when we get together to hope for peace, we cannot turn a blind eye to the source of the conflict, which is largely about occupation, independence and self-determination, rather than the way it is misrepresented in the mainstream American media. Even if Jerusalem were a desert, devoid of holy places, the conflict would be no less fateful. After all, this is an issue of basic human rights to property and liberty. Wasn’t it the great American Patrick Henry who said, “Is life so dear, or peace so sweet, as to be purchased at the price of chains and slavery? Forbid it, Almighty God! I know not what course others may take; but as for me, give me liberty or give me death!”?

— Nour Chammas and Lana Kreidie ...

Taking Political Correctness to the Extreme

With dire determination about the nature of his being, an annoyingly assertive red M&M once said, “I’m not plain, I am milk chocolate.” I would like to meet him, give him a big smack upside his candy-coated shell, roll my eyes and tell him, “Oh puhleese!” Forgive my insensitivity, but when M&M commercials dramatically echo the complaints of present-day stereotypes and respond to them with such creative euphemisms, I have but one question: Has society become too politically correct for its own good?

Believe me, I am as much for political correctness as the next person when it comes to issues regarding race, gender and politics. But do we really need to sugarcoat every other word in a conversation just to avoid sounding stereotypical or degrading?

Take the case of my high school friend Gina, who had recently found a job at a local nude bar. We had been talking about what she did for a living. “For the sake of political correctness, I’m not a topless-bar stripper,” she smiled and told me matter-of-factly. “I’m a nude entertainer, thank you very much!” I thought for a minute. “Um, no,” I said. “You’re a stripper.” I simply rolled my eyes and laughed. I sensed she was being politically correct for the sake of her ego. Somewhat disappointed, and embarrassed at the sheer bluntness of flat-out being called a “stripper,” she shrugged. “Stupid, I know,” she replied. “I’m a bit overly sensitive about that, huh?”

Stupid? No. Overly sensitive? Maybe. Anal? Unnecessarily. Too politically correct for her own good? Yes. In my opinion, it was fine for Gina to sugarcoat her not-so-smiled-upon profession to avoid criticism or speculation. I bet you will agree with me, though, when I say it gets old when people always want to be politically correct about every little thing. It is simply not necessary, because when people use politically correct terms incessantly to avoid sounding shady, they accomplish the very opposite.

Political correctness was born sometime in the 1980s as a device to curb public figures from speaking without inhibition and offending smaller factions of society. Although it started only as a way to hold public figures to a higher standard of professionalism, it soon became an unwritten, and sometimes written, law in most communities. Political correctness was meant to prevent people from being offended: to compel everyone to avoid words or behavior that might upset homosexuals, women, nonwhites, the crippled, the mentally impaired, the fat, the ugly or anyone else outside the desirable norm.

However, political correctness has risen to new heights in people’s consciences, to the point where the only things people are concerned about are the preservation of others’ feelings and egos and the risk of sounding too stereotypical. What people do not realize is that political correctness is an ideal; it should not be mandated by social norms. We should be able to speak as plainly as we want without worrying about how we sound, just so long as we are genuine and do not speak with fighting words. If we are judgmental and critical people to begin with, no politically correct term will be able to mask that.

Every word, whether describing a person, a group of people or a profession, seemingly has a politically correct counterpart. Although most of these terms are arguably needed to counter prevailing stereotypes, it is easy to get carried away.
Soon we are left with commercials that mock the behavior of society and animated candy pieces telling us what to call them — something that this vertically challenged, Asian-American, Greek-affiliated, academic athlete (or: short, sorority, nerdy Asian girl) will certainly not have. ...

Untraditional Ways of Celebrating Traditions

“Aww, I’m sorry,” was what my friend Joanna said when I mentioned that I would be spending Thanksgiving weekend by myself here in San Diego instead of going back home to Sacramento. I have done so the past three years, and I did it again this year. I usually have lunch or Thanksgiving dinner at a friend’s house and then recluse myself in my cell, affectionately called “my apartment,” staring at porn – I mean fantasy hockey statistics – on the Internet. This year was entirely different. My entire four-day weekend was spent by myself, up until my roommates’ return Sunday night. No lunch, no dinner. Was I lonely, like a lone coyote baying at the moon in the desert? Not at all. I thoroughly enjoyed the solitude and peace, a much needed change from my hectic life. I can actually say the weekend was productive: I got the rest I needed, caught up on my readings and just plain relaxed.

The second response I got was, “Don’t you miss your parents?” Well … no, not entirely. After all, I moved down to San Diego for a reason. All in all, this is perhaps the main reason I do not celebrate Thanksgiving the “traditional” way: I have no one to celebrate it with. Please, please, do not say “aww.” This is, in a way, of my own making. As mentioned earlier, I have friends in San Diego whom I could join for dinner, or I could go up to the City of Angels. But I decided to stay on my own.

“But what about the tradition of having Thanksgiving dinner with your friends and loved ones?” you must be asking yourself. If you knew me at all, you would understand that I’m not one for traditions. Thanksgiving (which I have started to refer to as Turkey Day) is a time for me to reflect on things. I admit, during my freshman year, staying alone in the dorms got a little boring. But, as the years progressed, I began to value the time I spent with the apartment all to myself. This holiday gave me the opportunity to sit back for a couple of days and reflect on the year: the good things, the bad things and the friends I hold dear. This year in particular, I was able to reconcile some issues and give thanks to some existential being for all the positive things and all the people I had the privilege of calling friends. Well, perhaps “existential being” is not the correct term, seeing as how I am not religious and do not believe in any god.

This leads me to another holiday I do not celebrate, at least not in the traditional sense: Christmas. Yes, that’s correct. Call me a heathen, call me a pagan, call me an infidel; I do not celebrate Christmas. Don’t get me wrong, I’ll be going back home for Christmas break … sorry, I meant winter break. I’ll go back to freezing Sacramento and see my family and friends. And no, I’m not like Charlie Brown: I do not get depressed at the coming of the holidays. While I agree with him that the season has become too market-driven, so has everything else in our society. I would find it odd and rather hypocritical if I did celebrate Christ’s day of birth. I’m not a Christian. I’m not a Catholic. I’m not an anything that celebrates Christmas. And I apologize to my Christian and Catholic friends; I mean no disrespect. This is simply what I believe.

Actually, one can argue that Christmas has, over the centuries, become more of a pagan ritual. Instead of giving gifts in the name of Christ and his glory, many people today give gifts because of some jovial old man named Saint Nick. His chubby cheeks and red-nosed reindeer are plastered everywhere. You name it, he’s probably on it.
We’ve raised an icon that sometimes seems to rival Christ himself. We’ve become what Matt Damon and Ben Affleck in “Dogma” call “idolaters.” Man, good thing God has calmed down over the years, or there would’ve been another great flood or destruction of Sodom and Gomorrah a long time ago.

So why do I not celebrate Christmas, even though it has become less religiously driven? Again, here comes my twisted sense of celebrating tradition. To me, Christmas merely marks the end of a year and the beginning of a new one. Rather than giving gifts, or even receiving them, I prefer to devote my time and energy to expressing to others what they mean to me and to wishing them the best of luck in the new year. I think of it as a transition period from one year to the next.

True, this will most likely change when I have a child of my own. She will probably want a Christmas tree and a new hockey stick, since children are generally naive. She’ll probably learn who Santa Claus and the Easter Bunny are before she understands who this guy named Jesus Christ was. Then maybe, once she understands this, she’ll come to the same conclusions I have. ...

Electoral Fuzziness

The never-ending controversy in Florida is enough to make a person even more cynical about today’s politics in the United States, if that is possible. Not since 1887, when Congress passed the Electoral Count Act to govern disputed electoral votes, has the fate of a new American president hinged for so long on a single state.

Many Americans are oblivious to how our country’s politics work. One would hope that by now, however, they have at least learned that they do not actually vote for their presidents. Their vote is far more indirect. In accordance with our country’s official form of electing presidents, the Electoral College system, Americans vote for other people to vote for their president. In other words, even if a candidate wins the popular vote of the people, he may still lose the election if he loses the electoral vote.

Sound stupid? Well, you are not the only one who thinks so. According to a recent article in Newsweek, many have described it as “a dinosaur that should be retired to a museum,” “an appendage to an anachronism” or “a train wreck waiting to happen.” Many prominent politicians are part of this list of Electoral College critics. For instance, New York Senator-elect Hillary Clinton and California Gov. Gray Davis, two politicians who appear to be the front-runners for the 2004 Democratic presidential nomination, have urged the eradication of the entire system.

Throughout America’s history, there have been attempts to reform the Electoral College or abolish it altogether. To date, more than 700 such attempts at reforming the system have been made, most recently in this year’s election. Presidents Franklin Roosevelt, Richard Nixon and Jimmy Carter have been among the would-be reformers.

Perhaps the biggest drawback of the Electoral College is the fact that a president can be elected even if he does not win the popular vote. Such a characteristic cheapens the notion of a so-called fair democracy, one in which the views of the people are supposedly reflected in the political process, especially one as significant as the election of their president. Ironically, an American president has been elected without winning a majority of the popular vote 15 times. It occurred twice in the case of Bill Clinton; he won only 43 percent and 49 percent of the popular vote in 1992 and 1996, respectively.

Put simply, the system is old. In fact, it’s so old that at one time, its primary objective was to give a political edge to slave owners, many of whom, not surprisingly, were members of the Electoral College. The Electoral College was born in a country far different from the one we live in today. Two hundred years ago, when the framers created it, they emphatically did not want a president dependent on the legislature. Consequently, they immediately rejected a model reminiscent of England’s parliamentary system, which allowed the legislature to pick its own leader as chief executive and prime minister. Travel in post-colonial America was very difficult, which made the transfer of information slow and tedious. Adding to this was the fact that no national parties existed, leading the framers to fear that many regional candidates would divide the vote and subsequently skew the election process. Proposals to select presidents by direct national popular election were shot down quickly and deemed impractical for a young nation so large and spread out.
Three reasons were given by the framers, according to the testimony of Yale professor Akhil Amar to the House Judiciary Committee in a 1997 hearing on Electoral College reform. First, very few candidates would have truly continental reputations among ordinary citizens. As a result, most Americans would not have enough information to intelligently choose a national figure for president. Second, a president elected by a national popular vote was viewed with much suspicion. Many founding fathers felt that a populist presidency risked attracting demagoguery and even dictatorship if one man could claim to be the voice of the American people. Third, the framers believed a national popular election would ruin the delicate balance of power among the states. For example, southern voices would count less in a direct national election because slaves were not allowed to vote. In addition, abuses in voting practices could result from a popular election. “A state could increase its clout by recklessly extending its franchise,” Amar said. “For example, if a state let women vote, it could double its weight in a direct national election.” The framers endorsed an Electoral College system because it gave a state a fixed number of electoral votes no matter how broad or narrow its franchise.

Amar claimed the system is painfully outdated. “I consider the so-called Electoral College a brilliant 18th-century device that cleverly solved a cluster of 18th-century problems,” Amar said. “As we approach the 21st century, we confront a different cluster of problems and our constitutional machinery of presidential selection does not look so brilliant.”

None of the reasons the framers had for defending the Electoral College system is relevant today. For one thing, almost all Americans are familiar with the candidates running for the presidency. Even if they are not, today’s advanced communications technology puts virtually all information on a candidate a click away. The mass media alone relay sufficient information about a candidate, even if that coverage is often biased. With the existence of different political parties, demagoguery or dictatorship is highly unlikely. Finally, the framers’ argument about abuses in state voting practices is thoroughly obsolete. Today, both African Americans and women are able to vote and are no longer selectively disenfranchised. States no longer play as big a part in deciding whether to give voters a direct voice in choosing electors, nor as significant a role in defining the electorate.

The Electoral College suffers from other faults as well. Over the years, it has tended to over-represent voters in rural states. For instance, in 1988, seven of the least populous jurisdictions in the United States, including the District of Columbia, combined to have the same number of electoral votes as Florida: 21. At the time, however, the combined population of those seven jurisdictions was only one-third the population of Florida. Yet another criticism of this antiquated system is that the electoral votes of each state are awarded solely on a winner-take-all basis, which leaves a third-party or independent candidate a pitifully slim chance of winning any electoral votes.

Defenders of the Electoral College system argue that to abolish it would be profoundly dangerous. They say that doing so would allow presidential candidates to direct their campaign attention solely to those areas with the largest populations.
With campaign conduct already under heavy scrutiny, many of these defenders feel the problems would only get worse were the current system revised. My retort to them is really quite simple: Times change, and America as a nation has always changed with them. Any complications that may arise from the abolition of the Electoral College should be anticipated and prepared for. After all, everyone knows that any significant social change necessitates an almost equal amount of adjustment. Such is evolution and the nature of transition. Let this year’s mess of an election be proof that America is in dire need of a change in its method of selecting presidents. America has adapted admirably to change before. I say it’s high time we did so once again.