If you’re anything like me, when you watch this, you’ll be forced to marvel at the absolute insanity of the whole thing.
Dr. Robert R. Redfield, MD, was a colonel in the US Army and has had a distinguished career as a physician, medical researcher, and public health expert. Among his primary areas of expertise are virology and immunology, and he has years of experience studying the treatment of infectious disease in clinical settings. He held a tenured professorship in the medical school of a highly respected American university, and has served in numerous advisory roles to a wide range of federal government agencies. All that was before Donald Trump appointed him to be the director of the Centers for Disease Control and Prevention and the administrator of the Agency for Toxic Substances and Disease Registry.
In that role, Dr. Redfield recently advised a Senate panel that even with an optimistic vaccine timeline, the general public would not be inoculated until the summer or fall of 2021. He also told them that masks could be a more effective protection against COVID-19 than the vaccine. If it wasn’t already clear from the paragraph above, this guy isn’t just a doctor. He’s a respected virologist, a prominent expert on the clinical treatment of infectious disease, and the head of a government agency — chock-full of infectious disease experts — whose whole purpose is to protect public health and safety through the control and prevention of disease. That’s the guy who provided clear, concise, and well-informed answers to a Senate panel on the anticipated availability of a vaccine and on the medical efficacy of face masks.
After Redfield’s statements got considerable media attention, Donald Trump told reporters, “I believe he was confused” and insisted a vaccine could be available in weeks and go “immediately” to the general public. And he claimed (with usual Trump confidence) that a “vaccine is much more effective than the masks.”
I get that Trump’s supporters believe he’s a great leader. I get that they’re prepared to overlook his moral failings, that they don’t see him as a liar because it appears to them that the folks relentlessly accusing him of lying have an obvious axe to grind. Sure… that sort of thinking is evidence of a hyper-partisan commitment to the MAGA movement and as such defies some rationality and logic… I’m just saying I get it.
But what I don’t get — what’s absolutely breathtaking to me — is the idea that anyone believes Donald Trump when he stands in front of reporters and says with a straight face that Dr. Robert R. Redfield, MD “was confused.” Even if you like him, this is a guy who has had trouble pronouncing the word ‘Yosemite’ and who has said (on tape!) that his preference is to present a positive, optimistic outlook in his public statements about the pandemic (and that’s a generous read of his comments to Woodward). That guy, who happens to be running for re-election and therefore has a clear interest in presenting himself as successfully leading national efforts to defeat the pandemic, is telling you to believe his own assessment instead of believing the “confused” official statements of the respected physician, virologist, and public health expert (that he appointed!).
I really do mean the word “breathtaking” — it literally takes my breath away when I try to wrap my head around the fact that anyone is inclined to believe Trump when they hear him say these things. Understanding how this guy is still a viable candidate for any public office either requires serious mental gymnastics or is cause for deep depression. (Or it’s both.)
Here’s my takeaway: There can’t be a better illustration of our country’s brokenness than watching John Berman and CNN be accused of unfairness, bias, and partisanship for pointing out that the director of the CDC, Dr. Robert R. Redfield, MD, is a reliable expert on vaccines and public health policy and that Donald J. Trump is not.
From yesterday’s Washington Post: “One space between each sentence, they said. Science just proved them wrong.”
Some reactions:
As has become commonplace, the headline is a little overzealous. The scientists behind the study probably wouldn’t use such strong language, and the rest of the article is a little more cautious in the language it uses when drawing conclusions from the research.
The researchers used a fixed-width/monospace typeface. To say that misses the point is an understatement. Even most of us one-space zealots admit that two spaces makes sense for monospace type.
One of the study’s authors says it’s still reasonable to infer from this that their results would also apply to proportional type, but her reasoning only makes sense if you don’t understand how fonts work, or the real reason one space makes more sense:
…the point of double-spacing is to make up for how monospace type looks weird and janky.
It’s about aesthetics.
The “benefits” of two spaces after a period were only observed in study participants who… wait for it… are people who usually type two spaces themselves. Maybe they didn’t actually learn anything about typography or font legibility, but rather about people being stuck in their own habits.
Major kudos to the Post article’s author, Avi Selk, and to whoever was responsible for formatting the online version. The piece uses a monospace font and all sorts of crazy spacing tricks to literally show instead of just tell. It’s thoughtful, creative, and very effective.
(And you gotta love the note at the end, which is — ironically enough — a nail in the coffin of the two-space argument: “Note: An earlier version of this story published incorrectly because, seriously, putting two spaces in the headline broke the web code.”)
Sorry, but the “science” doesn’t prove anything here. Lifehacker’s take on this is right on: “No, You Still Shouldn’t Put Two Spaces After a Period.”
A guy shot up a church in Texas today. Same shit, different day. Sigh.
If you’re pissed about the gun control debate (or lack thereof), sick of politicians who pray for the victims but don’t do anything to prevent it from happening again, disgusted by the polarization of public discourse in our country…
Stop posting about it on Facebook. You’re making it worse. We’re making it worse.
Because here’s the thing (and you know this): You’re typing into an echo chamber. No one who matters can hear you. Your heartfelt rant, your clever-yet-sad statements about politicians’ inability to act, or your tearful pleas about the tragedy of AR-15s… the only people reading them are people who already agree with you.
And it’s probably worse than that. The more we all post this kind of stuff, the better Facebook’s algorithm gets at making sure we don’t interact with anyone who disagrees. Every time we do this, we widen the chasm between red and blue, and we help foment the very things that are destroying America.
And while it’s easy to blame Facebook, let’s be honest with ourselves. We’re the ones killing our democracy. Because we eat this stuff up. We love having our own feelings validated, our opinions affirmed, our worldviews confirmed to be correct.
But if you want this madness to stop, if you want to actually do something about the evil madman in the White House, if you care about gun legislation and women’s right to make choices about their own bodies and an economy that doesn’t just serve the rich and people’s right to marry whomever they love… don’t post about it on Facebook. When we do, we’re not just wasting our breath. We’re making it all worse by digging ourselves deeper into our trenches. We’re giving Trump and the Russians and Fox News and InfoWars fertile ground to sow mistrust and disunity and polarization.
Instead, go out and talk to someone who doesn’t share your views. Write checks to candidates in contested districts, or volunteer your time to make phone calls for them. Go to a gun store and learn something about these things you want to ban but that so much of this country can’t stop buying. Run for office. Just whatever you do… stop fueling the echo chamber.
(As for me, from now on Facebook is for snarky comments about sports, adorable pictures of my kids, and giving tech advice to friends. No more politics. Because I can’t trust Facebook’s algorithms not to screw up our country even more and I refuse to be a part of it.)
I used to go to a lot of movies, mostly because I really like movie theatre popcorn. Since having kids, I get to two or three movies each year, at least in the theatre. I’m not sure what the last movie I saw in the theatre was, but it might have been Star Wars: The Force Awakens.
Though I loved that movie, I also felt like they laid the just-for-nostalgia pieces on a little too thick, and that there were entire scenes, characters, and even aspects of the narrative that seemed to serve no purpose other than to evoke fond memories of a time when Star Wars existed but Jar Jar Binks did not.
And in that case, I didn’t mind the nostalgia that much, for three reasons: First, it had been a while since we had seen the Millennium Falcon, Han, Leia, Chewy, and Luke. We wanted to see them and feel reassured that the franchise was back on track after the disaster of eps. 1–3. Second, The Force Awakens was the first of a trilogy, and as such I’m ok that it gave in to that nostalgic indulgence, because it felt like it was establishing itself — a new trilogy from a new director telling a new story — laying the groundwork so that the next films wouldn’t have to be so overtly referential/deferential. Third, the Star Wars films (by which I mean eps. 4–6) were never particularly subtle or mysterious in their allusions to each other, their references to classical motifs, or their willingness to use cheese and even camp. So I felt like Abrams’ over-indulgence in nostalgia was forgivable in a film that was so much about establishing the connection to the earlier series.
I almost felt the same way about Rogue One: A Star Wars Story. In fact, I felt even better about it — that the narrative and the filmmaking were referential and deferential while at the same time making much of the opportunity presented by being just a little outside of the main, (dare-I-say) sacred core storyline.
And then the last shot happened.
(If you haven’t seen it and you don’t want me to ruin the worst, most craptacular surprise ever… stop reading now.)
My problem is the cartoonish CGI/live-action mashup of a certain beloved character, and the fact it was visually dissonant (not to mention creepy-looking).
But my problem is also that it ruined the tone of an otherwise solid third act. The narrative arc had done its job: having sat on the edge of their seats for the entirety of the final battle, the audience watches in delight and horror as a cavalcade of heroes overcomes an almost-impossible series of circumstances in order to set in motion the well-established heroism of Episode 4 (and beyond), only to face the inevitability of death with the certainty that the sacrifice was worth it. That final act — right up to and including the anonymous heroism of Vader’s lightsaber victims passing the thumb drive (that’s what it was, right?) along to its obvious eventual recipient — draws on the operatic essence of Lucas’ originals, while at the same time inhabiting a darker, more realist sensibility in which even honorable death is horrible and sad. The film was succeeding. We were there. I hadn’t closed my mouth or sat back in my chair in 20 or 30 or however many minutes it was.
And here’s the thing: we all know what’s going to happen with those Death Star plans. We don’t need reminding. And in case we do (even though we don’t), Jimmy Smits announces that he’ll send someone he trusts on a mission to find a certain desert-dwelling Jedi in hiding — a female someone based on his pronoun choice — and since pretty much the only thing (relevant) we know about his character is that he’s Leia’s adopted father, we know exactly who someone is.
At that point, with the Death Star plans on a ship that we’ve seen before, about to be in the hands of the person we know will put them into a certain short and beepy droid, the film could have ended. We’d seen what we needed to see. Of course, overstating the obvious is a hallmark of this franchise, so the story had to go one scene further.
They still could have pulled this off without ruining the movie. They could have shown the back of a female character garbed in white, and we could have even heard her voice. The camera didn’t need to show her face. We didn’t even need entirely original dialogue — our about-to-be heroine could have been preparing (or even beginning to record) her message to Obi-Wan, taking us right up to the scene where we met her in 1977 (or whenever we were old enough to meet her for the first time).
But the scene we got instead was troubling on multiple levels.
Most obviously, the visual effect didn’t work. No matter how well Disney’s digital artists can pull off their CGI magic — and they deserve credit for all the ways they succeeded in this movie — there was no way we were going to feel good about an animated version of a character whom we first meet moments later (timeline-wise) as a flesh-and-blood actress rendered on actual film. I won’t belabor this point, as it has been well-trodden by reviewers, and because I suppose the degree to which the (semi-)animated character was (not) effective could be a matter of opinion. I mean, I suppose someone could have felt like it didn’t look horrendous. ((Of course, no one sane would think that. But I imagine it’s possible that some aesthetically misguided and/or very mentally ill person could.))
Second, I take issue with Leia being played by anyone other than Carrie Fisher. Sure… if they ever want to show her at an earlier stage in life then I’m ok with a young actress playing the part — people change over time and I could live with the idea that a well-cast young Leia grows up to become Carrie Fisher’s portrayal. But Rogue One’s Leia is not a younger Leia. This is the Princess Leia Organa of Alderaan we know, traveling on the very ship and wearing the very dress she was wearing when we first met her, and on the very mission where she hides the Death Star plans on R2 and stands up to (our first encounter with) Darth Vader. That Leia will always be played by Carrie Fisher, and I cannot accept any other portrayal of her, even one animated to bear an uncanny resemblance to Her Highness, daughter of Anakin Skywalker and future general of the Resistance (and future wearer of metallic swimwear).
Third, cartoon Leia’s dialogue is unforgivable. The film is, up to that point, about sacrifice. And in forcing us to watch its protagonists’ deaths one after another, it puts a fine point on it. Truly tyrannical evil cannot be defeated by self-interested individuals (like the cowardly rebel leaders who initially balk at the idea of going after the Death Star plans). Rather, the Dark Side’s vulnerability is only exposed — literally in this case — when people come together, setting aside their individual needs (up to and including their individual need for survival) in the interest of the greater good. Those heroes, we come to understand, may die as martyrs, but despite their demise as individuals, they collectively live forever in the legacy they share. This film, then, justifies its own existence as a documentation of that heroism, bearing witness to the actions of the previously-nameless souls who perished so that Luke, Han, Leia, and Chewy can save the day and get the credit.
So when cartoon Leia shows up and announces that the point of the whole thing is “hope,” it does a disservice to the immediately preceding narrative. I suppose one could say that hope is the point, and that Jyn’s pre-battle pep talk on that topic is a statement of the movie’s central message. But other than those two mentions, this film isn’t hopeful — though they successfully get the Death Star blueprints to the Rebels, our heroes all die — because this is the story of Rogue One’s ill-fated martyrs, not the story of Princess Leia, her secret twin, and their estranged biological father.
Indeed, Leia’s story (or, the one in which she is a principal character) is very much about “a new hope.” And it’s fine to connect the final moments of Rogue One to the opening scenes of Episode 4. But we didn’t need her pithy line (or her cartoon face) to draw the connection — it had been drawn already when we saw the guys in the familiar helmets on the familiar ship, and again when we saw her dress from behind.
Leia’s face and her stupid line ruin an otherwise great Star Wars film. It’s fun to watch, well-paced, and well-enough acted. In virtually every way, it is composed as a worthy entry in the Star Wars franchise. That’s most evident in the attention paid to the operatic score, the artistry of the sets and establishing shots of planetary landscapes, the sound effects of the battle scenes, and all those tiny details George Lucas trained us to notice (blue milk, pilot call signs, particulars of Rebel and Imperial ships, etc.). I didn’t even mind the CGI version of a character previously played by a now-deceased actor, or the unnecessary (and maybe poorly-timed) comic-relief cameo from a certain pair of familiar droids. Neither deviated from the narrative’s established direction, and the former example was less visually problematic than cartoon Leia because that character always had a certain dark cartoonish quality to him. ((The “uncanny valley” hypothesis — pointed out in Kelly Lawler’s critique in USA Today and Noah Berlatsky’s on qz.com — is right on with its suggestion that “human replicas that appear almost, but not exactly, like real human beings elicit feelings of eeriness and revulsion.” It works for the Grand Moff Tarkin character because he is eerie and revolting.)) So his now-computerized presence felt less arresting in comparison to Leia, whose familiar softness and “realness” felt missing from the abomination we found in her place.
Point is: great movie until the last 10 seconds. And maybe a great movie despite them. I continue to be excited for what’s in store next, both from the films that will open with scrolling text and from this series of tangential “stories.”
Gruber posted on Monday about how the non-traditional TV “networks” are whupping the likes of ABC, CBS, NBC, and Fox, at least when it comes to broadcasting high-quality, award-worthy content.
One can reasonably argue that the broadcast networks have always produced mostly garbage, but the real change is that the broadcast networks have completely missed the boat on the megamovie revolution — shows that “take television seriously as a medium”. That’s obviously true for dramas like Game of Thrones and Westworld, but I think it’s true for comedies, too. Consider the elimination of the laugh track.
He’s not wrong, except for the implication that the broadcast networks ever had a chance not to “miss the boat.” I’m not a TV-industry guy, and my understanding of that economic world is limited to having lived in Los Angeles for 30+ years, but it seems to me like the big three/four can’t be expected to compete with “networks” that play by different rules.
The broadcasting paradigm is based on a single foundation: revenue depends on ratings. That’s because revenue comes from advertising, and broadcasters’ ad rates are set by their ratings. The trick of the broadcast network’s economic model is to charge advertisers more for ad time than it cost to generate the ratings that command those rates.
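To make that concrete, here’s a back-of-the-napkin sketch of the model in Python. Every number in it is invented for illustration (real ad markets have upfronts, make-goods, and plenty of other complications), but the shape of the incentive is the point:

```python
# Toy model of broadcast-network economics. All figures are made up
# for illustration; they are not real ratings, ad rates, or budgets.

def nightly_profit(rating_points, dollars_per_point, ad_slots, production_cost):
    """Ad revenue scales with ratings; the production cost does not."""
    ad_revenue = rating_points * dollars_per_point * ad_slots
    return ad_revenue - production_cost

# A cheap, reliable procedural that pulls a big audience...
safe_bet = nightly_profit(rating_points=8, dollars_per_point=25_000,
                          ad_slots=16, production_cost=2_500_000)

# ...versus an expensive prestige drama that might win awards
# but might also post soft ratings.
risky_prestige = nightly_profit(rating_points=3, dollars_per_point=25_000,
                                ad_slots=16, production_cost=4_000_000)

print(safe_bet)        # 700000
print(risky_prestige)  # -2800000
```

Under those made-up numbers, the safe bet clears $700K a night while the prestige show loses millions unless the ratings (or the award-driven buzz) materialize. That asymmetry is the entire story of network risk aversion.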
That model results in decision-makers who are risk-averse. Why take a chance on a new product that has a decent chance of failure — even when it’s a new kind of content, or even a little change like laugh track elimination, that you actually believe in — when you have a sure-thing that’ll be good enough?
Furthermore, the broadcast networks’ revenue model tends to reward popular taste (and its cousin, low-brow proclivity) over critical quality. How many episodes of CSI and its seventeen spinoffs did CBS air? (I’ll tell you: way too many.) Were any of the CSI franchises ever considered by anyone to be high-quality drama worthy of critical acclaim? Nope. But they kicked ass in the ratings for a long time, so they made CBS a huge amount of money.
And that’s why networks only care about Golden Globes and Emmys if winning them generates buzz, higher ratings, and (therefore) higher ad revenue. (And maybe because actors/directors/producers like winning awards, and happy actors/directors/producers are theoretically good for networks, at least to a point.)
But that formula isn’t a guarantee. Plenty of critically acclaimed shows have been ratings duds. If NBC or ABC or CBS has a choice between an extra point-and-a-half in a key demographic and ten Golden Globe nominations, they’ll always pick the former. And that’s why they air the content they air, and they’ve conceded the trophies to Netflix, Amazon, and the cable networks that can (or at least hope they can) make money on high-brow.
None of this is news. This was the case back when HBO was raking in award hardware for The Sopranos. At the time, plenty of people let themselves believe that HBO was at an advantage because content creators could depict violence, nipples, and curse words. But their real advantage was always that they could afford to take a risk on serious episodic drama, which had the potential for a massively lucrative pay-off for them in the long term (in the form of subscribers who were hooked). The networks play a short game in which last night’s ratings matter right now. While this isn’t universal, and (I’ve been told by friends in the industry) it often isn’t quite so simple, this paradigm is still at the core of how the broadcast networks operate.
Gruber might be able to relate. He loves to tease the Android makers (and more so ignorant Wall Street folks) who go on and on about market share, and who bash Apple’s low performance in that particular metric. He’s observed all along (before anyone else really noticed) that Apple’s eyes aren’t on how many handsets or laptops they ship, but on how much money they make on the ones they do. (Because you can sell lots of phones if you don’t charge much for them. But that also means you won’t make much money.) So Apple is happy not to race to the bottom of the profit barrel in search of market share, because that market share doesn’t make them enough — or any — money.
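The arithmetic behind that parenthetical is worth spelling out. Here’s the same kind of toy sketch, with two hypothetical phone makers and numbers invented purely to illustrate the share-versus-profit trade-off (none of these are Apple’s, or anyone’s, real figures):

```python
# Units shipped vs. money made: two hypothetical phone vendors.
# All figures are invented to illustrate the point, not real market data.

def profit_millions(units_millions, avg_selling_price, profit_margin):
    """Profit in millions of dollars: units x price x margin."""
    return units_millions * avg_selling_price * profit_margin

volume_player = profit_millions(units_millions=100, avg_selling_price=200,
                                profit_margin=0.03)   # 600.0  -> $600M
premium_player = profit_millions(units_millions=25, avg_selling_price=800,
                                 profit_margin=0.25)  # 5000.0 -> $5,000M

print(volume_player, premium_player)
```

The volume player ships four times as many phones and wins the market-share headline; the premium player makes more than eight times the money. That’s why Apple is happy to cede share, and why the broadcast networks are happy to cede trophies: each is optimizing for the metric that actually pays.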
John, here’s the thing: the networks may have missed the boat on the latest and greatest trends in TV. But their execs don’t care, because their eyes aren’t on Golden Globe statuettes, but on how much money they’re making for their networks. And spending a ton of money to make a show that isn’t a definite weekly ratings winner isn’t a smart play for them.
(For the record: I hate this economic model because I like quality programming. And most of my favorite shows right now are on Netflix or Amazon. I’m just saying that if the network broadcasters “missed the boat,” they did so because they made a conscious decision to stay on dry land. Is that cowardly of them? Sure. Does it result in bland, boring network television? Generally, yes.)
Tablet’s editorial board says AIPAC fails to represent both the left and the right when it comes to advocating for Israel on American Jews’ behalf.
The invitation to Trump is a symbol of what AIPAC has become — an organization staffed by mid-level incompetents who disgrace our community with their evident lack of both political savvy and moral sense. Let’s be frank: Some of us would be comfortable with a bunch of back-alley political knife-fighters whose only cause is the active defense of the Jewish people, while others want leaders devoted to making sure that our communal goals embody universal morals and social-justice values—regardless of how this might play on the geopolitical chessboard. Whichever camp you find yourself in, one thing is clear: What we have now in AIPAC is an organization with the failings of both, and the virtues of neither.
Headless Community in Bottomless Spiral
This is a fascinating piece of political rhetoric. The Tablet editors are saying that both sides can agree AIPAC is a poor representative of the American Jewish community, and then make their case from each side.
If they are able to step away from the partisanship and actually offer cogent analytical insight into AIPAC’s failings on both the left and the right, then that’s admirable and useful. But the problem is that virtually no one (at least no one who is actively engaged in/with the Jewish community) is able to actually back away from the fracas and say anything that isn’t seen by one side or both as an unfair attack. In other words, I’m wondering if Tablet’s editorial team falls into the very trap into which they accuse AIPAC of falling: trying to be a voice for all sides and ending up being a voice for none.
Nonetheless, as an attempt to be analytical about AIPAC without staking ground (or, being transparent about your ideology but attempting to transcend it for the purpose of analysis), I think it’s a good try, and a thoughtful, intellectually deft, and interesting one at that.
But…
At the same time, despite some strong language attacking AIPAC leadership (which we’ll get to in a second), the authors seem to be dancing around the point they really want to make: that this is entirely about the organization’s leadership, or lack thereof. I think that’s a fair point to make, especially if you can support it with a well-reasoned argument. The problem with the Tablet editorial, though, is that its authors hint at having a well-reasoned argument to back up their claims, but it’s hard to believe them when (a) they don’t present much evidence of organizational chaos to support those claims ((By “evidence,” I mean thoughtfully-presented factual information that supports their claims, not, “AIPAC failed to stop the Iran deal… Can’t those screwups do anything right?”)), and when (b) they take numerous cheap shots and engage in petty ad hominem attacks ((Exhibit A: “…an organization staffed by mid-level incompetents who disgrace our community with their evident lack of both political savvy and moral sense.”)) on AIPAC leaders.
It should be fair game to claim that specific people lack political savvy or that they have exhibited behavior that calls their moral sense into question, especially if you support those claims in a manner that’s convincing or at least intellectually honest. But calling unnamed AIPAC employees “mid-level incompetents who disgrace” the community that they’ve dedicated themselves (with presumably the best of intentions) to serving? That Trump-esque diss, a petty and rhetorically lazy turn of phrase that must have felt cathartic and wonderfully naughty to type into the essay’s first draft, says more about its author than its subject. It undermines the editorial board’s entire point (as do the other cheap shots sprinkled throughout), and it should have been excised before an editor clicked “Publish.”
And also, it’s mean. I believe in the important practice of a publication’s editorial board writing with one voice, especially on important issues like this. But it comes off looking like cowardly bullying when an unnamed writer (writing on behalf of a seemingly faceless editorial team) attacks a group of individuals without naming names but with a nod and a wink that says, “We’re way too classy to name names but you know who we’re talking about, right?”
With all due deference to the folks behind the publication (for whom I hold an immense amount of respect and awe-filled admiration), Tablet’s typically erudite editors should be above that kind of shoddy writing, and as a publication that endeavors to elevate public discourse (instead of contributing to the absence of discourse down in the gutter on social media), it should be Tablet’s policy to steer clear of lashon hara.
Moreover, if the point is that the root of the problem is AIPAC’s staff, then the natural solution is for the membership (who the editorial claims to stand with/for/behind) to act to replace said “incompetent” staff, since it’s incumbent on a non-profit’s employees to advance the mission articulated by the organization’s membership. Of course, the editorial’s stance seems to be that the problem is with AIPAC on the whole, so the suggestion that the organization is fundamentally broken makes sense. But in that case the shots at staff are both irrelevant and misplaced, since it’s the membership who made/let it happen (and if AIPAC is broken on a fundamental level, the problems surely run deeper than some “mid-level incompetents”).
If, however, the organization’s members and mission are still worthy of support, then the solution is an easy one: Get rid of the staff who don’t get it and hire people who do. Otherwise, Tablet ought to be blaming the thousands of people who donate to AIPAC, show up at AIPAC events, and partner with AIPAC in their own communities.
main photo credit: Photo Cindy (Flickr)
Jeb.
Today was the GOP primary in South Carolina. Jeb Bush just dropped out of the race because he failed to receive the support of primary voters in three states whose electoral votes — combined! — make up 3.5% (19/538) of the electoral college.
(In other words, these states are basically irrelevant in the national election, yet somehow someone gave their most extreme voters — the ones who show up for the primaries — the power to sink a viable candidate’s chances of getting the nomination in favor of a guy who is demonstrably loony tunes.)
I’m by no means a fan of Jeb Bush, and a part of me wonders if it helps Dems’ chances in November if the Republicans end up letting extremist voters in small states nominate an openly racist candidate to the party’s ticket. But seriously… if this isn’t enough to give some legs to efforts to change the primary system, I don’t know what will.
Also, wondering: after the way Trump took every opportunity to publicly badmouth, embarrass, shame, and vilify him and his family, will Jeb even cast a ballot in November if Trump ends up being the candidate?
From “Don’t Send Your Kid to the Ivy League: The nation’s top colleges are turning our kids into zombies”:
What an indictment of the Ivy League and its peers: that colleges four levels down on the academic totem pole, enrolling students whose SAT scores are hundreds of points lower than theirs, deliver a better education, in the highest sense of the word.
I can’t speak for the Ivy League, but my fourth-tier liberal arts college did a pretty good job.
We’re really, really fucking this up.
But we can fix it, I swear. We just have to start telling each other the truth. Not the doublespeak bullshit of regulators and lobbyists, but the actual truth. Once we have the truth, we have the power — the power to demand better not only from our government, but from the companies that serve us as well. “This is a political fight,” says Craig Aaron, president of the advocacy group Free Press. “When the internet speaks with a unified voice politicians rip their hair out.”
We can do it. Let’s start.
Awesome way to start the new year. Shanah Tovah. (Taken with Instagram at 7‑Eleven)
No one can deny the popularity of the Farmer John pork-laden Dodger Dog, or its all-beef, but still non-kosher, alternative. A report from the National Hot Dog & Sausage Council, a project of the American Meat Institute, which provides data, research and recipes to food manufacturers and reporters, states that the Dodger Dog was the No. 1 best-selling Major League Baseball ballpark hot dog in 2011, and it is expected to be the fourth-highest-selling this year.
For the past several days, there’s been a lot of chatter on the interwebs about a suggestion (which seems to have really taken off with this HuffPost article by Rabbi Jason Miller) that people boycott (or at least put pressure on) Delta because “Delta will add Saudi Arabian Airlines to its SkyTeam Alliance of partnering companies and would require Delta to ban Jews and holders of Israeli passports from boarding flights to Saudi Arabia.” My colleagues on UPGRD.com, Matthew and Hunter, have offered thoughtful and thorough responses, as have podcast contributors Ben and Gary. Normally, I’d stay out of this to avoid the redundancy. But since I’m in the unique position of being an occasional UPGRD contributor and also someone who works professionally in the Jewish community, I felt like I should jump in. Below is the second of two posts on the topic, both of which are cross-posted on my UPGRD.com blog and on my personal blog.
This is amazing.
Hot Dogs
by Christopher Walken
Do you enjoy eating hot dogs? I hope you won’t be put off by my frankness when I tell you that I absolutely love them. In fact, I enjoy no food item more than a freshly-boiled hot dog. Now, I’ve done a lot of movies, and it’s true that I’ve worked with quite a few celebrities who did not share this opinion. I’m sorry to say that these people have always angered me.
There are two types of people in this world: those who eat hot dogs whenever it is possible to do so, and those who opt to do other things with their free time. Who do the latter think they are kidding? What pastime could be more rewarding than the consumption of hot dogs? I haven’t yet found one, and I don’t expect to in my lifetime. Unlike other foods, hot dogs can be eaten at any time, in any place, and it is not necessary to cook them. Now, I ask you: Why not eat hot dogs? They are delicious.
I carry a bag of hot dogs with me wherever I go. I eat them from the bag whenever I get the urge, regardless of the circumstances. When I make a movie, my hot dogs are my co-stars. If, in the middle of a scene, I decide I want to consume a hot dog, I do so. I waste the director’s time and thousands of dollars in film stock, but in the end, it is all worth it, because I enjoy eating hot dogs more than I enjoy acting. This bothers some people. I was supposed to portray Batman, but when Tim Burton learned of my hot dog cravings, he asked Michael Keaton to wear the cape. To this day, I am peeved about this.
When we filmed The Dead Zone, I ate over 800 hot dogs a day. It was necessary. My character needed to come across as intense as possible, and I found the inspiration for that intensity in my intense love for hot dogs. The director, David Cronenberg, said that he would never work with me again. I kept eating hot dogs when the cameras were rolling, and that seemed to bother him. I say fuck him. He doesn’t even like hot dogs.
I would like to end by emphasizing once again that I really like to eat hot dogs. If any of you people disagree, I loathe you. I despise you. Not only that, but I also despise all your loved ones. I want to see them torn to pieces by wild dogs. If I ever meet you in person, I’ll smash your brains in with a fucking bat. Then we’ll see who doesn’t like hot dogs.
Next week: My thoughts on Woody Allen, hot dog hater and shitty director.
Source: The Onion, sometime in the late ’90s, predating their current web archive.