Afghan interpreter for American military appears on Ellen Show

For five years, Naqeeb served alongside U.S. forces in Afghanistan. But like thousands of other interpreters who have helped Americans work with locals in war zones, he and his family were marked for death. Naqeeb hid from the Taliban under self-imposed house arrest for two years while he awaited a visa that would allow him to relocate to an American city. With Taliban forces tightening their noose on his home town, Naqeeb began thinking of suicide.

Today, thanks to former U.S. Army officer Matt Zeller and his colleagues at No One Left Behind, a charity that has helped over 1,300 Afghans and Iraqis resettle in the United States, Naqeeb is alive and well. He thanks Zeller for helping him start a new life in America, and another, more familiar face for saving his life when he was ready to end it.

More than 35,000 of Naqeeb’s fellow veteran interpreters remain in the crosshairs, struggling to navigate a complex immigration bureaucracy that makes it nearly impossible for them to gain asylum here even after they have risked their lives in support of American troops.

No One Left Behind helps these people and their families get to the U.S., and secure jobs, housing, language classes, and financial assistance. Zeller started the group after his own interpreter, Janis Shinwari, saved his life during a firefight in Afghanistan. When Shinwari found himself facing death threats from the Taliban years later, Zeller managed to pull strings and raise enough money to help his family resettle. He has worked tirelessly on behalf of interpreters and their families ever since.

See more on No One Left Behind in People Magazine’s American Heroes issue, here.

Keeping the candidates honest

After weeks at the top of the polls in spite of bizarre comments insulting veterans, Mexicans, women, and just about everyone else, it’s no coincidence that Donald Trump faltered after last week’s Republican presidential debate on Fox News.

It’s long since become cliché to blame the media for our angry and polarized politics. But in the run-up to the 2016 presidential elections, media outlets with political associations are publicly dispatching blowhard candidates much as brokered party conventions did behind closed doors in the years before primaries.

When Trump fired back at Fox moderator Megyn Kelly with some particularly misogynist fluff after she asked him hard questions, he found himself disinvited to the annual RedState Gathering, a conference popular with the conservative base and a vetting ground for Republican candidates. Worse, his charge that the media was treating him unfairly didn’t stick — precisely because Fox News has the conservative chops to go after Republicans and look good doing it. If MSNBC’s Rachel Maddow had moderated the first Republican debate instead of Kelly, Trump might still be the man of the hour.

Fox is the most trusted news source among conservatives, and its impact in the first debate makes a good case for having primary debates moderated by the stations and newscasters most closely associated with the candidates’ parties.

Even more important than Trump’s exchange with Kelly was that the moderators challenged Trump and the other candidates on issues of real, substantive policy — like Trump’s past support for a single-payer health care system. Liberal commentators couldn’t have done it without exposing themselves to charges of left-wing bias, but Fox newscasters have enough conservative credibility to keep candidates true to their prior convictions.

Whether we like it or not, the era of trusted, objective media referees like Ed Murrow and Walter Cronkite is over (to whatever extent it really existed). For now, at least, our media has returned to a long American tradition of meddling in political affairs, from Harper’s Weekly and the downfall of Boss Tweed to William Randolph Hearst and the Spanish-American war.

Our own Constitution, of course, was shaped by an intense partisan debate that played out in the press, courtesy of the newspapers like The Independent Journal and the Daily Advertiser, which published the Federalist Papers.

Could we even fathom The New York Times giving President Obama a biweekly op-ed column during the health care debate? In an earlier era, the Times wouldn’t have pretended to be objective and unbiased; it would have proudly staked its claim as the liberal outlet of the day, and tried to shape the debate accordingly.

As James Madison famously wrote in the Federalist Papers, “ambition must be made to counteract ambition.” The Founding Fathers understood that in a society as big, diverse, and noisy as ours, a multitude of partisan voices could prevent any one faction from dominating, and keep them all more honest.

Reading only news with which we already agree isn’t always good for democracy, but even for the most cloistered of us, the echo chamber ends as soon as we leave the house. Out in the world, almost everyone encounters people of different opinions — and even in our own families, we’ve all got that one uncle.

If the goal of the primary process is to produce the best each side has to offer, media outlets with political flavor can help voters make more informed choices about the candidates running, by giving them access to sources of opinion and criticism that they can trust. And as the presidential election gets underway, the business and political interests that motivate media outlets will ensure that the candidates are more thoroughly vetted than ever before.

It’s ugly, but it works.

America’s Organ Transplant Law Is Criminally Unfair to Donors

From his desk at the Department of Defense, Steve Lessin spent his life serving his country, developing radar systems to help save American servicemen’s lives. But when he needed an organ transplant to save his own life, the law denied it to him.

Lessin was diabetic but otherwise healthy and active—he was an avid skier and rock climber—until he developed kidney disease in his forties, landing him on the U.S. organ transplant waiting list. He was initially a good candidate for a new kidney, and in 1998, a close friend donated one to save his life, but it only lasted six years. At that point, he had no choice but dialysis, and after four years on it, he was too weak to undergo surgery.

Lessin’s fate was sealed 30 years ago this past Sunday, when the United States enacted a well-intentioned law that effectively condemned him to death.

As a result of the National Organ Transplant Act, more Americans have lost their lives waiting for an organ than died in World Wars I and II, Korea, Vietnam, Afghanistan, and Iraq combined. The law bans almost any non-medical payment to living organ donors, whether by the government, health insurance companies, or charities. Recipients themselves can reimburse donors’ travel, lodging, and lost wages, which helps—but only when the recipients have the means and will to do so.

The solution is not to create a market in organs, but to help living donors meet the considerable expenses they incur in saving others’ lives. Giving an organ costs an average of $5,000, but as the Journal of the American Society of Nephrology notes, can be as much as $20,000. According to the U.S. Census Bureau, 20 percent of American households have no discretionary funds at all, and only 8 percent can afford to spend $5,000 donating organs without dipping into their savings or going into debt.

In the coming weeks, the American Society of Transplant Surgeons, the American Renal Society, and the American Society for Transplantation will all release white papers arguing for studies on compensated organ donation. While there is no harm in studying the creation of incentives, the first step should be to get rid of the financial disincentives that currently keep thousands of living donors from being able to donate. One sensible proposal would be creating a debit card-based system not unlike the one the government employs for the victims of natural disasters, enabling government programs, private medical charities, and other people to cover donors’ expenses as they occur.

The situation could scarcely be more dire. In 1983, on the eve of the Transplant Act, there were 10,000 Americans waiting for organs. Today, the number is over 120,000, of whom 100,000 need kidneys. An additional 300,000 Americans are on dialysis, many of whom, like Lessin, might once have benefited from a transplant but are now too sick to qualify.

When we visited Lessin in Arlington, Virginia, in early 2009, his modest apartment smelled faintly of urine, and medical supplies were everywhere. Dialysis bags drained in the bathtub and 30-gallon trash cans sat full of tubing and other medical waste. A small second bedroom had become a warehouse of dextrose solution, saline, dialysis filters, tubing, clamps, sterile gauze and gloves, and more, all in boxes stacked five feet high.

Lessin’s supplies cost at least $50,000 per year. He had 24 feet of dialysis tubing, which allowed him freedom of movement—he preferred the privacy and flexibility of dialyzing at home. “I’m one of the lucky ones,” Lessin said. “I can get everywhere in my apartment except the front door. I have to interrupt my treatment or just not answer the door if someone comes by while I’m dialyzing. But that is a hell of a lot better than having to sit still for hours on end.”

Over time, all forms of dialysis weaken a patient’s heart, reducing circulation, and causing calcium buildups in the extremities. Lessin had undergone several surgeries, including an angioplasty that had saved the lower part of his leg. But by the time we met with him, he had lost several toes, and others were becoming necrotic, blackening and dying.

Tens of thousands of Americans like Lessin die every year, while the barriers the Transplant Act places before living donors live on.

The law was born 30 years ago of righteous revulsion at a proposal by Dr. Barry Jacobs that the government pay people to come to the U.S. to donate their kidneys. Jacobs wanted to start his own business marketing organs, and figured that the government could spend relatively little compensating these donors—maybe only $1,000 each—and then send them on their way.

Initially, Congress had simply been considering a low-profile law implementing a nationwide network to distribute cadaver organs, but when Jacobs suggested his idea in official testimony, it immediately got representatives’ attention. The prospect that the U.S. would ship thousands of impoverished people here from developing nations so that rich Americans could harvest their kidneys read like the plot of a science fiction novel.

Dr. Paul Terasaki, president of one of the three main American transplant societies, testified before Congress in 1983 that physicians “strongly condemn the recent scheme for commercial purchase of organs from living donors,” and that it is a “completely morally and ethically irresponsible proposal.” Congress reacted by adding a provision that banned all but very specific types of payment in relation to organ donation, and the rest is history.

Today, both Democrats and Republicans rightly fear that reforming the law too dramatically could encourage poor and at-risk populations to sell their organs in exchange for quick cash. But by banning almost all compensation to living donors, Congress has ensured that only the wealthiest and their friends can afford the costs of donating, while hundreds of thousands of other Americans suffer and die.

In his forties, Lessin was young for an end-stage renal disease patient. The average age of an American dialysis patient is 60, and most people get there with Type 2 diabetes or hypertension. But whatever the cause, as Baby Boomers age and the obesity epidemic worsens, the kidney shortage will only get worse.

Today, even if every American donated his organs at death, it wouldn’t make enough of a difference; a 2003 study in the New England Journal of Medicine found that only about 1 percent of people die under conditions that allow them to do so. The rest are simply too sick, too injured, or too far from a hospital when they die.

“I thought about buying a kidney on the black market,” Lessin said, “but I was worried my health insurance wouldn’t cover the transplant or my recovery if it was illegal to pay someone for a kidney abroad. I was worried that foreign facilities wouldn’t be safe, and I didn’t want to risk losing my job or landing in jail.”

In July 2009, a few days after we last spoke to him, in a sleepy suburban development almost within sight of the U.S. Capitol, Steve Lessin paid for his loyalty to the law with his life.

Why should donating an organ cost so much?

(CNN) — Thirty years ago, the United States enacted a law that has inadvertently condemned hundreds of thousands of Americans to death. As a result of the National Organ Transplant Act, more Americans have lost their lives waiting for an organ than died in both World Wars, Korea, Vietnam, Afghanistan and Iraq put together.

The law bans almost any payment to living organ donors. Recipients themselves can reimburse donors’ travel, lodging, and lost wages, which helps, when they have the money — but not when they don’t.

Most people who are living donors give to a family member or friend, but the financial hardships are considerable, and people are regularly denied permission to donate because they don’t have enough financial resources.

I was denied for that reason when I wanted to donate an organ to save a friend’s life. With four kids in college… [READ THE REST HERE]

Don’t panic

Before the Great Recession, there was the Great Depression. But before the Great Depression, there was the first Great Depression — the Panic of 1873. Like the 2008 crisis, the Panic featured all the high drama of a construction bubble, the collapse of a major industry, and European currency policy. (Unlike the 2008 crisis, the leading figures in the Panic sported fairly prominent facial hair.)

On September 18th, 1873, the prominent American bank Jay Cooke & Company filed for bankruptcy, the proximate cause of what, at 65 months, remains the longest contraction in US history. Cooke & Company had been a major underwriter of the Union effort during the Civil War. Two new technologies made that financing hugely scalable: the advent of paper money backed simply by the US government (rather than gold or silver), and the telegraph. Both were further abstractions of what used to be a very simple financial transaction: trading stuff for currency, in real life. Believe it or not, the United States had no federal paper currency until 1862.

After the war, banks had poured money into the growing railroad industry. Cooke & Company’s chief investment was the planned Northern Pacific Railway, to be the second transcontinental railroad, which they funded through bonds. But the Coinage Act of 1873, passed in February of that year, started the retirement of silver as a currency and shrank the nation’s supply of capital considerably, which raised interest rates. Almost overnight, Cooke & Company found itself with no one to finance its railroad debts, now even more expensive thanks to the higher rates, and declared itself broke.

Other investment houses and banks followed suit, and the New York stock market closed for 10 days. Wages plummeted, unemployment soared, real estate values sank, and the railroad industry just about collapsed. The political implications were just as important: Reconstruction was deemed a failure, monetary policy became a fixture of electoral politics for decades, and Republicans lost the South for an entire century.

Naturally, economic historians still debate the root causes of the Panic. Some blame Europe’s shift away from silver as a currency, which prompted the Coinage Act; others the rising bubble in railroad construction after the Civil War; and still others point to preceding crises like the Black Friday panic and the Great Chicago Fire.

But probably the biggest lesson for us moderns is perspective: While it’s easy to get depressed by the economic downturn of the last eight years, it pales in comparison to the Panic of 1873, which saw unemployment rise to 14%, a political backlash that championed protectionism for years to come, and federal troops sent across the states to subdue striking workers, killing scores of Americans.

Too big to be neutral

American non-interventionism has a long history, dating at least back to Federalist opposition to the War of 1812. But until the United States became a global superpower, George Washington’s farewell warning against “[entangling] our peace and prosperity in the toils of European ambition” was generally a hypothetical.

World War I made it a grim reality, and in the aftermath, isolationism became a political force to be reckoned with. On August 31st, 1935, Congress passed the Neutrality Act, the first in a series of arms and trade embargoes meant to keep Americans uninvolved with foreign nations at war.

President Roosevelt signed it, reluctantly, but soon found it useful: when Mussolini invaded Ethiopia, the Americans stayed on the sidelines, despite paying lip service to Ethiopian independence. 

Strategically, though, the Neutrality Acts were a mess. At first, they didn’t cover civil wars, so American firms easily made loans to General Franco’s Nationalist forces in Spain. But even once the policy was applied uniformly, after 1937, critics pointed out that neutrality might, in fact, tend to help the aggressors, who have likely already done more of the arming and fundraising that embargoes try to prevent. At the time, in January 1937, The Nation called the policy “Pro-Fascist Neutrality”:

“The next war is upon us, and it is almost certain that it will be a war of aggression precipitated by Hitler with or without the aid of Mussolini. Ranged against the fascist powers will be France, the Soviet Union, and almost as surely England. It happens that England and France are much more dependent on supplies from the United States than Germany. Any announcement by the United States that it will not under any circumstances furnish the belligerent countries with the sinews of war is an open invitation to Hitler to launch his attack.”

By 1941, with the Lend-Lease program, neutrality was all but dead, and the US was soon to enter WWII. But the Neutrality Act should’ve taught Congress a valuable lesson: every intervention should be judged on its own merits. Even when we’ve heeded George Washington’s warning and tried to pretend the rest of the world didn’t exist, the mere fact that we’ve got a big economy has made that impossible.

Memories of a Georgian August

As Ukraine braces for a full-scale Russian ground offensive, this month brings up painful memories for another former Soviet republic. The one known to most of us, the South Ossetia War of August 2008, killed hundreds, displaced over a hundred thousand, and established Russian dominance in the contested region. But only the Georgians themselves will likely mark today’s 90th anniversary of the August Uprising, which was bloodier, and set the stage for over a half century of Soviet oppression.

1924 was ripe for a crisis. Georgian Social Democrats had been fomenting opposition to the Red Army for three years, and rough economic conditions were making it hard for the occupying Soviet forces to maintain popular support. In 1922, hard-line Bolsheviks — led by Joseph Stalin, a Georgian himself — won an intra-party dispute over how to deal with the Georgians’ aspirations of self-rule. The result was a merger of Georgia with Armenia and Azerbaijan into the Transcaucasian SFSR, and the suppression of the opposition, whose remnants the Soviets forced underground.

On August 28th, 1924, these underground Menshevik guerrillas launched an uprising in the western Georgian town of Chiatura, and quickly placed several areas of the country under the control of a new interim government. But the rebels were no match for the artillery and aircraft of the Red Army, and they fell in a few weeks, negotiating a settlement to prevent further loss of life.

Anyone familiar with the Red Terror of the years prior could have guessed what would come next: mass arrests, and mass murder. Estimates put the number of executions above 10,000 over a matter of weeks, and the deportations well above that. And, of course, consolidation of Soviet control simply made way for decades of repression, purges, and forced collectivization.

The political rhetoric of the August Uprising sounds all too similar to that of Ukrainians today. Weaker states have long appealed to notions of “sovereignty” or self-rule to defend themselves against aggressive neighbors many times their size and strength. And, then as now, superpowers — whether Soviet or Russian — have used name-calling, propaganda, and promises of “liberation” to justify expansionist policies. Though the lyrics change, the song remains the same.

Doc Edgerton: Great American, Consummate Baller


I’ve long been fascinated by Harold Edgerton’s work, but it’s only in my latest explorations down the rabbit hole that I’ve learned just how many pioneering advances he made. In one way or another, his work made possible everything from fashion shoots to aerial reconnaissance.

It turns out that not only did Edgerton invent high-speed photography, he designed the modern camera flash as well. During World War II, he even designed a greatly enlarged variation on it which, flanked by 500-pound capacitors, sat in the bomb bay of a modified B-18, enabling armies for the first time to take overhead photos at night.

Imagine a flash so powerful it brought noonday sun to huge swaths of land in the dark. Some intelligence chiefs doubted that it would work from thousands of feet in the air, so, being a consummate baller, he offered proof of concept, taking his boys for a little night flight over Stonehenge and scaring the bejeezus out of some cows.

His point proved, Edgerton went on to provide valuable service on the eve of the Allied invasion of Europe, when, on the night of June 5, 1944, his photos showed no movement of German forces around Normandy, indicating that the D-Day landing would come as a surprise.

After the war, of course, he designed the Rapatronic camera, which used magnetic fields instead of mechanical action to create a shutter with no moving parts, enabling people to photograph nuclear explosions at rates of over 1,000,000 frames/second.

Because when you’re Harold Edgerton, and someone asks you to photograph a subject 100 times brighter than the sun from seven miles away, you don’t even break a sweat.

Beating up on the nerds

Joel Kotkin pens a provocative critique of the new wealthy of Silicon Valley, and their growing political aspirations.

It’s a characteristically good piece of writing, full of new reporting and specifics. Still, it’s a stretch to liken this lot of nerds to the industrialists at the turn of the 20th century. The Googlers’ private planes might keep people up at night; the Carnegiers’ private lake killed two thousand people.

The victims of the smartphone boom are real, and as Kotkin points out, Chengdu and Guangzhou aren’t represented in the U.S. Congress. But no one is about to stop using smartphones.

And while the techies’ endless tour of TV shows grows tedious, have they really succeeded in making many political changes at the expense of the public good?

Whatever their aspirations, neither the Googlers nor the Koch Brothers, George Soros, Chris Hughes, Sheldon Adelson, nor any of a dozen other favorite targets has managed to install a puppet regime in Washington.

So, a bunch of guys are throwing tasteless parties and trying to pay as little as possible in taxes. What else is new?