Archive for October 7, 2011

ANALYSIS-Sudan’s Bashir faces dire economic crisis

Posted: October 7, 2011 by PaanLuel Wël Media Ltd. in Junub Sudan

Source: Reuters

* North lost 75 pct of oil output when south seceded

* Fighting in southern border areas hits economy

* Food price inflation angers ordinary Sudanese

By Ulf Laessing

KHARTOUM, Oct 7 (Reuters) – Three months after the south seceded, Sudan’s economy is floundering, with rampant food inflation, lost oil revenue and costly military campaigns combining into a serious crisis for veteran President Omar Hassan al-Bashir.

Bashir has not faced a popular uprising like those that have deposed other Arab leaders this year, but ordinary people are fuming as prices of sorghum, a staple food, have doubled.

Khartoum lost most of its oil reserves when former civil war foe South Sudan became independent in July. The plunge in oil income, the mainstay of state coffers, has sent the Sudanese pound into free fall, driving up the cost of imports.

The north lost 75 percent of Sudan’s oil production of 500,000 barrels per day after South Sudan gained independence in July under a 2005 peace deal that ended decades of civil war.

Abda al-Mahdi, a former state minister of finance, said the economic crisis was very grave. "We’re suffering from inflation. Urgent measures are needed," he told Reuters.

Annual inflation hit 21 percent in August. On the black market, the pound is trading 60 percent below the official rate despite central bank dollar sales to bolster the local currency.

Central bank monthly reports give no figures for foreign reserves. The bank has said it sold $500 million in July alone.

Perhaps in desperation, the central bank governor asked his Arab colleagues in September to deposit $4 billion in the central bank and commercial lenders. None responded publicly.

Sudan is roiled by instability in the joint border area with the south. The army has battled rebels in South Kordofan for months. Fighting spread to nearby Blue Nile state last month.

Bashir, who seized power in 1989, has ruled out talks with insurgents, but the conflicts drain resources and stretch an army already fighting rebels in the western region of Darfur.


Sudanese households also feel the impact. Meat prices soared 41 percent in August because fighting disrupted transport links to the cattle markets of South Kordofan. The United Nations says grain harvests in the violence-hit states are now at risk.

"Blue Nile and South Kordofan are two of Sudan’s main sorghum-producing areas. The latest fighting coupled with erratic rainfall means next month’s harvest is expected to generally fail," the U.N. Food and Agriculture Organisation said this week. "The price of a 90 kg bag of sorghum, which cost 70 Sudanese pounds ($26) earlier this year, is now 140 pounds."

The Sudanese Consumer Protection Society, which staged a meat boycott for a few days last month, plans more protests against food inflation.

Analyst Ali Verjee at the Rift Valley Institute said the economic crisis was worsening, but was not yet as dire as during the hyperinflation of the 1990s, adding:

"As expected, the first quarter after secession has proved economically difficult for Khartoum. The depreciation of the currency and accelerating inflation is increasingly concerning."

The International Monetary Fund expects Sudan’s economy to shrink this year and next.

Experts have long urged Khartoum to prepare for the loss of southern oil, but the government has been in denial, blaming a U.S. trade embargo or insisting that all is under control.

After secession, Sudan’s parliament approved a budget based on unchanged oil revenue. But diplomats say southern oil sales — worth $2 billion until October — now go directly to Juba, while the small northern output mainly serves local consumption.

In September, the central bank governor said expenditures would have to be cut by more than 25 percent this year.

Officials hope gold exports will compensate, predicting an output of 74 tonnes in 2011, a target analysts say is out of reach — Sudan’s biggest mine produces just 2.3 tonnes a year.

Instead of pinning its hopes on gold, the government should focus on industry, agriculture and animal wealth, said Mohammed Siddiq, a Sudanese financial journalist.


Mahdi said things would get worse unless north and south agreed on sharing of oil revenues by the end of the month.

The landlocked south should pay transit fees for using northern oil export facilities, but has paid nothing yet, in the absence of an agreement, diplomats say.

More trouble looms in November when 40,000 Sudanese will head for the Muslim pilgrimage in Saudi Arabia, fuelling demand for dollars and piling more pressure on the pound.

Instead of devaluing to bridge the gap with black market rates, authorities threatened to punish moneychangers, which only stalled the dollar’s rise for a couple of days.

"They should have learned from previous crises that you won’t end the dollar scarcity by rounding up black market dealers," said a local economist.

Sudan hopes a conference in December sponsored by Norway and Turkey will drum up investors and help with debt relief — South Sudan refuses to shoulder any part of the $38 billion debt pile accumulated by Sudan when it was united.

But Western powers, irked with Khartoum for seizing the disputed border region of Abyei in May and for fighting in South Kordofan and Blue Nile, may be reluctant to help.

Nor are they well disposed towards Bashir, who has been indicted for war crimes by the International Criminal Court.

"I fear those conflicts in the border will hold back the West," said Chris Philips of the Economist Intelligence Unit. (Editing by Alistair Lyon)

Soap for South Sudan: Nonprofit donates 10,000 bars to ministry

Posted: October 7, 2011 by PaanLuel Wël Media Ltd. in Economy

The tiny bar of soap you use during your hotel stay may be helping people thousands of miles away — in the new nation of South Sudan.

Derreck Kayongo, founder of the Global Soap Project, stands next to a South Sudanese flag during a visit to Tennessee, where he announced the donation of 10,000 bars of soap to The Sudan Project. (Photo provided)

The Atlanta-based nonprofit Global Soap Project recently donated 10,000 bars of soap — made from recycled hotel soap — to The Sudan Project, a mission outreach of the Mt. Juliet Church of Christ in Tennessee.

For families living on the equivalent of $1 per day in South Sudan and other African nations, soap is a luxury they can’t afford.

But soap also is “a first line of defense” against child mortality around the world, says Derreck Kayongo, founder of Global Soap. Kayongo was in Tennessee recently to announce the donation of soap to The Sudan Project. The soap totals about 2,500 pounds and will be shipped by Healing Hands International, a Nashville, Tenn.-based shipping and relief mission supported by Churches of Christ. Sudanese Christians will distribute the bars.

“It seems only logical that those who are trying to cleanse the souls of South Sudanese should also be distributing a product that will help cleanse their bodies of potential illness,” said Don Humphrey, coordinator of The Sudan Project. The ministry is building a ministry training school and medical clinic in the village of Parajok, South Sudan. (See my report from Parajok from our August issue, part of our ongoing Global South series.)

Kayongo is a native of Uganda, an African nation just a few miles south of Parajok. He thought of the idea for Global Soap in the early 1990s, when he first arrived in the U.S. and stayed at a hotel in Philadelphia, according to CNN:

He noticed that his bathroom was replenished with new soap bars every day, even though they were only slightly used.

“I tried to return the new soap to the concierge since I thought they were charging me for it,” Kayongo said. “When I was told it was just hotel policy to provide new soap every day, I couldn’t believe it.”

Kayongo called his father — a former soap maker in Uganda — and shared the experience.

“My dad said people in America can afford to throw it away. But I just started to think, ‘What if we took some of this soap and recycled it, made brand new soap from it and then sent it home to people who couldn’t afford soap?’”

That’s exactly what Kayongo has done since 2009. Now the mission-minded entrepreneur is a top 10 finalist for CNN Heroes.

Next Saturday, Oct. 15, is Global Handwashing Day. The event was launched in 2008 to “foster and support a global culture of handwashing with soap, shine a spotlight on the state of handwashing in every country and raise awareness about the benefits of handwashing with soap.”

Here’s some more information from the Global Handwashing Day website:

Handwashing with soap is the most effective and inexpensive way to prevent diarrheal and acute respiratory infections, which take the lives of millions of children in developing countries every year. Together, they are responsible for the majority of all child deaths. Yet, despite its lifesaving potential, handwashing with soap is seldom practiced and difficult to promote.

Turning handwashing with soap before eating and after using the toilet into an ingrained habit could save more lives than any single vaccine or medical intervention, cutting deaths from diarrhea by almost half and deaths from acute respiratory infections by one-quarter. A vast change in handwashing behavior is critical to meeting the Millennium Development Goal of reducing deaths among children under the age of five by two-thirds by 2015.

Global Handwashing Day focuses on children because not only do they suffer disproportionately from diarrheal and respiratory diseases and deaths, but research shows that children – the segment of society so often the most energetic, enthusiastic, and open to new ideas – can also be powerful agents for changing behaviors like handwashing with soap in their communities.

Other Related News

South Sudan’s president visits Khartoum on Saturday
Sudan Tribune
October 6, 2011 (KHARTOUM) — President of the Republic of South Sudan Salva Kiir will be in Khartoum on Saturday for talks on post-independence arrangements and to ease growing tensions with the northern neighbour.
South Sudan: Nation Becomes ITU’s 193rd Member State
Geneva — The International Telecommunication Union (ITU) is proud to announce that the world’s newest country, South Sudan, has joined ITU to become the Union’s 193rd Member State, effective from 3 October 2011. The country, which gained its
South Sudan: “Use the Media Wisely”, Advises Information Minister
Dr Marial made this appeal yesterday in his office when he met leaders of the opposition party, the SPLM-DC, led by the Official Opposition Leader in the South Sudan Legislative Assembly Hon Onyoti Adigo Nyikwec. The minister reiterated the
South Sudan: Ministry of Health Finalizes Arrangements for Anti-Malaria Campaign
Juba — The Ministry of Health has finalized the arrangements for the delivery and distribution of anti-malaria drugs to all the ten states of South Sudan, the minister, Hon Dr Michael Milli Hussein, has announced. The minister made the announcement
Research and Markets: Sudan – Telecoms, Mobile and Broadband – 2011
Bradenton Herald
Following a referendum, oil-rich South Sudan became the world’s youngest independent state in 2011. Having been beyond the central government’s control and deprived of development, it is establishing its own independent telecommunications regime,

South Sudan slams Hamas PM’s remark, demands apology
Sudan Tribune
October 6, 2011 (KHARTOUM) – The Republic of South Sudan has strongly censured remarks in which Ismail Haniya, Prime Minister of the ousted Hamas’s government in Gaza strip, reportedly described the newly independent country as “a foundling state.”

Where Steve Jobs Ranks Among the Great Americans

Posted: October 7, 2011 by PaanLuel Wël Media Ltd. in Junub Sudan

By Rick Newman | US News

  • Rick Rycroft – A man uses his iPhone to photograph flowers and a photocopy image of Steve Jobs placed at the entrance of the Apple Store in Sydney, Thursday, Oct. 6, 2011. Steve Jobs, the Apple founder and former CEO who invented and masterfully marketed ever-sleeker gadgets that transformed everyday technology, from the personal computer to the iPod and iPhone, died Wednesday. (AP Photo/Rick Rycroft)

He was indisputably a titan of the digital era. But how does Steve Jobs stack up against the greatest business leaders in American history?

We won’t really know for years, of course, since nobody’s sure where technology will lead or what his company, Apple, may still achieve. But Steve Jobs was clearly a visionary who changed much about the way people use technology. His death from pancreatic cancer at just 56 feels like a national loss. And he’s one of the few people in any field who can plausibly be compared with America’s greatest innovators. So it doesn’t seem too early to try.

Here’s my methodology: Instead of measuring the amount of wealth created, I’m more interested in the impact that innovators have had on life in America, on how they improved living standards, advanced the nation’s competitiveness and created opportunity for others. By that measure, Steve Jobs, for all his accomplishments, is up against a pantheon of epic overachievers.

Back in the 1700s, Benjamin Franklin, the quintessential American, helped form the ethos of the middle class–which he called “the middling people, the farmers, shopkeepers and tradesmen”–while serving as the conscience of the upstart nation through his publications, crafty diplomacy and deft political touch. Alexander Hamilton–who like Jobs, died young, at the age of 49–helped create the financial system that turned the United States from a banana republic into a stable nation global investors would be comfortable doing business with.

During the Industrial Revolution, pioneers like John D. Rockefeller and Andrew Carnegie helped the United States become one of the world’s mightiest economies–one that overtook Europe in the production of vital new materials like oil and steel. By the late 1800s, Thomas Edison developed an electric-lighting system that literally turned darkness to light and ushered in sweeping second- and third-order changes, from the improvement of working conditions in factories everywhere to safer homes no longer lit by candles. Edison also found time to invent the phonograph, the movie projector, and many other things, including a key modification to Alexander Graham Bell’s telephone that remained part of the basic design until the 1980s.

In the 20th century, Henry Ford brought the gilded luxury of personal transportation to virtually everybody with his mass-produced cars, an innovation that shifted whole population centers from city to suburb. The Wright Brothers invented airplanes that would eventually move people from city to city–then from continent to continent–in hours, an order-of-magnitude change in the timeliness with which business could be conducted. Walt Disney invented new forms of entertainment for increasingly prosperous people with the newfound luxury of leisure time. Sam Walton, who founded Wal-Mart, brought everyday low prices to millions of shoppers. Ted Turner, who started CNN, broadcast his splashy cable news all day long, breaking the networks’ monopoly on news and spawning an on-air information revolution.

When Steve Jobs and Steve Wozniak co-founded Apple in 1976, they set to work building a line of computers–culminating in the Macintosh–that would be the most intuitive machines of their kind. In a way, they introduced the middling people to the magic of digital processors the way Henry Ford introduced them to cars. The brash young Jobs left Apple in 1985, after a spat with the board over the company’s direction. By his own later admission, he needed a strong dose of perspective.

Jobs did other things for 12 years, until returning to Apple as CEO in 1997. The company was floundering, after a string of misfires. Jobs straightened things out, then brought Apple to new heights with wonders like the iPod, iPhone and iPad, along with services like iTunes and Apple TV meant to complement the elegant devices. By the time Jobs retired as CEO earlier this year, Apple was more valuable than virtually any other technology company in the world, including Google, IBM and Microsoft.

Jobs’s death has touched Apple customers, and many others, in a heartfelt way that’s unusual for a business leader–especially today. Encomiums have flowed from practically everybody with a blog or Twitter account. “He was our Thomas Edison and our Henry Ford, all in one brief life,” wrote political commentator David Frum in his Twitter feed, summarizing the thoughts of many.

But was he? Edison and Ford devised innovations so profound they transformed whole societies and materially improved the lives of people who never even purchased a Ford or Edison product. Edison lit public places, while also providing electricity that helped heat them and power other machines. The automobiles that rolled off Ford’s assembly lines swept putrid piles of horse manure off of urban streets and made cities more liveable. Edison and Ford, like other historical giants, created progress that could be measured every day in the humblest of homes, while also laying the foundation for entirely new industries.

If you’re an Apple customer, chances are you feel that Steve Jobs has done something similar for you. Apple products are famous for their user-friendliness and their ability to enhance productivity, whether through third-party apps or ingenious features like the iMovie software that lets amateurs create videos with a professional look and feel. Perhaps more than anything, Apple customers simply enjoy using their products, which takes the drudgery out of scanning spreadsheets or speed-reading emails. Nobody really says that about a Blackberry or a Hewlett-Packard PC.

But many Apple products remain high-end indulgences for people with the money to spend on an enhanced digital experience. Yes, Steve Jobs has done the masses a service by showing his utilitarian competitors how to devise an artful user interface, which usually trickles down to cheaper generic devices once Apple has moved on to version 4 or 5. But Macs and iPhones and iPads remain too pricey for many mainstream consumers, who might read about the wonders of Apple gizmos the way they read about luxury cars or fancy dinners: Sounds nice, and I hope I can afford one some day. Meanwhile, you’d have to stretch to define a way in which Steve Jobs has materially improved society, enhanced public life or broadly shared his gifts with people who can’t afford to be his customers. (Cue the outrage of Apple Nation.)

Jobs was truly a brilliant designer, marketer and technologist–all in one. But it’s worth keeping in mind that the digital revolution would have carried on without him. Robert Noyce and Gordon Moore, the founders of Intel, invented much of the circuitry that powered Jobs’s devices over the years, along with many other computing machines. Bill Gates developed software that has powered far more computers than Apple ever built. Larry Page and Sergey Brin, the co-founders of Google, have provided an Internet search service that’s arguably more useful to more people–for free–than anything Apple has rolled out. Jobs helped make the first 30 years of the mass-computing era colorful and even fun. But it didn’t take him to make it possible.

He did accomplish something, however, that’s rare in the annals of business history: He made consumers fall in love with his ideas and his products, and even with him. Jobs wasn’t a particularly likeable guy, by most accounts. He had a prickly demeanor and an I-know-better arrogance that would have been the downfall of a lesser visionary. Yet he leaves behind a vast army of Apple acolytes who may propel his ideas to heights beyond Jobs’s own reach. In the firmament of business giants, Steve Jobs shines medium-hot, like the sun. But like a few other geniuses who die too young, his star may get brighter the longer he is gone.

Twitter: @rickjnewman


Apple’s Visionary Redefined Digital Age

Published: October 5, 2011

Steven P. Jobs, the visionary co-founder of Apple who helped usher in the era of personal computers and then led a cultural transformation in the way music, movies and mobile communications were experienced in the digital age, died Wednesday. He was 56.

The death was announced by Apple, the company Mr. Jobs and his high school friend Stephen Wozniak started in 1976 in a suburban California garage. A friend of the family said the cause was complications of pancreatic cancer.

Mr. Jobs had waged a long and public struggle with the disease, remaining the face of the company even as he underwent treatment, introducing new products for a global market in his trademark blue jeans even as he grew gaunt and frail.

He underwent surgery in 2004, received a liver transplant in 2009 and took three medical leaves of absence as Apple’s chief executive before stepping down in August and turning over the helm to Timothy D. Cook, the chief operating officer. When he left, he was still engaged in the company’s affairs, negotiating with another Silicon Valley executive only weeks earlier.

“I have always said that if there ever came a day when I could no longer meet my duties and expectations as Apple’s C.E.O., I would be the first to let you know,” Mr. Jobs said in a letter released by the company. “Unfortunately, that day has come.”

By then, having mastered digital technology and capitalized on his intuitive marketing sense, Mr. Jobs had largely come to define the personal computer industry and an array of digital consumer and entertainment businesses centered on the Internet. He had also become a very rich man, worth an estimated $8.3 billion.

Tributes to Mr. Jobs flowed quickly on Wednesday evening, in formal statements and in the flow of social networks, with President Obama, technology industry leaders and legions of Apple fans weighing in.

“For those of us lucky enough to get to work with Steve, it’s been an insanely great honor,” said Bill Gates, the Microsoft co-founder. “I will miss Steve immensely.”

A Twitter user named Matt Galligan wrote: “R.I.P. Steve Jobs. You touched an ugly world of technology and made it beautiful.”

Eight years after founding Apple, Mr. Jobs led the team that designed the Macintosh computer, a breakthrough in making personal computers easier to use. After a 12-year separation from the company, prompted by a bitter falling-out with his chief executive, John Sculley, he returned in 1997 to oversee the creation of one innovative digital device after another — the iPod, the iPhone and the iPad. These transformed not only product categories like music players and cellphones but also entire industries, like music and mobile communications.

During his years outside Apple, he bought a tiny computer graphics spinoff from the director George Lucas and built a team of computer scientists, artists and animators that became Pixar Animation Studios.

Starting with “Toy Story” in 1995, Pixar produced a string of hit movies, won several Academy Awards for artistic and technological excellence, and made the full-length computer-animated film a mainstream art form enjoyed by children and adults worldwide.

Mr. Jobs was neither a hardware engineer nor a software programmer, nor did he think of himself as a manager. He considered himself a technology leader, choosing the best people possible, encouraging and prodding them, and making the final call on product design.

It was an executive style that had evolved. In his early years at Apple, his meddling in tiny details maddened colleagues, and his criticism could be caustic and even humiliating. But he grew to elicit extraordinary loyalty.

“He was the most passionate leader one could hope for, a motivating force without parallel,” wrote Steven Levy, author of the 1994 book “Insanely Great,” which chronicles the creation of the Mac. “Tom Sawyer could have picked up tricks from Steve Jobs.”

“Toy Story,” for example, took four years to make while Pixar struggled, yet Mr. Jobs never let up on his colleagues. “You need a lot more than vision — you need a stubbornness, tenacity, belief and patience to stay the course,” said Edwin Catmull, a computer scientist and a co-founder of Pixar. “In Steve’s case, he pushes right to the edge, to try to make the next big step forward.”

Mr. Jobs was the ultimate arbiter of Apple products, and his standards were exacting. Over the course of a year he tossed out two iPhone prototypes, for example, before approving the third, and began shipping it in June 2007.

To his understanding of technology he brought an immersion in popular culture. In his 20s, he dated Joan Baez; Ella Fitzgerald sang at his 30th birthday party. His worldview was shaped by the ’60s counterculture in the San Francisco Bay Area, where he had grown up, the adopted son of a Silicon Valley machinist. When he graduated from high school in Cupertino in 1972, he said, “the very strong scent of the 1960s was still there.”

After dropping out of Reed College, a stronghold of liberal thought in Portland, Ore., in 1972, Mr. Jobs led a countercultural lifestyle himself. He told a reporter that taking LSD was one of the two or three most important things he had done in his life. He said there were things about him that people who had not tried psychedelics — even people who knew him well, including his wife — could never understand.

Decades later he flew around the world in his own corporate jet, but he maintained emotional ties to the period in which he grew up. He often felt like an outsider in the corporate world, he said. When discussing Silicon Valley’s lasting contributions to humanity, he mentioned in the same breath the invention of the microchip and “The Whole Earth Catalog,” a 1960s counterculture publication.

Apple’s very name reflected his unconventionality. In an era when engineers and hobbyists tended to describe their machines with model numbers, he chose the name of a fruit, supposedly because of his dietary habits at the time.

Coming on the scene just as computing began to move beyond the walls of research laboratories and corporations in the 1970s, Mr. Jobs saw that computing was becoming personal — that it could do more than crunch numbers and solve scientific and business problems — and that it could even be a force for social and economic change. And at a time when hobbyist computers were boxy wooden affairs with metal chassis, he designed the Apple II as a sleek, low-slung plastic package intended for the den or the kitchen. He was offering not just products but a digital lifestyle.

He put much stock in the notion of “taste,” a word he used frequently. It was a sensibility that shone in products that looked like works of art and delighted users. Great products, he said, were a triumph of taste, of “trying to expose yourself to the best things humans have done and then trying to bring those things into what you are doing.”

Regis McKenna, a longtime Silicon Valley marketing executive to whom Mr. Jobs turned in the late 1970s to help shape the Apple brand, said Mr. Jobs’s genius lay in his ability to simplify complex, highly engineered products, “to strip away the excess layers of business, design and innovation until only the simple, elegant reality remained.”

Mr. Jobs’s own research and intuition, not focus groups, were his guide. When asked what market research went into the iPad, Mr. Jobs replied: “None. It’s not the consumers’ job to know what they want.”

Early Interests

Steven Paul Jobs was born in San Francisco on Feb. 24, 1955, and surrendered for adoption by his biological parents, Joanne Carole Schieble and Abdulfattah Jandali, a graduate student from Syria who became a political science professor. He was adopted by Paul and Clara Jobs.

The elder Mr. Jobs, who worked in finance and real estate before returning to his original trade as a machinist, moved his family down the San Francisco Peninsula to Mountain View and then to Los Altos in the 1960s.

Mr. Jobs developed an early interest in electronics. He was mentored by a neighbor, an electronics hobbyist, who built Heathkit do-it-yourself electronics projects. He was brash from an early age. As an eighth grader, after discovering that a crucial part was missing from a frequency counter he was assembling, he telephoned William Hewlett, the co-founder of Hewlett-Packard. Mr. Hewlett spoke with the boy for 20 minutes, prepared a bag of parts for him to pick up and offered him a job as a summer intern.

Mr. Jobs met Mr. Wozniak while attending Homestead High School in neighboring Cupertino. The two took an introductory electronics class there.

The spark that ignited their partnership was provided by Mr. Wozniak’s mother. Mr. Wozniak had graduated from high school and enrolled at the University of California, Berkeley, when she sent him an article from the October 1971 issue of Esquire magazine. The article, “Secrets of the Little Blue Box,” by Ron Rosenbaum, detailed an underground hobbyist culture of young men known as phone phreaks who were illicitly exploring the nation’s phone system.

Mr. Wozniak shared the article with Mr. Jobs, and the two set out to track down an elusive figure identified in the article as Captain Crunch. The man had taken the name from his discovery that a whistle that came in boxes of Cap’n Crunch cereal was tuned to a frequency that made it possible to make free long-distance calls simply by blowing the whistle next to a phone handset.

Captain Crunch was John Draper, a former Air Force electronic technician, and finding him took several weeks. Learning that the two young hobbyists were searching for him, Mr. Draper had arranged to come to Mr. Wozniak’s Berkeley dormitory room. Mr. Jobs, who was still in high school, had traveled to Berkeley for the meeting. When Mr. Draper arrived, he entered the room saying simply, “It is I!”

Based on information they gleaned from Mr. Draper, Mr. Wozniak and Mr. Jobs later collaborated on building and selling blue boxes, devices that were widely used for making free — and illegal — phone calls. They raised a total of $6,000 from the effort.

After enrolling at Reed College in 1972, Mr. Jobs left after one semester, but remained in Portland for another 18 months auditing classes. In a commencement address given at Stanford in 2005, he said he had decided to leave college because it was consuming all of his parents’ savings.

Leaving school, however, also freed his curiosity to follow his interests. “I didn’t have a dorm room,” he said in his Stanford speech, “so I slept on the floor in friends’ rooms, I returned Coke bottles for the 5-cent deposits to buy food with, and I would walk the seven miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on.”

He returned to Silicon Valley in 1974 and took a job there as a technician at Atari, the video game manufacturer. Still searching for his calling, he left after several months and traveled to India with a college friend, Daniel Kottke, who would later become an early Apple employee. Mr. Jobs returned to Atari that fall. In 1975, he and Mr. Wozniak, then working as an engineer at H.P., began attending meetings of the Homebrew Computer Club, a hobbyist group that met at the Stanford Linear Accelerator Center in Menlo Park, Calif. Personal computing had been pioneered at research laboratories adjacent to Stanford, and it was spreading to the outside world.

“What I remember is how intense he looked,” said Lee Felsenstein, a computer designer who was a Homebrew member. “He was everywhere, and he seemed to be trying to hear everything people had to say.”

Mr. Wozniak designed the original Apple I computer simply to show it off to his friends at the Homebrew. It was Mr. Jobs who had the inspiration that it could be a commercial product.

In early 1976, he and Mr. Wozniak, using their own money, began Apple with an initial investment of $1,300; they later gained the backing of a former Intel executive, A. C. Markkula, who lent them $250,000. Mr. Wozniak would be the technical half and Mr. Jobs the marketing half of the partnership behind the original Apple I computer. Starting out in the Jobs family garage in Los Altos, they moved the company to a small office in Cupertino shortly thereafter.

In April 1977, Mr. Jobs and Mr. Wozniak introduced the Apple II at the West Coast Computer Faire in San Francisco. It created a sensation. Faced with a gaggle of small and large competitors in the emerging computer market, Apple, with its Apple II, had figured out a way to straddle the business and consumer markets by building a computer that could be customized for specific applications.

Sales skyrocketed, from $2 million in 1977 to $600 million in 1981, the year the company went public. By 1983 Apple was in the Fortune 500. No company had ever joined the list so quickly.

The Apple III, introduced in May 1980, was intended to dominate the desktop computer market. I.B.M. would not introduce its original personal computer until 1981. But the Apple III had a host of technical problems, and Mr. Jobs shifted his focus to a new and ultimately short-lived project, an office workstation computer code-named Lisa.

An Apocalyptic Moment

By then Mr. Jobs had made his much-chronicled 1979 visit to Xerox’s research center in Palo Alto, where he saw the Alto, an experimental personal computer system that foreshadowed modern desktop computing. The Alto, controlled by a mouse pointing device, was one of the first computers to employ a graphical video display, which presented the user with a view of documents and programs, adopting the metaphor of an office desktop.

“It was one of those sort of apocalyptic moments,” Mr. Jobs said of his visit in a 1995 oral history interview for the Smithsonian Institution. “I remember within 10 minutes of seeing the graphical user interface stuff, just knowing that every computer would work this way someday. It was so obvious once you saw it. It didn’t require tremendous intellect. It was so clear.”

In 1981 he joined a small group of Apple engineers pursuing a separate project, a lower-cost system code-named Macintosh. The machine was introduced in January 1984 and trumpeted during the Super Bowl telecast by a 60-second commercial, directed by Ridley Scott, that linked I.B.M., then the dominant PC maker, with Orwell’s Big Brother.

A year earlier Mr. Jobs had lured Mr. Sculley to Apple to be its chief executive. A former Pepsi-Cola chief executive, Mr. Sculley was impressed by Mr. Jobs’s pitch: “Do you want to spend the rest of your life selling sugared water, or do you want a chance to change the world?”

He went on to help Mr. Jobs introduce a number of new computer models, including an advanced version of the Apple II and later the Lisa and Macintosh desktop computers. Through them Mr. Jobs popularized the graphical user interface, which, based on a mouse pointing device, would become the standard way to control computers.

But when the Lisa failed commercially and early Macintosh sales proved disappointing, the two men became estranged and a power struggle ensued. The board ultimately stripped Mr. Jobs of his operational role, taking control of the Lisa project away from him, and 1,200 Apple employees were laid off. He left Apple in 1985.

“I don’t wear the right kind of pants to run this company,” he told a small gathering of Apple employees before he left, according to a member of the original Macintosh development team. He was barefoot as he spoke, and wearing blue jeans.

That September he announced a new venture, NeXT Inc. The aim was to build a workstation computer for the higher-education market. The next year, the Texas industrialist H. Ross Perot invested $20 million in the effort. But it did not achieve Mr. Jobs’s goals.

Mr. Jobs also established a personal philanthropic foundation after leaving Apple but soon had a change of heart, deciding instead to spend much of his fortune — $10 million — on acquiring Pixar, a struggling graphics supercomputing company owned by the filmmaker George Lucas.

The purchase was a significant gamble; there was little market at the time for computer-animated movies. But that changed in 1995, when the company, with Walt Disney Pictures, released “Toy Story.” That film’s box-office receipts ultimately reached $362 million, and when Pixar went public in a record-breaking offering, Mr. Jobs emerged a billionaire. In 2006, the Walt Disney Company agreed to purchase Pixar for $7.4 billion. The sale made Mr. Jobs Disney’s largest single shareholder, with about 7 percent of the company’s stock.

His personal life also became more public. He had a number of well-publicized romantic relationships, including one with the folk singer Joan Baez, before marrying Laurene Powell. In 1996, his sister Mona Simpson, a novelist, threw a spotlight on her relationship with Mr. Jobs in the novel “A Regular Guy.” The two did not meet until they were adults. The novel centered on a Silicon Valley entrepreneur who bore a close resemblance to Mr. Jobs. It was not an entirely flattering portrait. Mr. Jobs said about a quarter of it was accurate.

“We’re family,” he said of Ms. Simpson in an interview with The New York Times Magazine. “She’s one of my best friends in the world. I call her and talk to her every couple of days.”

His wife and Ms. Simpson survive him, as do his three children with Ms. Powell, his daughters Eve Jobs and Erin Sienna Jobs and a son, Reed; another daughter, Lisa Brennan-Jobs, from a relationship with Chrisann Brennan; and another sister, Patti Jobs.

Return to Apple

Eventually, Mr. Jobs refocused NeXT from the education to the business market and dropped the hardware part of the company, deciding to sell just an operating system. Although NeXT never became a significant computer industry player, it had a huge impact: a young programmer, Tim Berners-Lee, used a NeXT machine to develop the first version of the World Wide Web at the Swiss physics research center CERN in 1990.

In 1996, after unsuccessful efforts to develop next-generation operating systems, Apple, with Gilbert Amelio now in command, acquired NeXT for $430 million. The next year, Mr. Jobs returned to Apple as an adviser. He became chief executive again in 2000.

Shortly after returning, Mr. Jobs publicly ended Apple’s long feud with its archrival Microsoft, which agreed to continue developing its Office software for the Macintosh and invested $150 million in Apple.

Once in control of Apple again, Mr. Jobs set out to reshape the consumer electronics industry. He pushed the company into the digital music business, introducing first iTunes and then the iPod MP3 player. The music arm grew rapidly, reaching almost 50 percent of the company’s revenue by June 2008.

In 2005, Mr. Jobs announced that he would end Apple’s business relationship with I.B.M. and Motorola and build Macintosh computers based on Intel microprocessors.

His fight with cancer was now publicly known. Apple had announced in 2004 that Mr. Jobs had a rare but curable form of pancreatic cancer and that he had undergone successful surgery. Four years later, questions about his health returned when he appeared at a company event looking gaunt. Afterward, he said he had suffered from a “common bug.” Privately, he said his cancer surgery had created digestive problems but insisted they were not life-threatening.

Apple began selling the iPhone in June 2007. Mr. Jobs’s goal was to sell 10 million of the handsets in 2008, equivalent to 1 percent of the global cellphone market. The company sold 11.6 million.

Although smartphones were already commonplace, the iPhone dispensed with a stylus and pioneered a touch-screen interface that quickly set the standard for the mobile computing market. Rolled out with much anticipation and fanfare, the iPhone rocketed to popularity; by the end of 2010 the company had sold almost 90 million units.

Although Mr. Jobs took just a nominal $1 salary when he returned to Apple, his compensation became the source of a Silicon Valley scandal in 2006 over the backdating of millions of shares of stock options. But after a company investigation and one by the Securities and Exchange Commission, he was found not to have benefited financially from the backdating and no charges were brought.

The episode did little to taint Mr. Jobs’s standing in the business and technology world. As the gravity of his illness became known, and particularly after he announced he was stepping down, he was increasingly hailed for his genius and true achievement: his ability to blend product design and business market innovation by integrating consumer-oriented software, microelectronic components, industrial design and new business strategies in a way that has not been matched.

If he had a motto, it may have come from “The Whole Earth Catalog,” which he said had deeply influenced him as a young man. The book, he said in his commencement address at Stanford in 2005, ends with the admonition “Stay Hungry. Stay Foolish.”

“I have always wished that for myself,” he said.


South Sudan becomes latest member of UN postal union

Posted: October 7, 2011 by PaanLuel Wël Media Ltd. in Junub Sudan

6 October 2011 – South Sudan has become the newest member of the United Nations Universal Postal Union (UPU), the agency announced today.

The country, which gained independence on 9 July, became the 192nd Member State to join the organization, which regulates international mail exchanges and makes recommendations to stimulate growth in mail, parcel and financial services.

“The UPU is pleased to welcome South Sudan as a member of the global postal family,” said its Director General, Edouard Dayan.

“The postal network is an important infrastructure that helps respond to inhabitants’ communication needs as well as a country’s socioeconomic development. We will be delighted to work with government and postal officials to help them develop their national postal network and provide technical and regulatory advice,” he said.

The membership became effective on Tuesday following an official request by the Government of South Sudan noting its membership of the UN, which is a prerequisite to join the UPU.

U.K. Pledges $31 Million to Help Wipe Out Guinea Worm Disease

Posted: October 7, 2011 by PaanLuel Wël Media Ltd. in Science

By Betsy McKay

The British government has pledged about $31 million to help eradicate guinea worm disease, a donation that public-health experts say will bring them close to finishing the job.

A quarter century ago, the crippling parasitic infection afflicted 3.5 million people a year in more than 20 countries. This year, there are expected to be just over 1,000 cases in four African countries. More than 98% of those cases are in South Sudan, with a few dozen in Ethiopia, Mali, and Chad.

Guinea worm disease is passed along when people drink water from sources containing water fleas that harbor guinea worm larvae. Once inside a human, the larvae spawn worms that can reach three feet in length. The worms incubate for a year and then emerge slowly through painful lesions. When people soak their lesion-covered limbs in water, the worms release larvae, starting the cycle all over again.

The 25-year-long push to eradicate guinea worm is championed by former U.S. President Jimmy Carter, whose Carter Center in Atlanta has led the effort. The donation from the U.K. Department for International Development will be made over four years to the Carter Center.

According to the center, the best way to eliminate the disease is to “prevent people from entering sources of drinking water with an emerging guinea worm and to educate households to always use household or pipe filters to sieve out tiny water fleas carrying infective larvae.”

Donald Hopkins, vice president for health programs for the Carter Center, said $275 million, donated by several governments, has been spent so far wiping out the disease. The U.K. donation will go toward the $75 million the Carter Center estimates is needed to get the job done and to verify eradication.

“We’re very close,” says Hopkins, who has been working on guinea worm eradication since 1980. “This is going to happen. I can’t predict when, but it will be soon.” The Carter Center’s goal is to break the cycle of disease transmission in South Sudan next year, with no cases reported in 2013, he says. It would take three years of no cases to certify that the disease has been wiped out.

The donation comes as the U.K. is growing foreign-aid donations while implementing belt-tightening elsewhere, said Annabelle Malins, British Consul General in Atlanta. “We hope this will be a major tipping point to provide for the full funding requirement” for guinea worm eradication, she said.

Guinea worm disease would be the second human disease to be eradicated after smallpox, and the first to be wiped out without a vaccine or medical treatment. The disease hurts local agriculture in particular as it cripples workers temporarily during planting or harvests.

Image: Associated Press

UK gives £20m to global war on guinea worm

By Charlie Cooper

Thursday, 6 October 2011

The fight to eradicate the gruesome and debilitating “guinea worm” disease, which would make it only the second disease in the world to be wiped out after smallpox, is on the verge of success after the campaign secured £20m in funding from the Government.

Guinea worm afflicted 3.5 million people across 21 countries in 1986, but thanks to a campaign launched that year by former US President, Jimmy Carter, it is now confined to South Sudan, Ethiopia and Mali, afflicting only 1,797 people last year.

The disease is contracted by drinking water contaminated with microscopic worm larvae, which grow up to a metre long and emerge about a year later from the afflicted person’s body through a blister in the skin.

Britain has now become the first state donor to fund the campaign, a move that could provoke the ire of many on the right of the Conservative Party, who have privately expressed concern that the Government is spending too much on foreign aid. There is no known cure or vaccine, but aid efforts have focused on providing drinking-water filters and educating vulnerable populations about the dangers of drinking contaminated water.

The disease is usually non-fatal but causes extreme pain and leaves sufferers bedridden for weeks or months. If the eradication drive is successful, it will follow smallpox into history and the species that causes it will be declared extinct.

Jimmy Carter paid tribute to the UK’s “willingness and staying power” in supporting his campaign, which hopes to achieve its goal by 2015, and called on other donors to “match the UK’s efforts”. The funding pledge from the Department for International Development (DFID) is dependent on other donors providing the additional £40m needed to achieve the Carter Foundation’s goals.

Dr John Hardman, president of the Carter Foundation, praised the DFID for leading the developed world on international aid.

“We have had a strong partnership with DFID for years and to hear about this additional grant was music to our ears,” he said. “DFID exemplify how we can form partnerships to attack challenging problems and diseases in the developing world.”

The disease by numbers

99.95% The fall in sufferers from guinea worm disease over the past 25 years.

£60m The total amount of money the Carter Centre believes is needed to eradicate the disease forever.

£950m The amount of the DFID’s annual £8.1bn budget spent on health projects.

Jimmy Carter asks for cash to wipe out guinea worm


Former U.S. President Jimmy Carter is appealing for other donors to join Britain in a multi-million dollar campaign to wipe out guinea worm, a crippling and painful parasitic disease that now exists only in four African countries.

At a press briefing in London on Wednesday, British officials are expected to pledge 20 million pounds (US$31 million) over four years to the cause — but only if other donors also open their wallets.

The global campaign to eradicate guinea worm started in 1980, when there were about 3.5 million cases of the disease, also known as dracunculiasis, every year across Africa and Asia.

Since then, cases have dropped by more than 99 percent, but the disease remains a problem in South Sudan, Ethiopia, Mali and Chad. Last year, there were 1,797 cases.

The Carter Center and partners, including the World Health Organization and the U.S. Centers for Disease Control and Prevention, aim to get rid of guinea worm disease by 2015.

There is no treatment or cure; the disease is eliminated by stopping people from drinking dirty water and by preventing infected people from wading into water and spreading the disease. Health campaigns that focus on changing behavior are often more difficult to implement than those that rely on medicines or vaccines.

Smallpox is the only disease in history to have been eradicated; an effort to get rid of polio is ongoing.

People get infected with guinea worm when they drink water infected with the larvae of the parasite.

About a year after someone is infected, the spaghetti-like worm, which can grow up to 1 meter in length, bursts out of their foot. That painful process can take months, often leaves the patient bedridden, and involves winding the worm around a stick so it doesn’t break.

Guinea worm disease “prevents people from escaping poverty,” Carter said in a statement. “I welcome the challenge laid down by the British government. I call on other donors to match their efforts.”

Efforts to end worm disease get British boost
October 5th, 2011
01:49 PM ET

Britain will back a final push to wipe out a debilitating parasitic worm disease that is on the verge of worldwide eradication.

Former President Jimmy Carter, World Health Organization director-general Margaret Chan and British officials in London announced Wednesday a new campaign to rid the world of the Guinea worm, which would become the second disease ever to be eradicated.

The British government pledged about $30 million in eradication efforts. International Development Minister Stephen O’Brien and Carter emphasized the need for donors to match the funds to get rid of the guinea worm.

“The eradication of guinea worm is within our sights,” O’Brien said.  “But it does still remain unfinished business, mainly for the poorest people in remote regions of the remaining four endemic countries where the worm persists.”

The first disease to be wiped off the earth was smallpox, which was eliminated through vaccines.

Unlike smallpox, the Guinea worm disease is not fatal. But there is no treatment for it, and no vaccine to prevent infection either, according to the Centers for Disease Control and Prevention. The disease can, however, cause permanent disabilities, crippling people’s livelihoods and local economies.

The key to eradicating the disease is access to clean water and changes in people’s behavior because the parasitic Guinea worm lives in stagnant water.  When a person drinks the contaminated water, the worm grows inside its human host for a year until it emerges through the skin, causing great pain and in some cases, infections. The worm has been described in the Bible and Ancient Egyptian and Greek texts.

Graphic: How the guinea worm infects a person

Today, the worm is far less pervasive.  Statistics from 2010 show that 1,797 cases remain in the world, in four countries: Ethiopia, Mali, Chad and mostly South Sudan.

“For most of the world, this is an invisible worm – out of sight, out of mind, because it affects the poorest of the poor, people living in remote, rural areas,” said Chan from the WHO.

Carter commended the British government for “its willingness and staying power to help eradicate this debilitating disease,” and called on donors to match their efforts.  The goal is to stop the transmission of the guinea worm before 2015.

Unlike HIV/AIDS, tuberculosis and malaria, guinea worm is a little-known disease. The Carter Center, based in Atlanta, Georgia, has led public health efforts tackling neglected diseases most Americans have never heard of.

“We have a policy at our center of undertaking difficult projects, quite often which no one else wants to adopt,” Carter said during the press conference.  “Perhaps one of the most vivid examples of this has been guinea worm.”

Since 1986, the center’s efforts have focused on health education, training of health workers and village volunteers who monitor and treat patients.  The center has also supplied simple tools for clean drinking water and village-based education on avoiding the disease.

The greatest threat remains in the world’s newest country, South Sudan, which has about 6,000 villages under surveillance by 12,000 health volunteers.

Calling the remaining cases “unfinished business,” O’Brien said health officials had reason for cautious optimism. “We know the final mile can often be the longest part of the journey.”

Jimmy Carter spearheads final drive to eradicate guinea worm disease

£60m needed to finish the job and wipe crippling condition from the planet


A guinea worm is extracted by a health worker from a child’s foot in Savelugu, Ghana. Photograph: Olivier Asselin/AP

The world is tantalisingly close to eradicating guinea worm disease, which would make it only the second disease of humans to be wiped from the planet, according to former US president Jimmy Carter.

Speaking in London alongside World Health Organisation director general Dr Margaret Chan, Carter, who has led the fight against the disease, said that around £60m more was needed to finish the job.

Since the Carter Centre took up the cause in 1986, almost every nation had eradicated the crippling and painful disease, said the former president. “It is likely by the end of this year we will have guinea worm in only one country – the newest one on earth – South Sudan,” he added.

In 1995 Carter personally negotiated a six-month ceasefire between northern and southern Sudan, in a successful attempt to reach remote villages where guinea worm larvae infest drinking water, causing immense suffering to some of the poorest men, women and children on earth.

“The Carter Centre’s programme is designed to go into the places where the needs are greatest and quite often where the needs are neglected by others,” said the former president. “We couldn’t get into southern Sudan because of the war.”

In 1995 the leaders of north and south agreed the longest-ever ceasefire in the conflict, enabling volunteers to reach remote rural villages. They knew, said Carter, that “guinea worm was a blight on the people. There was an inseparable connection between peace on the one hand and doing away with guinea worm on the other.” Carter eventually helped negotiate peace and his centre monitored the national elections in 2010 and the referendum on separation this year.

Since 1986, the annual caseload of guinea worm disease, once 3.5m cases across 21 countries, has been reduced by 99.9%. Now there are fewer than 1,000 a year.

In 1979, while Carter was president, the eradication of smallpox was declared. That cost £195m and was achieved through mass vaccination – a feat that is being attempted in polio but which looks difficult to repeat with the increased movement of populations.

Guinea worm eradication, a generation later, has so far cost £250m and is close to being achieved without recourse to vaccination or treatments, because they do not exist. The disease is being prevented through the drilling of wells for uncontaminated water and education of those who live in remote rural villages. People have been taught to filter their drinking water through a small pipe, cheaply made and distributed, which removes the guinea worm larvae.

The effort to reach the remotest villages has paid dividends, said Carter. “When we go in to a place like South Sudan, we have personally trained about 12,000 local volunteers and taught them aspects of healthcare and about good water that is clean to drink. We have often been able to dig deep wells that are free from disease.”

There have been other benefits too. “In the rest of their lives, many have never known success. They have never attempted anything that really succeeded. Quite often their relationship to foreigners has comprised broken promises. When we go in and teach them how they can correct their own problem, they not only learn the rudiments of healthcare and sanitation but they learn how to be self-sufficient and gain self-respect,” he said.

Stephen O’Brien, international development minister, pledged on Wednesday the UK government would provide up to one-third of the funding needed for the campaign against the guinea worm. But the amount of the British donation is dependent on how much is put in by others – the Department for International Development will put in £1 for every £2 from elsewhere, he said.

O’Brien added that discussions were taking place with other donors, but that it would be premature to reveal their identities. “I very much hope they will produce a response to the challenge,” he said.

Congress strikes back against Obama’s child soldiers’ waivers

Posted: October 7, 2011 by PaanLuel Wël Media Ltd. in Socio-Cultural

Posted By Josh Rogin, Wednesday, October 5, 2011


The Cable reported yesterday that President Barack Obama waived penalties on several countries that recruit child soldiers for the second year in a row. Today, lawmakers moved to ensure that the administration won’t keep funding governments that use child soldiers next year.

The administration waived penalties mandated under the Child Soldiers Protection Act (CSPA) against Yemen, Chad, and the Democratic Republic of Congo (DRC). The administration didn’t provide a justification for not penalizing South Sudan, because the 2011 Trafficking in Persons (TIP) report, which was released on June 27 and triggers the penalties, names “Sudan,” not “South Sudan,” as an abuser. South Sudan was declared independent on July 9, 12 days after the report came out.

“South Sudan wasn’t a country during the reporting period and isn’t subject to the CSPA; there are no penalties to waive under the law,” National Security Council spokesman Tommy Vietor told The Cable.

That explanation struck several congressional aides and human rights activists we spoke with today as too clever by half. After all, the TIP report was referring to the use of child soldiers by the government of “Southern Sudan” and the Sudan People’s Liberation Army (SPLA), which hasn’t stopped the practice and will receive $100 million of U.S. taxpayers’ money this year.

“They’re using a legal and technical loophole to continue to build up partnership with a government that needs to be reminded how serious this problem is,” said Sarah Margon, associate director for sustainable security and peace building at the Center for American Progress. “It’s exactly how not to establish the message that they need to set up their government with full respect for human rights and transparency.”

“At the time the TIP report came out, it was obvious South Sudan was going to be an independent country so any responsible person would have taken that into consideration,” one senior House aide told The Cable. “Apart from the law, the White House still had discretion to address the issue as a policy matter and it chose not to condition any of the aid on the SPLA completing its demobilization of child soldiers.”

The administration made the case that Chad has made sufficient progress on the child soldiers issue, and is no longer subject to penalties. “We’ve seen the government take concrete steps over the last year to implement policies and mechanisms to prohibit and prevent future government or government-supported use of child soldiers,” Vietor said.

“The U.N.’s Chad Country Task Force has reported no verified cases of child soldiers in 2011, and Chad has put in place safeguards to prevent further use or recruitment of child soldiers. The president’s reinstatement of assistance to Chad reflects this progress,” he explained.

But several activists noted that the United Nations and State Department both kept Chad on their list of countries violating international standards for child recruitment this year, and that international monitors’ limited access in Chad calls into question anybody’s ability to verify whether the government has stopped using child soldiers.

Several aides and activists were angry at the administration for failing to adequately consult or even inform them of the waivers before they were announced. Administration officials briefed congressional staffers and NGO leaders only yesterday, and did not brief journalists at all.

“It also says something about the State Department’s willingness to engage with civil society actors,” said Margon. “It’s a black mark on them in their ability to work with friends and allies on these issues. Why alienate the people who want to work with you on this stuff? It just doesn’t make any sense.”

Congress has no intention of letting this scenario play out again next year. Rep. Jeff Fortenberry (R-NE), vice chairman of the House Foreign Affairs Subcommittee on Africa, Global Health and Human Rights, successfully added an amendment to the Trafficking Victims Protection Act reauthorization bill today that would force the administration to give Congress 15 days notice before issuing waivers for the child-soldier penalties.

The amendment would also expand the law to include peacekeeping funds given to violator countries (such as Somalia), and force the White House to show that countries are making progress toward eliminating the use of child soldiers before receiving a waiver. Sens. Richard Durbin (D-IL) and John Boozman (R-AR) have already introduced a companion measure in the Senate.

Not all Capitol Hill staffers were completely unsympathetic to the administration’s arguments, however.

One Senate aide referred to the progress noted by the Obama administration in Chad and the partial cut of U.S. military assistance in the DRC as “welcome steps — steps that might not have occurred without the force of the Child Soldier Prevention Act,” noting that they “will require serious follow up attention.”

But overall, the administration’s rollout of the decision was panned by the NGO and human rights communities, which see the administration’s action as undermining the intent of the legislation.

“At a time when Congress is locked in one of the most difficult budget battles I’ve ever seen, it is shameful that a portion of federal funding continues to help support governments who are abusing children,” said Jesse Eaves, World Vision’s policy advisor for children in crisis. “This is a very weak decision by an administration paralyzed with inaction. And the worst part is that thousands of children around the world — not the politicians in the White House or the State Department — are the ones who will suffer.”