Archive for the ‘Science’ Category

By Abel Majur Leek, Bor, Jonglei State


Corruption in South Sudan

December 12, 2015 (SSB) — I have heard different stories about people who lost their phones, either through their own carelessness or through no fault of their own. I, too, have lost my phone once, but the focus here will be on ways to checkmate the re-use of stolen phones. Imagine you are shopping in a large mall or market and, before you are done, your phone is gone. The truth is that your phone can be stolen anywhere: office, bus, mall, stadium, etc.

One thing we should know, for those who do not know, is that virtually every phone has an anti-theft feature. This feature helps the owner prevent whoever stole the phone from using it. It also allows the owner to send commands to the stolen phone to lock it, delete stored data, or disable the phone altogether.
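One common building block behind such blocking is the IMEI, the handset's unique identity number (dial *#06# on most phones to see yours): carriers keep registers of IMEIs reported stolen and refuse them service. The sketch below is only a minimal illustration of that idea, not how any real Equipment Identity Register works; the blacklist and the IMEI values are standard published test numbers, used here hypothetically.

```python
# Hypothetical carrier-side blacklist of IMEIs reported stolen.
REPORTED_STOLEN = {"356938035643809"}

def luhn_valid(imei: str) -> bool:
    """IMEIs carry a Luhn check digit; reject malformed numbers first."""
    digits = [int(d) for d in imei]
    total = sum(digits[-1::-2]) + sum(sum(divmod(2 * d, 10)) for d in digits[-2::-2])
    return len(imei) == 15 and total % 10 == 0

def allow_on_network(imei: str) -> bool:
    """Admit a handset only if its IMEI is well-formed and not blacklisted."""
    return luhn_valid(imei) and imei not in REPORTED_STOLEN

print(allow_on_network("356938035643809"))  # False: reported stolen
print(allow_on_network("490154203237518"))  # True: valid and unlisted
```

The point of the sketch is simply that blocking happens network-side by identity number, which is why a stolen phone can be made useless even after its SIM card is swapped.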


Ajang Barach-Magaar, Nairobi, Kenya

A selfie of Ajang Barach Daau


June 4, 2015 (SSB)  —  This article is generally addressed to the public, and in particular to the folks that are touted to be ardent apologists of feminism. For quite a long time now, I have had to humble myself and begrudgingly bottle up as crusaders of feminism defend it with an aesthetic devotion of a juvenile, mostly elsewhere.

The primary purpose of the article is to remind everyone of some well-established, brutal human behavioural realities. The biological discrepancies between the dominant man and sullen woman extend far beyond the anatomical aspects. There are marked physical and intellectual inequalities between the two genders.

The male superiority is sufficiently certain to serve as a point of departure from the spurious equality that people have arbitrarily imposed on modern societies. Feminism is a dangerous philosophy confounded by relative poverty of knowledge of evolutionary science.

In any human population, the female’s physical and intellectual inferiority is strikingly evident. Taxonomists of the distant future might eventually sub-divide Homo sapiens by placing women in a distinct, inferior sub-species. Some scientists have even already suggested Homo frontalis for man, and Homo parietalis for woman. But let’s defer deliberations on that subject to a future date.

There may be a few exceptions, incidences where a woman exhibits venerable capabilities, vastly superior to the average man; evidence of this trend has been observed in studies involving the weak, emaciated and effeminate man, congenital idiot or a senile man. But as Gustave Le Bon brilliantly observed, this is as rare as the birth of any monstrosity – like the birth of a gorilla with two heads.

First of all, the behavioural patterns of the juvenile of both genders conspicuously resemble the woman during their early stages of development. As they hurtle headlong towards maturity, the male acquires superior intellectual and physical traits. The girl undergoes an arrested development. She continues to be timid, emotional and irregular in her actions. In fact, a mature woman is a stunted man! She is a replica of man’s past and a lower state of civilization.

As Gustave Le Bon succinctly quipped, women are closer to children and savages than they are to grown up civilized men. If you are a Dinka, then you must have witnessed a girl submitting to an elemental force during a marriage. Submission is a profound manifestation of inferiority. The explanation for the redundancy in women is encapsulated in our tortuous evolutionary history. The guiding generalities for our supremely disproportionate behavioural ramifications were the selective pressures.

The tasks of man have always involved complicated, high stake manoeuvres. Our ancestors used to hunt to eke out a living from their wild environment. As we all know, history is replete with innumerable instances where the hunter becomes the hunted. Our hunter progenitors had to balance the risk of family starvation against being devoured by carnivores while scampering about on the remote savannah.

Taking this analogy as nicely elucidated by Charles Darwin, we can conclude that natural selection would have eliminated the weak, and allowed the strong to thrive. So a muscular, fast, intelligent hunter would be more apt to survive the hunt and bring back home the meal. This was in stark contrast to the woman who was exclusively restricted to a sedentary life of child-bearing and house-keeping.

Therefore, the woman’s brain and physique weren’t adequately challenged. Reports amongst South Sudanese Australians suggest that a substantial portion of women opt to offer child day care services. I’m genuinely not baffled by this widespread eventuality. For all practical purposes, even the woman of our modern era would optimally excel in small, repetitive roles!

Secondly, there’s a stiffer competition for wives amongst men than amongst women in their bid to get husbands. To eclipse the chasing pack and carry off the prize (the wife), the man must not only be physically superior to his rivals, but also ought to be wittier. The man who is unfortunate enough to be deficient in these benevolent qualities has historically been dumped to the periphery of any society.

Through time, natural selection eventually ensures his descendants fizzle out of the gene pool, effectively rendering him extinct. While the man essentially requires these to gain a competitive edge, the woman only needs to be beautiful in order to summon his attention. She is also only tasked with choosing a suitable spouse, a far more mundane challenge.

At this point dear reader, I must stop, for this is certainly not a scientific paper where you would expect a catalogue of postulations to serve as justifications behind the aetiology of the female’s inferiority. My argument was to demonstrate why feminism is a farce. In this paragraph, I would like to summarise my case. In the 1860s (please refer to T. Bischoff’s work), scientists performed a series of craniometric experiments to determine the size of the woman’s brain.

The highest measurement ever obtained from a woman was 1,565 grams and it belonged to a woman who had killed her husband. Throughout human history, the woman has always played second fiddle to the man. Any deviation from this norm must be closely monitored. Too much of grandeur might be detrimental for some people. The women who have always proved superior to men have never made the requisite grade of good wives, or house-keepers.

Instead of lending a helping hand to the feminists, I feel that they deserve to be treated to a chorus of utmost contempt. Those who advance feminist views should be educated on the dangerous warmongering they are engaging in as equating a woman to a man would propagate perpetual elevation in the disintegration of family unit. The general implications of this should be better imagined than experienced.

In conclusion, the boy child is born brave, with all the hopes and responsibilities of the future. It is the man who is constantly active in combatting the hurdles of human advancement. Let the girl child remain weak and vulnerable. In evolutionary terms, education induces an almost insignificant transformation in anyone. Nothing will ever harmonise our biological disparities.

Formulating ideas for the welfare of women as a prelude to elevating them to equal societal status as men is an idea of a feeble logical maxim. This facet of activism is a sheer waste of everybody’s energy. Its proponents must know that there are phenomena that you cannot exonerate by just fighting.

The man will always be the serial paragon of wisdom in any human society. Nature has confined women to a position of permanent inferiority. Not even a professorial virtuosity can astutely counter the woman’s innate propensity of excelling in indolence, skewed logic and domination by emotions.

The author, Ajang Barach-Magaar, is a 4th-year B.Sc. Biomedical Technology student at the University of Nairobi, Kenya, and can be reached through his email: Dau Barach <>

The opinions expressed here are solely the views of the writer. The veracity of any claims made is the responsibility of the author, not of the PaanLuel Wël: South Sudanese Bloggers (SSB) website. If you want to submit an opinion article or news analysis, please email it to the SSB editors. SSB does reserve the right to edit the material before publication. Please include your full name, email address and the country you are writing from.

By Isaac Achol Malony

‘Only when the last tree has died,
the last river has been poisoned,
and the last fish has been caught,
will we realise that we cannot eat money.’

A Cree Indian saying.

The initiative to establish the Centre for Environment and Nile Conservation has come as a result of close observations made by scientists based in South Sudan of how rapidly the environment is being polluted. They are well aware of the high levels of pollution being discharged directly into the River Nile and its surroundings, and well acquainted with long-term pollution cases of rivers like the Ganges in India and the Ruhr in Europe, and with how the people and governments there now regret that state of affairs.

Hence we are coming up with a proposal to establish the Centre for Environment and Nile Conservation in South Sudan in order to preserve our environment and waters by teaching our people and spreading awareness to all before it is too late to rescue the situation. There is a very high release of pollutants into the air and the waterways in South Sudan and upstream in Uganda.

The centre will focus on raising awareness of the dangers of polluting the environment so badly and so quickly: pollution has risen to its current high levels within a few years, especially within the last decade, and the speed with which it is escalating is quite alarming.

The targeted outcomes of this centre will be to protect the abundant resources in the Sudd basin and other areas, mainly for the sake of humanity worldwide and particularly the South Sudanese, Ugandans, Sudanese, Egyptians and others who depend on or enjoy the Sudd basin and the Nile’s resources. Most of the pollutants going into the river are non-degradable materials like PET bottles, glass, plastic bags, and all kinds of plastic materials from broken chairs, tables, etc.

There are also many cases of observed and, to some extent, unconfirmed sewage discharges into the River Nile and its tributaries, especially the small feeder streams in South Sudan that join it. The air within the vicinity of populated areas in South Sudan is also being badly polluted by the indiscriminate burning of tyres, plastic bags and bottles, releasing possibly highly carcinogenic gases that people in the surroundings simply keep breathing in, with no choice at all. The materials we are spoiling our environment with are successfully recycled in other parts of the world, but no such practice exists in South Sudan; this worsens the situation and leaves only the option of dumping everything everywhere, with volumes increasing day by day.

We are seeing water factories and other manufacturing plants coming up very rapidly in South Sudan, and the oil sector is also expected to develop in the years to come, with a number of refineries that may spring up. This raises the very worrying prospect of increased, uncontrolled pollution of our environment.

The issue of generators is also disturbing, with high levels of noise and smoke pollution. Although it is well known that people in South Sudan have little access to clean power sources like hydroelectricity, solar (existent but very minimally used) and others, and so have generally resorted to the said generators, the public can still be made aware of clean options like solar energy, which many in South Sudan overlook.

Thus extensive teaching of the population can turn the scenario around. There is also an alarming manner in which petroleum waste is being disposed of here in South Sudan; this kind of waste is well known to have long-term effects on the environment, with a tendency to be carried to distant locations by running water, polluting large land surfaces and water sources far from where it was dumped.

The Centre plans to liaise effectively with other world conservation and study centres, the Government of South Sudan and the Great Lakes countries, the UN environment body and other relevant institutions, especially regional ones. It will work hand in hand with the University of Juba, or any designated university, and other local institutions for effective outreach to students studying in South Sudan, so that they can benefit from the teachings, deliver the message to the population, and in time take the lead in environmental conservation themselves.

The scientists involved, being well aware of the kinds of pollutants affecting South Sudan, will take tours around the world, in countries of interest and in neighbouring countries, to acquaint themselves with better means of containing the pollution locally. They will also print campaign materials to spread environmental awareness among the communities and prepare academic teaching materials for students in South Sudan.

The group will also work hard to ensure effective running of this Centre.

Cde Isaac Achol Malony can be reached on:

By Malith Alier, Bor


January 8, 2015 (SSB) — Our environment is the single most important surrounding that we should all cherish and protect at all costs. The environment, for anybody not yet acquainted with the term, is the natural world that exists around us. It includes plants, animals, water and even the fellow human beings in our vicinity!

It is the external environment that helps shape us into what we are. Your immediate environment provides you with food and other needs like trees for construction, firewood, charcoal and medicinal plants.

There is a growing concern around the globe that the natural environment is under threat from man himself. Vast swathes of forest are being cleared for farming and logging, and are also used as a source of energy, particularly in the third world. In recent years you might have heard of the destruction of the Amazon rainforest through logging by faceless multinational companies.

The sustained destruction of the natural environment around the world through the above means is behind disasters like tsunamis, hurricanes and the melting of ice that results in rising sea levels, among others. In general, scientists call this climate change.

Neem tree (Azadirachta indica)

At the local level, our very environment is severely under attack by foreign hands, with the acquiescence of local authorities and folks. Juba city was surrounded by thick forests in the sixties and seventies, according to those who were here at the time. This is hardly the case today, because of charcoal. The demand for energy in the form of charcoal outweighs supply, particularly in towns and big cities in the country. Alternative sources of energy like natural gas are nonexistent, despite South Sudan being a petroleum-exporting country.

On several visits to Bor, this writer witnessed two threats to our only reliable source of everything: the forest. The forest, as far as I remember, is a source of energy, food, shelter and rain. The forest is, further, a source of the very oxygen you breathe. It is the habitat of abundant animals. Fauna and flora coexist!

One real threat to our indigenous trees is the neem tree, scientifically named Azadirachta indica. The white people who did not have the interests of South Sudanese at heart introduced this invasive plant, which will eliminate our indigenous forest in the long run. This is abundantly clear right now from the look of the city of Bor. The indigenous plants are no more in town.

The marauding neem trees are advancing into the heart of our forests at a speed of a thousand miles per day! The irony is that no one seems to notice. A lot of people take slight notice of it but take no action, and even those in authority seem oblivious to it. Note as well that it is not only Bor that is under invasion by neem trees; other towns and cities are affected in the same way.

The second threat is the routine clearing of forests, with impunity, by foreigners who have no knowledge of the important indigenous plants that the locals dare not cut. These include lalop (thou), acuil, lang, cum, luta and many more. These foreign folks have nothing to lose, because at the end of it all they will go back to their countries after exhausting our God-given green forests.

The foreigners in question include Ugandans, Sudanese from Darfur and others. These people have rules in place in their own countries about where, when and what kind of trees to fell for charcoal. The Sudanese from Darfur, however, are people of the desert and therefore have no regard for forests whatsoever. It is therefore inconceivable to entrust them with invaluable forests like ours.

It is only in South Sudan that foreigners do as they wish. They bring seeds of unknown plants and introduce them anywhere they please. They fell indigenous and sacred trees mercilessly without planting replacements. This is to say that the basics of environmental sustainability are swept under the carpet amid the dormancy of the environmental protection agencies.

The world of the environmental protection agencies is a dormant universe. The Ministry of Environment is doing no meaningful work, just like the Ministry of Interior, which allows foreigners to enter and do anything at will.

The inaction of both ministries puts the country at the mercy of foreign hands all the time. We haven’t realised that missed opportunities are difficult to reverse. The foreign nationals will take it as a right to be in South Sudan. The Arabs of the north were hard to dislodge from the country; much bloodshed was the result that eventually forced them out, paving the way for the 2011 independence. Even after independence, similar problems of foreign influx continue to dog this country.

The East African boda boda operators caused a headache between South Sudan and Uganda. Last year, the labour ministerial order barring foreigners from holding certain positions in the country was another example of this kind. Foreigners have taken it upon themselves not to be bound by the rules and regulations of sovereign South Sudan. It is the same mentality as under colonisation, and it is we who continue to suffer.

Another problem the environment faces, besides the above, is the usual pollution by the plastic carriers banned during Kuol Manyang’s term as governor of Jonglei. Every kind of plastic carrier is now back in the state capital after the takeover by another caretaker governor. Though banning plastic containers was not the complete solution, it helped in a city where garbage collection is rudimentary at best.



A proactive approach needs to be devised right away if the current disaster is to be reversed. Those who are actively involved in clearing bushes should be licensed and educated on which trees to chop and at what time. Not only that; they should also be required to plant replacement trees within a certain period. This should be a condition for licensing. Desertification as a result of random clearance of forests is not an option here. A country with green forests is blessed.

Imagine a situation where South Sudan were a desert. Our past liberation wars would have been disastrous, as was the case in Darfur. The bushes of our forests provided cover for the Anya-Nya and SPLA fighters. The wild fruits provided food for all of us during those difficult times of the two wars. The forests provide protection against the erosion of soil by rain or wind. The southern forests attract timely rains every year on end. These are some of the reasons that should compel us to protect our forests.

All hope is not lost in the fight against environmental destruction. There are people who have realised the importance of preserving our environment. Although we have no environmental activists of the stature of the late Wangari Maathai of Kenya, everybody who has realised the poor state of our environment plants a tree or two at their home front. This will, in the long run, alter the situation for the better. Please plant mango, guava, lemon and other edible plants at home or in your gardens if you can. The end result will be amazing, and South Sudan will remember you for it.

There is one farmer in Bor who is always in the news for making a difference. He has been planting the above trees since the arrival of the liberation forces in 2005, and is earning big in a fruit market he alone monopolises. He is called Paul Alim Amol, and he is known throughout the country.


A happy New Year 2015 to all of you who will take the message of environmental protection as one of your New Year resolutions. Plant at least one tree for the New Year.


The Relativity of Wrong

Posted: November 6, 2013 by PaanLuel Wël in Philosophy, Science, Technology

The young specialist in English Lit, having quoted me, went on to lecture me severely on the fact that in every century people have thought they understood the universe at last, and in every century they were proved to be wrong. It follows that the one thing we can say about our modern “knowledge” is that it is wrong. The young man then quoted with approval what Socrates had said on learning that the Delphic oracle had proclaimed him the wisest man in Greece. “If I am the wisest man,” said Socrates, “it is because I alone know that I know nothing.” The implication was that I was very foolish because I was under the impression I knew a great deal.

My answer to him was, “John, when people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.”

The basic trouble, you see, is that people think that “right” and “wrong” are absolute; that everything that isn’t perfectly and completely right is totally and equally wrong. However, I don’t think that’s so. It seems to me that right and wrong are fuzzy concepts, and I will devote this essay to an explanation of why I think so.

As Interest Fades in the Humanities, Colleges Worry

Many do not understand that the study of humanities offers skills that will help them sort out values, conflicting issues and fundamental philosophical questions, said Leon Botstein, the president of Bard College. “We have failed to make the case that those skills are as essential to engineers and scientists and businessmen as to philosophy professors,” he said.

Science Is Not Your Enemy

Scientific ideas and discoveries about living nature and man, perfectly welcome and harmless in themselves, are being enlisted to do battle against our traditional religious and moral teachings, and even our self-understanding as creatures with freedom and dignity. A quasi-religious faith has sprung up among us—let me call it “soul-less scientism”—which believes that our new biology, eliminating all mystery, can give a complete account of human life, giving purely scientific explanations of human thought, love, creativity, moral judgment, and even why we believe in God. … Make no mistake. The stakes in this contest are high: at issue are the moral and spiritual health of our nation, the continued vitality of science, and our own self-understanding as human beings and as children of the West. 

Crimes Against Humanities

The question of the place of science in knowledge, and in society, and in life, is not a scientific question. Science confers no special authority, it confers no authority at all, for the attempt to answer a nonscientific question. It is not for science to say whether science belongs in morality and politics and art. Those are philosophical matters, and science is not philosophy, even if philosophy has since its beginnings been receptive to science. Nor does science confer any license to extend its categories and its methods beyond its own realms, whose contours are of course a matter of debate. The credibility of physicists and biologists and economists on the subject of the meaning of life—what used to be called the ultimate verities, secularly or religiously constructed—cannot be owed to their work in physics and biology and economics, however distinguished it is. The extrapolation of larger ideas about life from the procedures and the conclusions of various sciences is quite common, but it is not in itself justified; and its justification cannot be made on internally scientific grounds, at least if the intellectual situation is not to be rigged. Science does come with a worldview, but there remains the question of whether it can suffice for the entirety of a human worldview. To have a worldview, Musil once remarked, you must have a view of the world. That is, of the whole of the world. But the reach of the scientific standpoint may not be as considerable or as comprehensive as some of its defenders maintain.

Is the ‘Dumb Jock’ Really a Nerd?

In the frequent debates over the merits of science and philosophy, or the humanities in general, it is often assumed that the factual grounding and systematic methodology of the sciences serve as a corrective to the less rigorous wanderings of the humanities. And while many take the position that the humanities can provide their own important path to enlightenment, few argue that considerations from philosophy can or should correct the considered judgment of scientists. Even most defenders of the humanities hold that the sciences are directed at truth, whereas the humanities have an alternate goal, perhaps the molding of ideal citizens.

The Triumph of Humanity

Posted: September 13, 2013 by PaanLuel Wël in Science

Voyager-1 will not approach another star for nearly 40,000 years, even though it is moving at 45 km/s (100,000 mph). “Voyager-1 will be in orbit around the centre of our galaxy with all its stars for billions of years,” said Prof Stone.

Voyager 2 launched on 20 August 1977; Voyager 1 lifted off on 5 September the same year.

  • Their official missions were to study Jupiter and Saturn, but the probes were able to continue on
  • The Voyager 1 probe is now the furthest human-built object from Earth
  • Both probes carry discs with recordings designed to portray the diversity of culture on Earth
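The quoted figures above are easy to sanity-check. The short calculation below (an illustration using round-number constants) confirms that 45 km/s is roughly the quoted 100,000 mph, and that even at that speed the probe covers only about six light-years in 40,000 years, which is why stellar encounters lie tens of millennia away.

```python
# Back-of-the-envelope check of the Voyager-1 figures quoted above.
SPEED_KM_S = 45.0                      # quoted cruise speed
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, in seconds
LIGHT_YEAR_KM = 9.4607e12              # kilometres in one light-year

# 45 km/s expressed in miles per hour (1 mile = 1.609344 km).
mph = SPEED_KM_S * 3600 / 1.609344
print(f"{mph:,.0f} mph")  # ≈ 100,662 mph, matching the quoted 100,000 mph

# Distance covered in 40,000 years at that speed, in light-years.
distance_ly = SPEED_KM_S * 40_000 * SECONDS_PER_YEAR / LIGHT_YEAR_KM
print(f"{distance_ly:.1f} light-years")  # ≈ 6.0 light-years
```

Six light-years in 40,000 years puts interstellar distances in perspective: the nearest stars are several light-years apart, so "nearly 40,000 years to approach another star" is consistent with the quoted speed.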

#Nasa’s Voyager-1 spacecraft has become the first manmade object to leave the Solar System

Scientists say the probe’s instruments indicate it has moved beyond the bubble of hot gas from our Sun and is now moving in the space between the stars.

Launched in 1977, Voyager was sent initially to study the outer planets, but then just kept on going.

More on Voyager:

- Through the door to eternity
- Jonathan Amos: Voyager-1 departs to interstellar space
- How long would it take you to travel as far as Nasa’s Voyager?


The Wonders of the Universe (Video).

Posted: April 8, 2012 by PaanLuel Wël in Commentary, Science

Easter Season–There is more to the universe than the crucified Jesus!!!

Human fascination with the wonders of the universe is rivalled only by the infiniteness and queerness of the universe itself!! Join the human imagination on the “Journey to the Edge of the Universe” by the National Geographic Channel. In the immortal words of the great English scientist Sir Isaac Newton, the human race is like “a boy playing on the sea-shore while the great ocean of truth lay all undiscovered before” us:

1. Journey to the edge of the Universe

2. Brief history of time

3. Space Odyssey: Voyage To The Planets 1 – BBC

4a. Life, the Universe and Everything – The Story of God – BBC

4b. No God but God

5. The Universe – Strangest Things In The Universe | Full Documentary

6. History Channel Albert Einstein (complete)

7. National Geographic – Inside the Milky Way (2010)

8. Earth: Making of a Planet – National Geographic Channel

9. Solar Empire – Alien Neighbors Full Episode


And here, stone age man, short-circuiting evolution.


PaanLuel Wel.


By Dennis

After 21 years of civil war in Sudan in which millions of lives were lost, one would imagine that the most logical programme for the world’s youngest nation, South Sudan, would be one that promotes population growth to replace the lost lives. South Sudan, a country twice the size of the UK with a population estimated at less than 10 million, is not ripe for abortion.

Abortion is illegal in South Sudan, and any organization or individual promoting abortion is promoting an illegality. However, despite what the law says and the knowledge that abortion is illegal in South Sudan, Marie Stopes International (MSI) opened a clinic in Juba (Hai Negli area, east of Juba University) in July 2010. The opening ceremony was a low-key event, intentionally designed not to attract media attention, as it came soon after they had been kicked out of Khartoum. There, their assets, including brand-new cars, were taken by the Humanitarian Aid Commission (HAC), and all staff were paid six months’ salary to disappear and not talk to the media.

Internationally, MSI is known to provide abortions, especially in countries where abortion is legal (the UK, South Africa, etc.). On its official UK website, MSI describes itself and the kind of services it offers to clients:

“Marie Stopes International is the UK’s leading provider of sexual and reproductive healthcare services. Our nationwide network of sexual health clinics see over 100,000 men and women each year who come to us for information, advice and professional care. We are committed to providing all the help you need to make informed choices about your health. We ensure that there is no wait for appointments and that our centres are safe, friendly, and comfortable places to visit. Our services include: Contraception, Unplanned pregnancy counseling, Abortion information and advice, Help for women needing abortions, Abortion pill and other treatment options, Vasectomy information and vasectomy procedure, Female sterilization, Health screening for men and women, Company health screening and STI testing.”

Even in countries where abortion is illegal, the practice continues but is disguised as the provision of family planning services. That is how they took root in Juba, South Sudan, and in Khartoum previously.

Taking the case of the MSI Juba clinic, for instance: immediately after launching, 16 staff were recruited to do underground mobilization, targeting young girls at the University of Juba and its environs. The message was clear: tell them there are family planning services and other services like those in neighboring countries. (In Uganda and Kenya, MSI is known for this practice, and they were targeting those who could have heard about it, such as returning refugees.)

Realizing the targets were not being met, MSI decided to recruit and bring in a new country director, one who had the credentials of increasing sales in record time. And this time the sales were measured in terms of the highest number of abortions in the shortest time possible (the year-one target was 1,500).


Hardly a year into the operations, the first victim of the ‘safe abortion’ was recorded. However, the case had to be killed before being picked up by the media and the police. The family, based in Ethiopia, was paid off, and all expenses related to hospitalization and burial were catered for by MSI.

Shortly thereafter, another case arose. A man who was certain that his wife was pregnant came back from Yambio only to find that the wife was no longer pregnant. Upon severe battering, the wife admitted that she had procured an abortion at the MSI clinic in Juba. The man had to be paid to kill the story, and a high-level delegation was sent from London, since they were getting concerned that their young project might go the Khartoum way (in reference to how they were kicked out of Khartoum).

Late last year (2011), the third victim was a young girl, probably in high school. The pregnancy was more than 12 weeks along, and the doctor on duty refused to perform the operation. He was sacked for this, but the saddest part is that the young girl passed away.


Using the NGO tag, MSI is freely getting supplies from UNFPA, MSH, and the Ministry of Health in Juba. Family planning commodities are given to them free of charge, yet instead of providing free services, as it should, MSI ends up selling the commodities to its poor victims!

Memorandum of Understanding (MOU):

MOU with the national Ministry of Health and the State Ministry of Health: To get the MOUs signed with the government, as is the norm in Juba, long, lofty letters and communications are sent back and forth to London to ensure that nothing sensitive passes or can be picked up by the government authorities. If only the government bodies did due diligence, the real business of the clinic would have been unearthed.

Smuggling of Abortion Drugs:

Abortion drugs that are not registered in Juba find their way in through parcels and staff handbags. Any staff member travelling to or from London will be delivering or picking up orders. Since they know that the scanning at Juba International Airport is not rigorous, they pass scot-free, and duty- and tax-free!

Training of Staff:

Capacity building is one of the core pillars of any programme, and true to that, MSI ensures that staff are taken out of Juba, especially to Kenya, Ghana, and South Africa, where many women come for abortions on whom they can practice. The flip side is that, instead of focusing on family planning as indicated on the visas, the whole two-week training is on abortion, thus missing the real benefit to the deserving communities. The money would be better spent training midwives to improve the maternal mortality situation than focusing on abortions.

Incriminating Documents:

Documents in our possession indicate clearly the objectives of the clinic. To protect the operations, most documents are marked classified or confidential and are never shared with outsiders. In the documents, you can clearly see what is written: different terms are used to cover up the real deal, i.e. MSP and MSMP, basically abortions without medical assistance and with medical assistance, respectively.

8 attachments:

- Annex Three External Relations Guidance Summary 29 July 2011.docx (343K)
- Copy of South Sudan Logframe, Workplan & KPIs 08 11 11.xlsx (38K)
- Country Design South Sudan Short – Dr Baba Final.docx (64K)
- Interview questions for clinical staff.docx (17K)
- Interview Schedule for clinical Staff.docx (18K)
- PMDUP Start Up Guidelines 29 July 2011-1.doc (431K)
- Result Scan-Juba November 2011.xls (399K)
- Revised order form-MSI-SS.xls (302K)

Interviews and Recruitment:

Having operated for one year without being unearthed, the strategy is now to move to other promising areas (potential business). The major towns identified so far are Torit, Nimule, Yambio, Yei, Kajo Keji, and possibly Mundri.

When staff are being recruited, the only thing that matters is a pro-abortion answer. Even if you are the most qualified candidate, if you are against abortion you have no chance of being recruited. Owing to the lack of adequate job opportunities in South Sudan, most people grudgingly say that they can perform abortions, only to realize later that it is a bloody and murderous affair.

According to one of the providers: “I saw the light after being presented with a case where the child came out still alive. I went home, confessed to a priest, and promised myself never to be involved again.” That is how she stopped working for MSI; her contract was terminated immediately.

The Big Question is….

The big and urgent question is: for how long will this carnage continue? Or, if it is the best thing ever to have happened for our girls and women, then let the government openly announce that it wants to legalize abortion, or has already done so. In that case, let the MSI clinic and other clinics operate freely and competitively!

The citizens of South Sudan, as well as the government, must be made aware of the illegal activities of MSI in Juba. The government must wake up and declare which side it is on: MSI or the law! If the law has changed, that should be made explicitly clear to all citizens of our country, so that there are accompanying laws to safeguard and protect the lives of our womenfolk at the MSI clinics and to punish any negligence and cover-ups going on there.

The simple, yet fundamental, question is this: under whose (and what) law are MSI clinics operating? South Sudanese must know, because their loved ones are dying under mysterious circumstances, all under the nose of a seemingly dysfunctional government in Juba and beyond the reach of the law!

Surely, South Sudanese deserve better than this!!

Dennis is a concerned citizen of South Sudan from Juba.

First, They Came for the Catholics

Michelle Malkin’s column is released once a week.

By Michelle Malkin

President Obama and his radical feminist enforcers have had it in for Catholic medical providers from the get-go. It’s about time all people of faith fought back against this unprecedented encroachment on religious liberty. First, they came for the Catholics. Who’s next?

This weekend, Catholic bishops informed parishioners of the recent White House edict forcing religious hospitals, schools, charities and other health and social service providers to provide “free” abortifacient pills, sterilizations and contraception on demand in their insurance plans — even if it violates their moral consciences and the teachings of their churches.

NARAL, NOW, Ms. Magazine and the Feminist Majority Foundation all cheered the administration’s abuse of the Obamacare law to ram abortion down pro-life medical professionals’ throats. Femme dinosaur Eleanor Smeal gloated over the news that the administration had rejected church officials’ pleas for compromises: “At last,” she exulted, the left’s goal of “no-cost birth control” for all had been achieved.

As always, tolerance is a one-way street in the Age of Obama. “Choice” is in the eye (and iron fist) of the First Amendment usurper.

Like the rising number of states that have revolted against the individual health care mandate at the ballot box and in the courts, targeted Catholics have risen up against the Obamacare regime. Arlington (Va.) Bishop Paul Loverde didn’t mince words, calling the U.S. Department of Health and Human Services order “a direct attack against religious liberty. This ill-considered policy comprises a truly radical break with the liberties that have underpinned our nation since its founding.” Several bishops vowed publicly to fight the mandate.

Bishop Alexander Sample of Marquette, Mich., asserted plainly: “We cannot — we will not — comply with this unjust law.”

It’s not just rabid right-wing politicos defying the Obama machine. Pro-life Democratic Sen. Bob Casey of Pennsylvania denounced the “wrong decision.” Left-leaning Bishop Robert Lynch threatened “civil disobedience” in St. Petersburg, Fla., over the power grab. Lefty Washington Post columnist E.J. Dionne wrote that Obama “botched” the controversy and “threw his progressive Catholic allies under the bus” by refusing to “balance the competing liberty interests here.”

White House press secretary Jay Carney blithely denied on Tuesday that “there are any constitutional rights issues” involved in the brewing battle. Yet, the Shut Up and Hand Out Abortion Pills order undermines a unanimous Supreme Court ruling issued just last week upholding a religious employer’s right to determine whom to hire and fire. And two private colleges have filed federal suits against the government to overturn the unconstitutional abortion coverage decree.

Hannah Smith, senior counsel at the nonprofit law firm The Becket Fund, which is representing the schools, boiled it down for Bloomberg News: “This is not really about access to contraception. The mandate is about forcing these religious groups to pay for it against their beliefs.”

How did we get here? The first salvo came in December 2010, when the American Civil Liberties Union pushed HHS and its Planned Parenthood-championing secretary, Kathleen Sebelius, to force Catholic hospitals to perform abortions in violation of their core moral commitment to protecting the lives of the unborn.

The ACLU called for a litigious fishing expedition against Catholic hospitals nationwide that refuse to provide “emergency” contraception and abortions to women. In their sights: Devout Phoenix Catholic Bishop Thomas Olmsted, who revoked the Catholic status of a rogue hospital that performed several direct abortions, provided birth control pills and presided over sterilizations against the church’s ethical and religious directives for health care.

The ACLU and the feminists have joined with Obama to threaten and sabotage the First Amendment rights of religious-based health care entities. The agenda is not increased “access” to health care services. The ultimate goal is to shut down health care providers — Catholic health care institutions employ about 540,000 full-time workers and 240,000 part-time workers — whose religious views cannot be tolerated by secular zealots and radical social engineers.

Is it any surprise their counterparts in the “Occupy” movement have moved from protesting “Wall Street” to harassing pro-life marchers in Washington, D.C., and hurling condoms at Catholic school girls in Rhode Island? Birds of a lawless, bigoted feather bully together.

Michelle Malkin is the author of “Culture of Corruption: Obama and his Team of Tax Cheats, Crooks & Cronies” (Regnery 2010). Her e-mail address is


‘Don’t waste time, there is no God’

It will remain one of the abiding ironies of the Jaipur Literature Festival: For an event that got the loudest publicity because of the threats of ungodly communal trouble, a number of sessions were dedicated to denouncing god and religion. The most strident among the speakers on the subject, however, was Richard Dawkins, author of the bestseller The God Delusion.

At one of his three sessions at JLF, where he read out from his latest work The Magic of Reality in tandem with his wife and actor Lalla Ward, the British author said, “Religious faith deserves a whole chapter in war technology, alongside tanks and guns… Religiosity usually recedes with the advancement of knowledge. In the fullness of time, we may see the death of all religions.” He followed the broadside with a statement in support of Salman Rushdie, which he said was a modified version of what he had written at the time of the earlier fatwa against the author. Dawkins’s eloquence on the “magic of truth in science” led a member of the audience to accuse the author of “fostering a religion of science”.

It’s a line of counter-attack the 70-year-old Dawkins is familiar with. The evolutionary biologist has been called ‘Darwin’s Rottweiler’ for his passionate defence of the subject in works such as The Selfish Gene, the bestseller from which he read out in another session at the festival. Not long ago, he won a bruising battle against the British government to keep out religious Creationist myths from school science texts.

Off stage, when asked whether he would want to be immortal with the help of science, Dawkins said, “Eternity is a frightening thought whether you are in it or not. I can only go through it under general anesthesia.” He parried a question on why faith and emotion spurred humans to greater efforts – such as love and war – than reason ever could.

Such questions were straight down the line for AC Grayling, whose quote, ‘Religion and science have a common ancestor, ignorance’, features at the top of the quotes page of the Richard Dawkins Foundation website. At an earlier session in Jaipur, ‘In Defence of Enlightenment’, Grayling said, “Earlier, doubt about the existence of god was seen as a sin. This is what changed with the Enlightenment in the 18th century, which taught us scepticism; that humans can be fallible.”

“One of the best-known examples of this divide between emotions and reason was Mr. Spock of Star Trek. He thought he was a poor logician because he was always in love with someone or the other. It’s another matter that he was suspected to be in love with Captain Kirk,” Grayling said to a ripple of laughter. “But seriously, the rights of the individual are not in exclusion to his deep emotional bonds with family or society.”

Cognitive psychologist Steven Pinker, who was on the dais with Grayling, addressed the ‘faith-versus-reason’ question head-on. He told Hindustan Times, “Emotions can be very destructive. And reason even lets you choose the right belief.”

The last throw on the subject, however, was Dawkins’s. At the last session of the festival on Tuesday, a debate on whether man has replaced god, Dawkins said, “You are utterly wasting your time – all of you who are indignant at being attacked about your god – because there is no god.”

In Defense of Superstition


SUPERSTITION is typically a pejorative term. Belief in things like magic and miracles is thought to be irrational and scientifically retrograde. But as studies have repeatedly shown, some level of belief in the supernatural — often a subtle and unconscious belief — appears to be unavoidable, even among skeptics. One study found that a group of seemingly rational Princeton students nonetheless believed that they had influenced the Super Bowl just by watching it on TV. We are all mystics, to a degree.

The good news is that superstitious thought, or “magical thinking,” even as it misrepresents reality, has its advantages. It offers psychological benefits that logic and science can’t always provide: namely, a sense of control and a sense of meaning.

Consider one “law of magic” that people tend to put stock in: the idea that “luck is in your hands,” that you can affect your fate via superstitious rituals like knocking on wood or carrying a lucky charm. We often rely on such rituals when we are anxious or want to perform well, and though they may not directly have their intended magical effects, these rituals produce an illusion of control and enhance self-confidence, which in turn can improve our performance and thus indirectly affect our fate.

For instance, in one study led by the psychologist Lysann Damisch of the University of Cologne, subjects were handed a golf ball, and half of them were told that the ball had been lucky so far. Those subjects with a “lucky” ball drained 35 percent more golf putts than those with a “regular” ball. In another scenario, subjects performed better on memory and word games when armed with a lucky charm. In a more real-world example of this effect, the anthropologist Richard Sosis of the University of Connecticut found that in Israel during the second intifada in the early 2000s, 36 percent of secular women in the town of Tzfat recited psalms in response to the violence. Compared with those who did not recite psalms, he found, those women benefited from reduced anxiety: they felt more comfortable entering crowds, going shopping and riding buses — a result, he concluded, of their increased sense of control.

Another law of magic is “everything happens for a reason” — there is no such thing as randomness or happenstance. This is so-called teleological reasoning, which assumes intentions and goals behind even evidently purposeless entities like hurricanes. As social creatures, we may be biologically tuned to seek evidence of intentionality in the world, so that we can combat or collaborate with whoever did what’s been done. When lacking a visible author, we end up crediting an invisible one — God, karma, destiny, whatever.

This illusion, too, turns out to be psychologically useful. In research led by the psychologist Laura Kray of the University of California, Berkeley, subjects reflected on a turning point in their lives. The more they felt the turning point to have been fated, the more they believed, “It made me who I am today” and, “It gave meaning to my life.” Belief in destiny helps render your life a coherent narrative, which infuses your goals with a greater sense of purpose. This works even when those turning points are harmful: in a study led by the psychologist Kenneth Pargament of Bowling Green State University, students who saw a negative event as “part of God’s plan” showed more growth in its aftermath. They became more open to new perspectives, more intimate in their relationships and more persistent in overcoming challenges.

There are similar laws that govern other popular superstitions, including the belief that objects can carry the “essences” of previous owners (which explains why you might want to own a pen once used by a favorite writer); the belief that symbolic objects can summon what they represent (which explains why you’re scared to cut up a photograph of your mother); and the attribution of consciousness to inanimate objects (which explains why you yell at the laptop that deleted your files). In various ways they all emerge from basic habits of mind, and they all add structure and meaning to a chaotic and absurd universe.

Which isn’t to say magical thinking has no downside. At its worst, it can lead to obsession, fatalism or psychosis. But without it, the existential angst of realizing we’re just impermanent clusters of molecules with no ultimate purpose would overwhelm us.

So to believe in magic — as, on some deep level, we all do — does not make you stupid, ignorant or crazy. It makes you human.

Matthew Hutson is the author of the forthcoming book “The 7 Laws of Magical Thinking: How Irrational Beliefs Keep Us Happy, Healthy, and Sane.”

Learning to Respect Religion


A FEW years ago, God seemed caught in a devil of a fight.

Atheists were firing thunderbolts suggesting that “religion poisons everything,” as Christopher Hitchens put it in the subtitle of his book, “God Is Not Great.” Sam Harris and Richard Dawkins also wrote best sellers that were scathing about God, whom Dawkins denounced as “arguably the most unpleasant character in fiction.”

Yet lately I’ve noticed a very different intellectual tide: grudging admiration for religion as an ethical and cohesive force.

The standard-bearer of this line of thinking — and a provocative text for Easter Sunday — is a new book, “Religion for Atheists,” by Alain de Botton. He argues that atheists have a great deal to learn from religion.

“One can be left cold by the doctrines of the Christian Trinity and the Buddhist Eightfold Path and yet at the same time be interested in the ways in which religions deliver sermons, promote morality, engender a spirit of community, make use of art and architecture, inspire travels, train minds and encourage gratitude at the beauty of spring,” de Botton writes.

“The error of modern atheism has been to overlook how many aspects of the faiths remain relevant even after their central tenets have been dismissed,” he adds, and his book displays an attitude toward religion that is sometimes — dare I say — reverential.

Edward O. Wilson, the eminent Harvard biologist, has a new book, “The Social Conquest of Earth,” that criticizes religion as “stultifying and divisive” — but also argues that religion offered a competitive advantage to early societies. Faith bolstered social order among followers and helped bind a tribe together, he writes, and that is why religion is so widespread today. And he tips his hat to the social role of faith:

“Organized religions preside over the rites of passage, from birth to maturity, from marriage to death,” Wilson writes, adding: “Beliefs in immortality and ultimate divine justice give priceless comfort, and they steel resolution and bravery in difficult times. For millennia, organized religions have been the source of much of the best in the creative arts.”

Jonathan Haidt, a University of Virginia psychology professor, also focuses on the unifying power of faith in his new book, “The Righteous Mind.” Haidt, an atheist since his teens, argues that scientists often misunderstand religion because they home in on individuals rather than on the way faith can bind a community.

Haidt cites research showing that a fear of God may make a society more ethical and harmonious. For example, one study found that people were less likely to cheat if they were first given a puzzle that prompted thoughts of God.

Another study cited by Haidt found that of 200 communes founded in the 19th century, only 6 percent of the secular communes survived two decades, compared with 39 percent of the religious ones. Those that survived longest were those that demanded sacrifices of members, like fasting, daily prayer, abstaining from alcohol or tobacco, or adopting new forms of clothing or hairstyle.

“The very ritual practices that the New Atheists dismiss as costly, inefficient and irrational turn out to be a solution to one of the hardest problems humans face: cooperation without kinship,” Haidt writes.

The latest wave of respectful atheist writing strikes me as a healthy step toward nuance. I’ve reported on some of the worst of religion — such as smug, sanctimonious indifference among Christian fundamentalists at the toll of AIDS among gay men — yet I’ve also been awed by nuns and priests risking their lives in war zones. And many studies have found that religious people donate more money and volunteer more time to charity than the nonreligious. Let’s not answer religious fundamentalism with secular fundamentalism, religious intolerance with irreligious intolerance.

The new wave is skeptical but acknowledges stunning achievements, from Notre Dame Cathedral to networks of soup kitchens run by houses of worship across America. Maybe this new attitude can eventually be the basis for a truce in our religious wars, for a bridge across the “God gulf.” Let us pray …

The Taint of ‘Social Darwinism’

Given the well-known Republican antipathy to evolution, President Obama’s recent description of the Republican budget as an example of “social Darwinism” may be a canny piece of political labeling. In the interests of historical accuracy, however, it should be clearly recognized that “social Darwinism” has very little to do with the ideas developed by Charles Darwin in “On the Origin of Species.” Social Darwinism emerged as a movement in the late 19th century, and has had waves of popularity ever since, but its central ideas owe more to the thought of a luminary of that time, Herbert Spencer, whose writings are (to understate) no longer widely read.

Spencer, who coined the phrase “survival of the fittest,” thought about natural selection on a grand scale. Conceiving selection in pre-Darwinian terms — as a ruthless process, “red in tooth and claw” — he viewed human culture and human societies as progressing through fierce competition. Provided that policymakers do not take foolish steps to protect the weak, those people and those human achievements that are fittest — most beautiful, noble, wise, creative, virtuous, and so forth — will succeed in a fierce competition, so that, over time, humanity and its accomplishments will continually improve. Late 19th-century dynastic capitalists, especially the American “robber barons,” found this vision profoundly congenial. Their contemporary successors like it for much the same reasons, just as some adolescents discover an inspiring reinforcement of their self-image in the writings of Ayn Rand.

Although social Darwinism has often been closely connected with ideas in eugenics (pampering the weak will lead to the “decline of the race”) and with theories of racial superiority (the economic and political dominance of people of North European extraction is a sign that some racial groups are intrinsically better than others), these are not central to the position.

The heart of social Darwinism is a pair of theses: first, people have intrinsic abilities and talents (and, correspondingly, intrinsic weaknesses), which will be expressed in their actions and achievements, independently of the social, economic and cultural environments in which they develop; second, intensifying competition enables the most talented to develop their potential to the full, and thereby to provide resources for a society that make life better for all. It is not entirely implausible to think that doctrines like these stand behind a vast swath of Republican proposals, including the recent budget, with its emphasis on providing greater economic benefits to the rich, transferring the burden to the middle-classes and poor, and especially in its proposals for reducing public services. Fuzzier versions of the theses have pervaded Republican rhetoric for the past decade (and even longer).

There are very good reasons to think both theses are false. Especially in the case of the Republican dynasties of our day, the Bushes and the Romneys, success has been facilitated by all kinds of social structures, by educational opportunities and legal restrictions, that were in place prior to and independently of their personal efforts or achievements. For those born into environments in which silver spoons rarely appear — Barack Obama, for instance — the contributions of the social environment are even more apparent. Without enormous support, access to inspiring teachers and skillful doctors, the backing of self-sacrificing relatives and a broader community, and without a fair bit of luck, the vast majority of people, not only in the United States but throughout the world, would never achieve the things of which they are, in principle, capable. In short, Horatio Alger needs lots of help, and a large thrust of contemporary Republican policy is dedicated to making sure he doesn’t get it.

Second, even if rigorous competition enables the talented — or, better, the lucky — to realize their goals, it is completely unwarranted to suppose that their accomplishments will translate into any increased benefit for the overwhelming majority of those who are less fortunate. The strenuous struggle social Darwinism envisages might select for something, but the most likely traits are a tendency to take whatever steps are necessary to achieve a foreseeable end, a sharp focus on narrowly individual goals and a corresponding disregard for others. We might reasonably expect that a world run on social Darwinist lines would generate a cadre of plutocrats, each resolutely concerned to establish a dynasty and to secure his favored branch of industry against future competition. In practical terms it would almost certainly yield a world in which the gap between rich and poor was even larger than it is now.

Rather than the beauty, wisdom, virtue and nobility Spencer envisioned arising from fierce competition, the likely products would be laws repealing inheritance taxes and deregulating profitable activities, and a vast population of people whose lives were even further diminished.

Yet, even if stimulating competition would achieve greater economic productivity, and even if this would, by some miraculous mechanism, yield a more egalitarian distribution of economic resources (presumably through the provision of more remunerative jobs), these welcome material benefits are not all that is needed. To quote a much-cited book, we do not “live by bread alone.” If the vast majority of citizens (or, globally, of people) are to enjoy any opportunities to develop the talents they have, they need the social structures social Darwinism perceives as pampering and counter-productive. Human well-being is profoundly affected by public goods, a concept that is entirely antithetical to social Darwinism or to contemporary Republican ideology, with their mythical citizens who can fulfill their potential without rich systems of social support. It is a callous fiction to suppose that what is needed is less investment in education, health care, public transportation and affordable public housing.

So long as social Darwinism is disentangled from the ancillary eugenic and racist ideas, so long as it is viewed in its core form of the two theses about the glories of competition, the label President Obama pinned on the Republican budget is completely deserved. Because the central ideas of social Darwinism are equally false and noxious, a commitment to truth in advertising should welcome the label. And all of us, including President Obama and the many people whose less spectacular successes have been enabled by social structures and public goods, should hope that the name leads Darwin-hating conservatives to worry about the Republican budget.

Philip Kitcher

Philip Kitcher is John Dewey Professor of Philosophy at Columbia University. He has written on topics in many fields of philosophy, including the history and philosophy of biology. Among his books are “Living with Darwin,” and, most recently, “The Ethical Project” and “Science in a Democratic Society.”

By Richard Dawkins

Afterword by Richard Dawkins.pdf Afterword by Richard Dawkins.pdf
89K   View   Download

Nothing expands the mind like the expanding universe. The music of the spheres is a nursery rhyme, a jingle to set against the majestic chords of the Symphonie Galactica. Changing the metaphor and the dimension, the dusts of centuries, the mists of what we presume to call “ancient” history, are soon blown off by the steady, eroding winds of geological ages. Even the age of the universe, accurate—so Lawrence Krauss assures us—to the fourth significant figure at 13.72 billion years, is dwarfed by the trillennia that are to come.

But Krauss’s vision of the cosmology of the remote future is paradoxical and frightening. Scientific progress is likely to go into reverse. We naturally think that, if there are cosmologists in the year 2 trillion, their vision of the universe will be expanded over ours. Not so—and this is one of the many shattering conclusions I take away on closing this book. Give or take a few billion years, ours is a very propitious time to be a cosmologist. Two trillion years hence, the universe will have expanded so far that all galaxies but the cosmologist’s own (whichever one it happens to be) will have receded behind an Einsteinian horizon so absolute, so inviolable, that they are not only invisible but beyond all possibility of leaving a trace, however indirect. They might as well never have existed. Every trace of the Big Bang will most likely have gone, forever and beyond recovery. The cosmologists of the future will be cut off from their past, and from their situation, in a way that we are not…



NASA: Two Earth-Size Planets Are Discovered

Posted: December 20, 2011 by PaanLuel Wël in Education, Philosophy, Science

We are like ants running around on an apple, speculating what the world is like. We don’t have enough perspective to even make good guesses–Mike in Chicago

University of Toulouse, via Agence France-Presse — Getty Images

An illustration of two Earth-sized planets orbiting a Sun-like star.


In what amounts to a kind of holiday gift to the cosmos, astronomers working with NASA’s Kepler spacecraft announced Tuesday that they had discovered a pair of planets the size of Earth orbiting a distant star. The new planets, one about as big as Earth and the other slightly smaller than Venus, are the smallest yet found beyond the solar system.

Astronomers said the discovery showed that Kepler could indeed find planets as small as our own and was an encouraging sign that planet hunters would someday succeed in the goal of finding Earth-like abodes in the heavens.

Since the first Jupiter-size exoplanets, as they are known, were discovered nearly 15 years ago, astronomers have been chipping away at the sky, finding smaller and smaller planets.

“We are finally there,” said David Charbonneau, an astronomer at the Harvard-Smithsonian Center for Astrophysics, who was a member of the team that made the observations, led by his colleague Francois Fressin. The team reported its results in an online news conference Tuesday and in a paper being published in the journal Nature.

Dr. Fressin said, “This demonstrates for the first time that Earth-size planets exist around other stars and that we can detect them.”

The announcement doubled the number of known Earth-size planets in the galaxy to four from two — Earth and Venus.

The next major goal in the planetary hunt, astronomers say, is to find an Earth-size planet in the so-called Goldilocks zone of a star, where conditions are temperate for water and thus life. We are not there yet.

The two new planets, Kepler 20e and Kepler 20f, are far outside the Goldilocks zone — so close to the star, termed Kepler 20, that one of them is roasting at up to 1,400 degrees Fahrenheit — and thus unlivable.

Although the milestone of an Earth-size planet had long been anticipated, astronomers on and off the Kepler team were jubilant. Geoffrey Marcy of the University of California, Berkeley, another Kepler team member, called the new result “a watershed moment in human history.”

Debra Fischer, a planet hunter from Yale, who was not part of the team, said, “This technological feat is incredibly important because it means that the detection of Earth-size planets at larger distances is technically possible.”

Kepler 20e, the closer and hotter planet, is also the smaller — about 6,900 miles across, or slightly smaller than Venus — and it resides about 5 million miles from its star. The more distant planet, Kepler 20f, also broiling at around 800 degrees, is 10 million miles out from its star. It is 8,200 miles in diameter, about the size of Earth. The two planets are presumed to be rocky orbs that formed in the outskirts of their planetary system and then migrated inward.
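
For a rough sense of scale, the quoted diameters can be compared directly with Earth’s. The Earth and Venus figures below are standard reference values, not from the article:

```python
# Size comparison of the two new planets against Earth, using the
# diameters quoted in the article (miles). Earth and Venus diameters
# are standard reference values added for comparison.
earth, venus = 7918, 7521            # reference diameters, miles
kepler_20e, kepler_20f = 6900, 8200  # diameters quoted in the article
print(round(kepler_20e / earth, 2))  # 0.87: a bit smaller than Venus (0.95)
print(round(kepler_20f / earth, 2))  # 1.04: about the size of Earth
```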

Their star, which is slightly smaller and cooler than the Sun, is about 950 light years away from us. Kepler had previously found three larger Neptune-like planets around it, so the new observations bring the total to five so far. All the planets are well inside where Mercury would be in our own solar system, presenting a bounteous system of unlivable planets.

“This is Venus and Earth in a five-planet system,” Dr. Fischer said in an e-mail. “There’s no place like home, and the Kepler data are starting to uncover some mighty familiar architectures.”

Kepler detects planets by watching for blinks when they move in front of their stars. Since it was launched in 2009, it has found 2,326 potential planets, 207 of which would be Earth-size if confirmed, as the two reported Tuesday have been.
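
The transit idea can be sketched with a toy light curve. All numbers here are invented for illustration; real Kepler photometry is noisy, and the dips repeat with the planet’s orbital period:

```python
# Toy sketch of the transit method: a star's measured brightness dips
# slightly while a planet crosses its face. An Earth-size planet
# crossing a Sun-like star dims it by roughly 0.01 percent.
flux = [1.0] * 20               # normalized brightness samples (invented)
for i in range(8, 12):          # the planet transits during these samples
    flux[i] = 0.9999            # ~0.01% dip in brightness
baseline = sum(flux[:8]) / 8    # out-of-transit brightness
dips = [i for i, f in enumerate(flux) if f < baseline - 5e-5]
print(dips)  # [8, 9, 10, 11]
```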

Confirmation of a planet, however, requires additional observations, usually of its star’s wobbles as it gets tugged by the planet going around. The gravitational pull of planets as small as the Earth on their parent star is too small to measure with the current spectrographs. And so the astronomers resorted to a statistical method called Blender, developed by Dr. Fressin and Guillermo Torres of the Harvard-Smithsonian Center, in which millions of computer simulations of background stars try to mimic the Kepler signal. They concluded that Kepler 20e was 3,400 times more likely to be a planet than background noise, while the odds in favor of Kepler 20f being real were 1,370 to 1.

Confirmed (or validated, as the Kepler team likes to say), they join the other planets already known to orbit the star. In a surprise for astronomers who thought they knew how planetary systems form, the orbits of the new planets are sandwiched between the orbits of the older, bigger, gassier ones, a configuration that does not occur in our own solar system.

In an e-mail, Dr. Charbonneau noted: “In the solar system, rocky worlds and gas giants don’t mingle. But in the Kepler 20 system they apparently do.”

The accidental universe: Science’s crisis of faith

By Alan P. Lightman

Alan Lightman, a physicist and novelist, teaches at MIT. His new book, Mr g: A Novel About the Creation, will be published in January by Pantheon.

In the fifth century B.C., the philosopher Democritus proposed that all matter was made of tiny and indivisible atoms, which came in various sizes and textures—some hard and some soft, some smooth and some thorny. The atoms themselves were taken as givens. In the nineteenth century, scientists discovered that the chemical properties of atoms repeat periodically (and created the periodic table to reflect this fact), but the origins of such patterns remained mysterious. It wasn’t until the twentieth century that scientists learned that the properties of an atom are determined by the number and placement of its electrons, the subatomic particles that orbit its nucleus. And we now know that all atoms heavier than helium were created in the nuclear furnaces of stars.

The history of science can be viewed as the recasting of phenomena that were once thought to be accidents as phenomena that can be understood in terms of fundamental causes and principles. One can add to the list of the fully explained: the hue of the sky, the orbits of planets, the angle of the wake of a boat moving through a lake, the six-sided patterns of snowflakes, the weight of a flying bustard, the temperature of boiling water, the size of raindrops, the circular shape of the sun. All these phenomena and many more, once thought to have been fixed at the beginning of time or to be the result of random events thereafter, have been explained as necessary consequences of the fundamental laws of nature—laws discovered by human beings.

This long and appealing trend may be coming to an end. Dramatic developments in cosmological findings and thought have led some of the world’s premier physicists to propose that our universe is only one of an enormous number of universes with wildly varying properties, and that some of the most basic features of our particular universe are indeed mere accidents—a random throw of the cosmic dice. In which case, there is no hope of ever explaining our universe’s features in terms of fundamental causes and principles.

It is perhaps impossible to say how far apart the different universes may be, or whether they exist simultaneously in time. Some may have stars and galaxies like ours. Some may not. Some may be finite in size. Some may be infinite. Physicists call the totality of universes the “multiverse.” Alan Guth, a pioneer in cosmological thought, says that “the multiple-universe idea severely limits our hopes to understand the world from fundamental principles.” And the philosophical ethos of science is torn from its roots. As put to me recently by Nobel Prize–winning physicist Steven Weinberg, a man as careful in his words as in his mathematical calculations, “We now find ourselves at a historic fork in the road we travel to understand the laws of nature. If the multiverse idea is correct, the style of fundamental physics will be radically changed.”

The scientists most distressed by Weinberg’s “fork in the road” are theoretical physicists. Theoretical physics is the deepest and purest branch of science. It is the outpost of science closest to philosophy, and religion. Experimental scientists occupy themselves with observing and measuring the cosmos, finding out what stuff exists, no matter how strange that stuff may be. Theoretical physicists, on the other hand, are not satisfied with observing the universe. They want to know why. They want to explain all the properties of the universe in terms of a few fundamental principles and parameters. These fundamental principles, in turn, lead to the “laws of nature,” which govern the behavior of all matter and energy. An example of a fundamental principle in physics, first proposed by Galileo in 1632 and extended by Einstein in 1905, is the following: All observers traveling at constant velocity relative to one another should witness identical laws of nature. From this principle, Einstein derived his theory of special relativity. An example of a fundamental parameter is the mass of an electron, considered one of the two dozen or so “elementary” particles of nature. As far as physicists are concerned, the fewer the fundamental principles and parameters, the better. The underlying hope and belief of this enterprise has always been that these basic principles are so restrictive that only one, self-consistent universe is possible, like a crossword puzzle with only one solution. That one universe would be, of course, the universe we live in. Theoretical physicists are Platonists. Until the past few years, they agreed that the entire universe, the one universe, is generated from a few mathematical truths and principles of symmetry, perhaps throwing in a handful of parameters like the mass of the electron. It seemed that we were closing in on a vision of our universe in which everything could be calculated, predicted, and understood.

However, two theories in physics, eternal inflation and string theory, now suggest that the same fundamental principles from which the laws of nature derive may lead to many different self-consistent universes, with many different properties. It is as if you walked into a shoe store, had your feet measured, and found that a size 5 would fit you, a size 8 would also fit, and a size 12 would fit equally well. Such wishy-washy results make theoretical physicists extremely unhappy. Evidently, the fundamental laws of nature do not pin down a single and unique universe. According to the current thinking of many physicists, we are living in one of a vast number of universes. We are living in an accidental universe. We are living in a universe uncalculable by science.

“Back in the 1970s and 1980s,” says Alan Guth, “the feeling was that we were so smart, we almost had everything figured out.” What physicists had figured out were very accurate theories of three of the four fundamental forces of nature: the strong nuclear force that binds atomic nuclei together, the weak force that is responsible for some forms of radioactive decay, and the electromagnetic force between electrically charged particles. And there were prospects for merging the theory known as quantum physics with Einstein’s theory of the fourth force, gravity, and thus pulling all of them into the fold of what physicists called the Theory of Everything, or the Final Theory. These theories of the 1970s and 1980s required the specification of a couple dozen parameters corresponding to the masses of the elementary particles, and another half dozen or so parameters corresponding to the strengths of the fundamental forces. The next step would then have been to derive most of the elementary particle masses in terms of one or two fundamental masses and define the strengths of all the fundamental forces in terms of a single fundamental force.

There were good reasons to think that physicists were poised to take this next step. Indeed, since the time of Galileo, physics has been extremely successful in discovering principles and laws that have fewer and fewer free parameters and that are also in close agreement with the observed facts of the world. For example, the observed rotation of the ellipse of the orbit of Mercury, 0.012 degrees per century, was successfully calculated using the theory of general relativity, and the observed magnetic strength of an electron, 2.002319 magnetons, was derived using the theory of quantum electrodynamics. More than any other science, physics brims with highly accurate agreements between theory and experiment.
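
The first figure is easy to check against the value usually cited for general relativity, about 43 arcseconds per century:

```python
# Convert the quoted perihelion advance of Mercury from degrees to
# arcseconds per century, to compare with the ~43 arcseconds per
# century usually quoted for general relativity.
deg_per_century = 0.012
arcsec_per_century = deg_per_century * 3600  # 3600 arcseconds in a degree
print(round(arcsec_per_century, 1))  # 43.2
```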

Guth started his physics career in this sunny scientific world. Now sixty-four years old and a professor at MIT, he was in his early thirties when he proposed a major revision to the Big Bang theory, something called inflation. We now have a great deal of evidence suggesting that our universe began as a nugget of extremely high density and temperature about 14 billion years ago and has been expanding, thinning out, and cooling ever since. The theory of inflation proposes that when our universe was only about a trillionth of a trillionth of a trillionth of a second old, a peculiar type of energy caused the cosmos to expand very rapidly. A tiny fraction of a second later, the universe returned to the more leisurely rate of expansion of the standard Big Bang model. Inflation solved a number of outstanding problems in cosmology, such as why the universe appears so homogeneous on large scales.

When I visited Guth in his third-floor office at MIT one cool day in May, I could barely see him above the stacks of paper and empty Diet Coke bottles on his desk. More piles of paper and dozens of magazines littered the floor. In fact, a few years ago Guth won a contest sponsored by the Boston Globe for the messiest office in the city. The prize was the services of a professional organizer for one day. “She was actually more a nuisance than a help. She took piles of envelopes from the floor and began sorting them according to size.” He wears aviator-style eyeglasses, keeps his hair long, and chain-drinks Diet Cokes. “The reason I went into theoretical physics,” Guth tells me, “is that I liked the idea that we could understand everything—i.e., the universe—in terms of mathematics and logic.” He gives a bitter laugh. We have been talking about the multiverse.

While challenging the Platonic dream of theoretical physicists, the multiverse idea does explain one aspect of our universe that has unsettled some scientists for years: according to various calculations, if the values of some of the fundamental parameters of our universe were a little larger or a little smaller, life could not have arisen. For example, if the nuclear force were a few percentage points stronger than it actually is, then all the hydrogen atoms in the infant universe would have fused with other hydrogen atoms to make helium, and there would be no hydrogen left. No hydrogen means no water. Although we are far from certain about what conditions are necessary for life, most biologists believe that water is necessary. On the other hand, if the nuclear force were substantially weaker than what it actually is, then the complex atoms needed for biology could not hold together. As another example, if the relationship between the strengths of the gravitational force and the electromagnetic force were not close to what it is, then the cosmos would not harbor any stars that explode and spew out life-supporting chemical elements into space or any other stars that form planets. Both kinds of stars are required for the emergence of life. The strengths of the basic forces and certain other fundamental parameters in our universe appear to be “fine-tuned” to allow the existence of life. The recognition of this fine-tuning led British physicist Brandon Carter to articulate what he called the anthropic principle, which states that the universe must have the parameters it does because we are here to observe it. Actually, the word anthropic, from the Greek for “man,” is a misnomer: if these fundamental parameters were much different from what they are, it is not only human beings who would not exist. No life of any kind would exist.

If such conclusions are correct, the great question, of course, is why these fundamental parameters happen to lie within the range needed for life. Does the universe care about life? Intelligent design is one answer. Indeed, a fair number of theologians, philosophers, and even some scientists have used fine-tuning and the anthropic principle as evidence of the existence of God. For example, at the 2011 Christian Scholars’ Conference at Pepperdine University, Francis Collins, a leading geneticist and director of the National Institutes of Health, said, “To get our universe, with all of its potential for complexities or any kind of potential for any kind of life-form, everything has to be precisely defined on this knife edge of improbability…. [Y]ou have to see the hands of a creator who set the parameters to be just so because the creator was interested in something a little more complicated than random particles.”

Intelligent design, however, is an answer to fine-tuning that does not appeal to most scientists. The multiverse offers another explanation. If there are countless different universes with different properties—for example, some with nuclear forces much stronger than in our universe and some with nuclear forces much weaker—then some of those universes will allow the emergence of life and some will not. Some of those universes will be dead, lifeless hulks of matter and energy, and others will permit the emergence of cells, plants and animals, minds. From the huge range of possible universes predicted by the theories, the fraction of universes with life is undoubtedly small. But that doesn’t matter. We live in one of the universes that permits life because otherwise we wouldn’t be here to ask the question.

The explanation is similar to the explanation of why we happen to live on a planet that has so many nice things for our comfortable existence: oxygen, water, a temperature between the freezing and boiling points of water, and so on. Is this happy coincidence just good luck, or an act of Providence, or what? No, it is simply that we could not live on planets without such properties. Many other planets exist that are not so hospitable to life, such as Uranus, where the temperature is –371 degrees Fahrenheit, and Venus, where it rains sulfuric acid.

The multiverse offers an explanation to the fine-tuning conundrum that does not require the presence of a Designer. As Steven Weinberg says: “Over many centuries science has weakened the hold of religion, not by disproving the existence of God but by invalidating arguments for God based on what we observe in the natural world. The multiverse idea offers an explanation of why we find ourselves in a universe favorable to life that does not rely on the benevolence of a creator, and so if correct will leave still less support for religion.”

Some physicists remain skeptical of the anthropic principle and the reliance on multiple universes to explain the values of the fundamental parameters of physics. Others, such as Weinberg and Guth, have reluctantly accepted the anthropic principle and the multiverse idea as together providing the best possible explanation for the observed facts.

If the multiverse idea is correct, then the historic mission of physics to explain all the properties of our universe in terms of fundamental principles—to explain why the properties of our universe must necessarily be what they are—is futile, a beautiful philosophical dream that simply isn’t true. Our universe is what it is because we are here. The situation could be likened to a school of intelligent fish who one day began wondering why their world is completely filled with water. Many of the fish, the theorists, hope to prove that the entire cosmos necessarily has to be filled with water. For years, they put their minds to the task but can never quite seem to prove their assertion. Then, a wizened group of fish postulates that maybe they are fooling themselves. Maybe there are, they suggest, many other worlds, some of them completely dry, and everything in between.

The most striking example of fine-tuning, and one that practically demands the multiverse to explain it, is the unexpected detection of what scientists call dark energy. Little more than a decade ago, using robotic telescopes in Arizona, Chile, Hawaii, and outer space that can comb through nearly a million galaxies a night, astronomers discovered that the expansion of the universe is accelerating. As mentioned previously, it has been known since the late 1920s that the universe is expanding; it’s a central feature of the Big Bang model. Orthodox cosmological thought held that the expansion is slowing down. After all, gravity is an attractive force; it pulls masses closer together. So it was quite a surprise in 1998 when two teams of astronomers announced that some unknown force appears to be jamming its foot down on the cosmic accelerator pedal. The expansion is speeding up. Galaxies are flying away from each other as if repelled by antigravity. Says Robert Kirshner, one of the team members who made the discovery: “This is not your father’s universe.” (In October, members of both teams were awarded the Nobel Prize in Physics.)

Physicists have named the energy associated with this cosmological force dark energy. No one knows what it is. Not only invisible, dark energy apparently hides out in empty space. Yet, based on our observations of the accelerating rate of expansion, dark energy constitutes a whopping three quarters of the total energy of the universe. It is the invisible elephant in the room of science.

The amount of dark energy, or more precisely the amount of dark energy in every cubic centimeter of space, has been calculated to be about one hundred-millionth (10⁻⁸) of an erg per cubic centimeter. (For comparison, a penny dropped from waist-high hits the floor with an energy of about three hundred thousand—that is, 3 × 10⁵—ergs.) This may not seem like much, but it adds up in the vast volumes of outer space. Astronomers were able to determine this number by measuring the rate of expansion of the universe at different epochs—if the universe is accelerating, then its rate of expansion was slower in the past. From the amount of acceleration, astronomers can calculate the amount of dark energy in the universe.
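
The penny comparison can be checked with a back-of-the-envelope calculation in cgs units, assuming a 2.5-gram US penny and a waist height of about one meter:

```python
# Rough check of the penny figure in cgs units (ergs = g·cm²/s²),
# assuming a 2.5 g penny dropped from about waist height (~1 m).
m = 2.5        # grams (assumed penny mass)
g = 980.0      # cm/s², gravitational acceleration
h = 100.0      # cm, roughly waist height (assumed)
energy_ergs = m * g * h
print(f"{energy_ergs:.0f} ergs")  # 245000 ergs, same order as the ~3 × 10⁵ quoted
```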

Theoretical physicists have several hypotheses about the identity of dark energy. It may be the energy of ghostly subatomic particles that can briefly appear out of nothing before self-annihilating and slipping back into the vacuum. According to quantum physics, empty space is a pandemonium of subatomic particles rushing about and then vanishing before they can be seen. Dark energy may also be associated with an as-yet-unobserved force field called the Higgs field, which is sometimes invoked to explain why certain kinds of matter have mass. (Theoretical physicists ponder things that other people do not.) And in the models proposed by string theory, dark energy may be associated with the way in which extra dimensions of space—beyond the usual length, width, and breadth—get compressed down to sizes much smaller than atoms, so that we do not notice them.

These various hypotheses give a fantastically large range for the theoretically possible amounts of dark energy in a universe, from something like 10¹¹⁵ ergs per cubic centimeter to –10¹¹⁵ ergs per cubic centimeter. (A negative value for dark energy would mean that it acts to decelerate the universe, in contrast to what is observed.) Thus, in absolute magnitude, the amount of dark energy actually present in our universe is either very, very small or very, very large compared with what it could be. This fact alone is surprising. If the theoretically possible positive values for dark energy were marked out on a ruler stretching from here to the sun, with zero at one end of the ruler and 10¹¹⁵ ergs per cubic centimeter at the other end, the value of dark energy actually found in our universe (10⁻⁸ ergs per cubic centimeter) would be closer to the zero end than the width of an atom.
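
The ruler claim can be verified with the essay’s own numbers, taking the Earth–Sun distance to be about 1.5 × 10¹³ centimeters and an atom’s width to be about 10⁻⁸ centimeters (both standard reference figures, not from the text):

```python
# Where on a here-to-the-sun ruler does the observed dark-energy
# density fall, if the far end marks the largest theoretical value?
ruler_cm  = 1.5e13   # Earth-Sun distance in cm (assumed reference value)
top_value = 1e115    # erg/cm³ at the far end of the ruler
actual    = 1e-8     # erg/cm³, the observed dark-energy density
position_cm = ruler_cm * (actual / top_value)
print(position_cm)          # ≈ 1.5e-110 cm
print(position_cm < 1e-8)   # True: far less than an atom's width
```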

On one thing most physicists agree: If the amount of dark energy in our universe were only a little bit different than what it actually is, then life could never have emerged. A little more and the universe would accelerate so rapidly that the matter in the young cosmos could never pull itself together to form stars and thence form the complex atoms made in stars. And, going into negative values of dark energy, a little less and the universe would decelerate so rapidly that it would recollapse before there was time to form even the simplest atoms.

Here we have a clear example of fine-tuning: out of all the possible amounts of dark energy that our universe might have, the actual amount lies in the tiny sliver of the range that allows life. There is little argument on this point. It does not depend on assumptions about whether we need liquid water for life or oxygen or particular biochemistries. As before, one is compelled to ask the question: Why does such fine-tuning occur? And the answer many physicists now believe: The multiverse. A vast number of universes may exist, with many different values of the amount of dark energy. Our particular universe is one of the universes with a small value, permitting the emergence of life. We are here, so our universe must be such a universe. We are an accident. From the cosmic lottery hat containing zillions of universes, we happened to draw a universe that allowed life. But then again, if we had not drawn such a ticket, we would not be here to ponder the odds.

The concept of the multiverse is compelling not only because it explains the problem of fine-tuning. As I mentioned earlier, the possibility of the multiverse is actually predicted by modern theories of physics. One such theory, called eternal inflation, is a revision of Guth’s inflation theory developed by Andrei Linde, Paul Steinhardt, and Alex Vilenkin in the early and mid-1980s. In regular inflation theory, the very rapid expansion of the infant universe is caused by an energy field, like dark energy, that is temporarily trapped in a condition that does not represent the lowest possible energy for the universe as a whole—like a marble sitting in a small dent on a table. The marble can stay there, but if it is jostled it will roll out of the dent, roll across the table, and then fall to the floor (which represents the lowest possible energy level). In the theory of eternal inflation, the dark energy field has many different values at different points of space, analogous to lots of marbles sitting in lots of dents on the cosmic table. Moreover, as space expands rapidly, the number of marbles increases. Each of these marbles is jostled by the random processes inherent in quantum mechanics, and some of the marbles will begin rolling across the table and onto the floor. Each marble starts a new Big Bang, essentially a new universe. Thus, the original, rapidly expanding universe spawns a multitude of new universes, in a never-ending process.

String theory, too, predicts the possibility of the multiverse. Originally conceived in the late 1960s as a theory of the strong nuclear force but soon enlarged far beyond that ambition, string theory postulates that the smallest constituents of matter are not subatomic particles like the electron but extremely tiny one-dimensional “strings” of energy. These elemental strings can vibrate at different frequencies, like the strings of a violin, and the different modes of vibration correspond to different fundamental particles and forces. String theories typically require seven dimensions of space in addition to the usual three, which are compacted down to such small sizes that we never experience them, like a three-dimensional garden hose that appears as a one-dimensional line when seen from a great distance. There are, in fact, a vast number of ways that the extra dimensions in string theory can be folded up, and each of the different ways corresponds to a different universe with different physical properties.

It was originally hoped that from a theory of these strings, with very few additional parameters, physicists would be able to explain all the forces and particles of nature—all of reality would be a manifestation of the vibrations of elemental strings. String theory would then be the ultimate realization of the Platonic ideal of a fully explicable cosmos. In the past few years, however, physicists have discovered that string theory predicts not a unique universe but a huge number of possible universes with different properties. It has been estimated that the “string landscape” contains 10⁵⁰⁰ different possible universes. For all practical purposes, that number is infinite.

It is important to point out that neither eternal inflation nor string theory has anywhere near the experimental support of many previous theories in physics, such as special relativity or quantum electrodynamics, mentioned earlier. Eternal inflation or string theory, or both, could turn out to be wrong. However, some of the world’s leading physicists have devoted their careers to the study of these two theories.

Back to the intelligent fish. The wizened old fish conjecture that there are many other worlds, some with dry land and some with water. Some of the fish grudgingly accept this explanation. Some feel relieved. Some feel like their lifelong ruminations have been pointless. And some remain deeply concerned. Because there is no way they can prove this conjecture. That same uncertainty disturbs many physicists who are adjusting to the idea of the multiverse. Not only must we accept that basic properties of our universe are accidental and uncalculable. In addition, we must believe in the existence of many other universes. But we have no conceivable way of observing these other universes and cannot prove their existence. Thus, to explain what we see in the world and in our mental deductions, we must believe in what we cannot prove.

Sound familiar? Theologians are accustomed to taking some beliefs on faith. Scientists are not. All we can do is hope that the same theories that predict the multiverse also produce many other predictions that we can test here in our own universe. But the other universes themselves will almost certainly remain a conjecture.

“We had a lot more confidence in our intuition before the discovery of dark energy and the multiverse idea,” says Guth. “There will still be a lot for us to understand, but we will miss out on the fun of figuring everything out from first principles.”

One wonders whether a young Alan Guth, considering a career in science today, would choose theoretical physics.

Stephen Hawking: Space Exploration Crucial To Human Survival

Posted: November 20, 2011 by PaanLuel Wël in Science

TORONTO – Stephen Hawking says the colonization of outer space is key to the survival of humankind, predicting it will be difficult for the world’s inhabitants “to avoid disaster in the next hundred years.”


The renowned astrophysicist explores some of the most remarkable advancements in technology and health with the new U.K.-Canadian series “Brave New World With Stephen Hawking,” debuting Saturday on Discovery World HD.

Before its premiere, he discussed the earth’s most pressing concerns in an email interview with The Canadian Press from Cambridge, England, declaring space exploration to be humankind’s most urgent mission.

“We are entering an increasingly dangerous period of our history,” said Hawking, who has Lou Gehrig’s disease, leaving him almost completely paralyzed and unable to speak.

“Our population and our use of the finite resources of planet Earth are growing exponentially, along with our technical ability to change the environment for good or ill. But our genetic code still carries the selfish and aggressive instincts that were of survival advantage in the past. It will be difficult enough to avoid disaster in the next hundred years, let alone the next thousand or million.

“Our only chance of long-term survival is not to remain lurking on planet Earth, but to spread out into space.”

Hawking said this is why he favours manned — or as he puts it, “personed” — space flight and encourages further study into how to make space colonization possible.

Hawking’s five-part TV series touches on that theme, while putting the spotlight on scientific breakthroughs that promise to transform the 21st century. He introduces each episode while a team of experts travel the globe to delve deeper into various innovations.

The experts themselves represent a wide range of disciplines — they include naturalist Sir David Attenborough, author and evolutionary biologist Richard Dawkins, biologist and broadcaster Aarathi Prasad, and Canadian astronaut and neurologist Roberta Bondar.

More Canadian content comes by way of a segment set at the SNOLAB in Sudbury, Ont., an underground science lab specializing in neutrino and dark matter physics.

By email, Hawking says he’s excited by work underway at the Perimeter Institute in Waterloo, Ont., which he visited in June 2010 and where he was named its first distinguished research chair.

“Perimeter is a grand experiment in theoretical physics and the institute’s twin focus, on quantum theory and gravity, is very close to my heart and central to explaining the origin of the universe,” said Hawking, also director of research at the Centre for Theoretical Cosmology at Cambridge University.

“I am hoping, and expecting, great things will happen there. And I hope to visit again soon.”

In September, the institute expanded with a new wing called the Stephen Hawking Centre, but the cosmologist was unable to attend the opening in person and sent his regards by video.

Marvels featured in his new TV series include a computer in Switzerland that is powered by the brain, a driverless car that is smart enough to navigate the crooked streets of San Francisco and a baby-like robot in Italy that learns like a child.

Later episodes investigate the way brain disorders could be treated using laser light and genetically modified brain cells, how mobile phones can give experts access to our every habit and action, and lasers that print objects in 3D.

“I have so much I want to do,” Hawking says of his boundless curiosity about the world. “There are so many questions still to answer.”

“Brave New World With Stephen Hawking” debuts Saturday on Discovery World HD.

So, we need to get off the earth, like yesterday. What are the options?

U.K. Pledges $31 Million to Help Wipe Out Guinea Worm Disease

Posted: October 7, 2011 by PaanLuel Wël in Science

By Betsy McKay

The British government has pledged about $31 million to help eradicate guinea worm disease, a donation that public-health experts say will bring them close to finishing the job.

A quarter century ago, the crippling parasitic infection afflicted 3.5 million people a year in more than 20 countries. This year, there are expected to be just over 1,000 cases in four African countries. More than 98% of those cases are in South Sudan, with a few dozen in Ethiopia, Mali, and Chad.

Guinea worm disease is passed along when people drink water from sources containing water fleas that harbor guinea worm larvae. Once inside a human, the larvae spawn worms that can reach three feet in length. The worms incubate for a year and then emerge slowly through painful lesions. When people soak their lesion-covered limbs in water, the worms release larvae, starting the cycle all over again.

The 25-year-long push to eradicate guinea worm is championed by former U.S. President Jimmy Carter, whose Carter Center in Atlanta has led the effort. The donation from the U.K. Department for International Development will be made over four years to the Carter Center.

According to the center, the best way to eliminate the disease is to “prevent people from entering sources of drinking water with an emerging guinea worm and to educate households to always use household or pipe filters to sieve out tiny water fleas carrying infective larvae.”

Donald Hopkins, vice president for health programs for the Carter Center, said $275 million, donated by several governments, has been spent so far wiping out the disease. The U.K. donation will go toward the $75 million the Carter Center estimates is needed to get the job done and to verify eradication.

“We’re very close,” says Hopkins, who has been working on guinea worm eradication since 1980. “This is going to happen. I can’t predict when, but it will be soon.” The Carter Center’s goal is to break the cycle of disease transmission in South Sudan next year, with no cases reported in 2013, he says. It would take three years of no cases to certify that the disease has been wiped out.

The donation comes as the U.K. is growing foreign-aid donations while implementing belt-tightening elsewhere, said Annabelle Malins, British Consul General in Atlanta. “We hope this will be a major tipping point to provide for the full funding requirement” for guinea worm eradication, she said.

Guinea worm disease would be the second human disease to be eradicated after smallpox, and the first to be wiped out without a vaccine or medical treatment. The disease hurts local agriculture in particular as it cripples workers temporarily during planting or harvests.

Image: Associated Press

UK gives £20m to global war on guinea worm

By Charlie Cooper

Thursday, 6 October 2011

The fight to eradicate the gruesome and debilitating “guinea worm” disease, which would make it only the second disease in the world to be wiped out after smallpox, is on the verge of success after securing £20m in funding from the Government.

Guinea worm afflicted 3.5 million people across 21 countries in 1986, but thanks to a campaign launched that year by former US President, Jimmy Carter, it is now confined to South Sudan, Ethiopia and Mali, afflicting only 1,797 people last year.

The disease is contracted by drinking water contaminated with microscopic worm larvae, which grow up to a metre long and emerge about a year later from the afflicted person’s body through a blister in the skin.

Britain has now become the first state donor to fund the campaign, a move that could provoke the wrath of many on the right of the Conservative Party, who have privately expressed concern that the Government is spending too much on foreign aid. There is no known cure or vaccine, but aid efforts have focused on providing drinking-water filters and educating vulnerable populations about the dangers of drinking contaminated water.

The disease is usually non-fatal but causes extreme pain and leaves sufferers bedridden for weeks or months. If the eradication drive is successful, it will follow smallpox into history and the species that causes it will be declared extinct.

Jimmy Carter paid tribute to the UK’s “willingness and staying power” in supporting his campaign, which hopes to achieve its goal by 2015, and called on other donors to “match the UK’s efforts”. The funding pledge from the Department for International Development (DFID) is dependent on other donors providing the additional £40m needed to achieve the Carter Foundation’s goals.

Dr John Hardman, president of the Carter Foundation, praised the DFID for leading the developed world on international aid.

“We have had a strong partnership with DFID for years and to hear about this additional grant was music to our ears,” he said. “DFID exemplify how we can form partnerships to attack challenging problems and diseases in the developing world.”

The disease by numbers

99.95% The fall in sufferers from guinea worm disease over the past 25 years.

£60m The total amount of money the Carter Centre believes is needed to eradicate the disease forever.

£950m The amount of the DFID’s annual £8.1bn budget spent on health projects.
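The headline figure above can be checked directly from the case counts quoted in these reports (roughly 3.5 million sufferers 25 years ago versus 1,797 last year). A quick arithmetic sketch in Python:

```python
# Sanity check of the reduction figure, using the numbers quoted
# in these reports: ~3.5 million cases 25 years ago, 1,797 last year.
cases_then = 3_500_000
cases_now = 1_797

reduction = 1 - cases_now / cases_then
print(f"Fall in sufferers: {reduction:.2%}")  # Fall in sufferers: 99.95%
```

The result rounds to 99.95%, matching the figure given here.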

Jimmy Carter asks for cash to wipe out guinea worm


Former U.S. President Jimmy Carter is appealing for other donors to join Britain in a multi-million dollar campaign to wipe out guinea worm, a crippling and painful parasitic disease that now exists only in four African countries.

At a press briefing in London on Wednesday, British officials are expected to pledge 20 million pounds (US$31 million) over four years to the cause — but only if other donors also open their wallets.

The global campaign to eradicate guinea worm started in 1980, when there were about 3.5 million cases of the disease, also known as dracunculiasis, every year across Africa and Asia.

Since then, cases have dropped by more than 99 percent, but the disease remains a problem in South Sudan, Ethiopia, Mali and Chad. Last year, there were 1,797 cases.

The Carter Center and partners, including the World Health Organization and the U.S. Centers for Disease Control and Prevention, aim to get rid of guinea worm disease by 2015.

There is no treatment or cure; the disease is eliminated by stopping people from drinking dirty water and by preventing infected people from wading into water and spreading the disease. Health campaigns that focus on changing behavior are often more difficult to implement than those that rely on medicines or vaccines.

Smallpox is the only disease in history to have been eradicated; an effort to get rid of polio is ongoing.

People get infected with guinea worm when they drink water infected with the larvae of the parasite.

About a year after someone is infected, the spaghetti-like worm, which can grow up to 1 meter in length, bursts out of their foot. That painful process can take months, often leaves the patient bedridden, and involves winding the worm around a stick so it doesn’t break.

Guinea worm disease “prevents people from escaping poverty,” Carter said in a statement. “I welcome the challenge laid down by the British government. I call on other donors to match their efforts.”

Efforts to end worm disease get British boost
October 5th, 2011
01:49 PM ET

Britain will back a final push to wipe out a debilitating parasitic worm disease that is on the verge of worldwide eradication.

Former President Jimmy Carter, the World Health Organization’s director-general Margaret Chan and British officials announced in London on Wednesday a new campaign to rid the world of the Guinea worm, which would make it the second disease ever to be eradicated.

The British government pledged about $30 million in eradication efforts. International Development Minister Stephen O’Brien and Carter emphasized the need for donors to match the funds to get rid of the guinea worm.

“The eradication of guinea worm is within our sights,” O’Brien said.  “But it does still remain unfinished business, mainly for the poorest people in remote regions of the remaining four endemic countries where the worm persists.”

The first disease to be wiped off the earth was smallpox, which was eliminated through vaccines.

Unlike smallpox, Guinea worm disease is not fatal. But there is no treatment for it and no vaccine to prevent infection, according to the Centers for Disease Control and Prevention. The disease can, however, cause permanent disabilities, crippling people’s livelihoods and local economies.

The key to eradicating the disease is access to clean water and changes in people’s behavior because the parasitic Guinea worm lives in stagnant water.  When a person drinks the contaminated water, the worm grows inside its human host for a year until it emerges through the skin, causing great pain and in some cases, infections. The worm has been described in the Bible and Ancient Egyptian and Greek texts.


Today, the worm is far less pervasive.  Statistics from 2010 show that 1,797 cases remain in the world, in four countries: Ethiopia, Mali, Chad and mostly South Sudan.

“For most of the world, this is an invisible worm – out of sight, out of mind, because it affects the poorest of the poor, people living in remote, rural areas,” said Chan from the WHO.

Carter commended the British government for “its willingness and staying power to help eradicate this debilitating disease,” and called on donors to match their efforts.  The goal is to stop the transmission of the guinea worm before 2015.

Unlike diseases like HIV/AIDS, tuberculosis and malaria, guinea worm is a little known disease.  The Carter Center, based in Atlanta, Georgia, has led public health efforts tackling neglected diseases most Americans have never heard of.

“We have a policy at our center of undertaking difficult projects, quite often which no one else wants to adopt,” Carter said during the press conference.  “Perhaps one of the most vivid examples of this has been guinea worm.”

Since 1986, the center’s efforts have focused on health education, training of health workers and village volunteers who monitor and treat patients.  The center has also supplied simple tools for clean drinking water and village-based education on avoiding the disease.

The greatest threat remains in the world’s newest country, South Sudan, which has about 6,000 villages under surveillance by 12,000 health volunteers.

Calling the remaining cases “unfinished business,” O’Brien said health officials had reason for cautious optimism. “We know the final mile can often be the longest part of the journey.”

Jimmy Carter spearheads final drive to eradicate guinea worm disease

£60m needed to finish the job and wipe crippling condition from the planet


A guinea worm is extracted by a health worker from a child’s foot in Savelugu, Ghana. Photograph: Olivier Asselin/AP

The world is tantalisingly close to eradicating guinea worm disease, which would make it only the second disease of humans to be wiped from the planet, according to former US president Jimmy Carter.

Speaking in London alongside World Health Organisation director general Dr Margaret Chan, Carter, who has led the fight against the disease, said that around £60m more was needed to finish the job.

Since the Carter Centre took up the cause in 1986, almost every nation had eradicated the crippling and painful disease, said the former president. “It is likely by the end of this year we will have guinea worm in only one country – the newest one on earth – South Sudan,” he added.

In 1995 Carter personally negotiated a six-month ceasefire between northern and southern Sudan, in a successful attempt to reach remote villages where guinea worm larvae infest drinking water, causing immense suffering to some of the poorest men, women and children on earth.

“The Carter Centre’s programme is designed to go into the places where the needs are greatest and quite often where the needs are neglected by others,” said the former president. “We couldn’t get into southern Sudan because of the war.”

In 1995 the leaders of north and south agreed the longest-ever ceasefire in the conflict, enabling volunteers to reach remote rural villages. They knew, said Carter, that “guinea worm was a blight on the people. There was an inseparable connection between peace on the one hand and doing away with guinea worm on the other.” Carter eventually helped negotiate peace and his centre monitored the national elections in 2010 and the referendum on separation this year.

Since 1986, the 3.5m annual cases of guinea worm disease across 21 countries have been reduced by 99.9%. Now there are fewer than 1,000 a year.

In 1979, while Carter was president, the eradication of smallpox was declared. That cost £195m and was achieved through mass vaccination – a feat that is being attempted in polio but which looks difficult to repeat with the increased movement of populations.

Guinea worm eradication, a generation later, has so far cost £250m and is close to being achieved without recourse to vaccination or treatments, because they do not exist. The disease is being prevented through the drilling of wells for uncontaminated water and education of those who live in remote rural villages. People have been taught to filter their drinking water through a small pipe, cheaply made and distributed, which removes the guinea worm larvae.

The effort to reach the remotest villages has paid dividends, said Carter. “When we go in to a place like South Sudan, we have personally trained about 12,000 local volunteers and taught them aspects of healthcare and about good water that is clean to drink. We have often been able to dig deep wells that are free from disease.”

There have been other benefits too. “In the rest of their lives, many have never known success. They have never attempted anything that really succeeded. Quite often their relationship to foreigners has comprised broken promises. When we go in and teach them how they can correct their own problem, they not only learn the rudiments of healthcare and sanitation but they learn how to be self-sufficient and gain self-respect,” he said.

Stephen O’Brien, international development minister, pledged on Wednesday the UK government would provide up to one-third of the funding needed for the campaign against the guinea worm. But the amount of the British donation is dependent on how much is put in by others – the Department for International Development will put in £1 for every £2 from elsewhere, he said.

O’Brien added that discussions were taking place with other donors, but that it would be premature to reveal their identities. “I very much hope they will produce a response to the challenge,” he said.

2011 Nobel Prizes

Posted: October 3, 2011 by PaanLuel Wël in Education, Science

Israeli Scientist Wins Nobel Prize for Chemistry

Ariel Schalit/Associated Press

Dan Shechtman in Haifa, Israel, on Wednesday after winning the Nobel Prize in chemistry for discovering quasicrystals.

An Israeli scientist won this year’s Nobel Prize in Chemistry for discovering quasicrystals, a material in which atoms are packed together in a well-defined pattern that never repeats.

An atomic model of an Ag-Al quasicrystal.

Recent Nobel prizes have generally split credit for scientific advances among two or three people, but this year’s chemistry prize and the accompanying 10 million Swedish kronor ($1.4 million) went to a single scientist: Dan Shechtman, 70, a professor of materials science at Technion-Israel Institute of Technology in Haifa, Israel. Dr. Shechtman is also a professor at Iowa State University and a researcher at the United States Department of Energy’s Ames Laboratory.

The citation from the Royal Swedish Academy of Sciences states simply, “for the discovery of quasicrystals.”

Regular but nonrepeating patterns, defined by precise rules, have been known in mathematics since antiquity, and medieval Islamic artists made decorative, nonrepeating tile mosaics, but the phenomenon was thought impossible in the packing of atoms.

Yet Dr. Shechtman discovered the same type of structure in a mixture of aluminum and manganese. During a sabbatical in Maryland at the National Bureau of Standards, now known as the National Institute of Standards and Technology, he took a molten glob of the metals and chilled it rapidly. The expectation was that the atoms would have been a random jumble, like glass. Yet when he examined his metal with an electron microscope, Dr. Shechtman found that the atoms were not random.

His notebook recorded the exact date: April 8, 1982.

Scientists believed that crystals in materials all contained repeating patterns. For example, a square lattice has fourfold symmetry. Rotate it by 90 degrees, and it looks identical. A repeating lattice with fivefold symmetry, however, is impossible. On that morning in 1982, the electrons Dr. Shechtman bounced off his aluminum-manganese alloy formed a pattern that indicated tenfold symmetry. He could not quite believe it. He wrote in his notebook, “10 Fold???”

While a periodic lattice could not produce that pattern, a quasicrystal could.
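The “forbidden symmetry” argument can be made concrete. For an n-fold rotation to map a periodic lattice onto itself, its matrix in lattice coordinates must have integer entries, so its trace, 2·cos(2π/n), must be an integer. A short Python sketch of this crystallographic restriction (the function name is ours, for illustration):

```python
import math

# Crystallographic restriction: a rotation by 2*pi/n is compatible with
# a periodic lattice only if its trace, 2*cos(2*pi/n), is an integer.
def compatible_with_periodic_lattice(n):
    trace = 2 * math.cos(2 * math.pi / n)
    return abs(trace - round(trace)) < 1e-9

allowed = [n for n in range(1, 13) if compatible_with_periodic_lattice(n)]
print(allowed)  # [1, 2, 3, 4, 6] -- fivefold and tenfold are excluded
```

Only 1-, 2-, 3-, 4- and 6-fold rotations pass the test, which is why the tenfold pattern in Dr. Shechtman’s diffraction data could not come from any repeating lattice.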

It took years for Dr. Shechtman to convince others.

During the announcement, the Nobel committee noted that one colleague initially said, “Go away, Danny,” because he thought there was a simpler explanation for what Dr. Shechtman had observed. Many scientists — notably Linus Pauling, the Nobel-winning giant of chemistry — argued vehemently that Dr. Shechtman’s data could be explained by “twinning,” where two ordinary periodic crystals are fused together at an angle.

“That must have been intimidating,” said Nancy B. Jackson, president of the American Chemical Society. “When he first discovered these materials, nobody thought they could exist. It was one of these great scientific stories that his fellow scientists thought was impossible, but through time, people came to realize he was right.”

Even the definition of crystal had to be changed. Previously, a crystal had been defined as having “a regularly ordered, repeating three-dimensional pattern,” according to the International Union of Crystallography. The new definition, adopted in 1992, states that a crystal is simply a solid with a “discrete diffraction diagram” — that is, something that produces patterns like the ones Dr. Shechtman saw.

That leaves the door open for yet more different kinds of crystals in the future. Quasicrystals have since been found in many other materials, including a naturally occurring mineral from a Russian river. Materials scientists have been exploring quasicrystals because of their distinct properties — they are hard, brittle, slippery and, unlike most metals, poor conductors of electricity.

Quasicrystals have so far had a modest impact in the everyday world. For example, one kind of highly resilient steel, consisting of hard steel quasicrystals embedded within softer steel, is now used in razor blades and thin needles for eye surgery.

“The applications haven’t panned out,” said Patricia A. Thiel, a colleague of Dr. Shechtman at Iowa State and Ames Laboratory who also studies quasicrystals. “But they revolutionized our understanding of how atoms arrange themselves in solids. It was a scientific revolution.”

Israeli leaders expressed delight and pride at the 10th Nobel Prize won by a citizen of Israel, which has a population of less than eight million. Two years ago, Ada E. Yonath of the Weizmann Institute of Science in Rehovot, Israel, shared the award for chemistry as well.

Shimon Peres, Israel’s president, spoke by telephone to Dr. Shechtman at a news conference in Haifa and said, “Professor Shechtman, you today brought an enormous gift to the State of Israel, truly.” Prime Minister Benjamin Netanyahu also called and told him, “Every Israeli is happy today, and every Jew in the world is proud.”

Dr. Shechtman was born and educated in Israel. At the news conference, he said, “The celebration is not only for the Technion and the State of Israel but also for science worldwide. There are today thousands of scientists around the world working in this field that I developed, and I am certain they all see this prize as their accomplishment and they really deserve it. Without these thousands, this science would not be where it is today.”

Dr. Shechtman added, “The main lesson that I have learned over time is that a good scientist is a humble and listening scientist and not one that is sure 100 percent in what he reads in the textbooks.”

Ethan Bronner contributed reporting from Jerusalem.

Israeli wins chemistry Nobel for quasicrystals

By KARL RITTER and MALIN RISING – Associated Press

STOCKHOLM (AP) — Israeli scientist Dan Shechtman was awarded the Nobel Prize in chemistry on Wednesday for a discovery that faced skepticism and mockery, even prompting his expulsion from his research team, before it won widespread acceptance as a fundamental breakthrough.

While doing research in the U.S. in 1982, Shechtman discovered a new chemical structure — quasicrystals — that researchers previously thought was impossible.

He was studying a mix of aluminum and manganese in an electron microscope when he found the atoms were arranged in a pattern — similar to one in some traditional Islamic mosaics — that appeared contrary to the laws of nature.

He concluded that science was wrong — but it would take years for him and other researchers to prove that he was right.

Since then, quasicrystals have been produced in laboratories and a Swedish company found them in one of the most durable kinds of steel, which is now used in products such as razor blades and thin needles made specifically for eye surgery, the Royal Swedish Academy of Sciences said. Quasicrystals are also being studied for use in new materials that convert heat to electricity. They were first discovered in nature in Russia in 2009.

Despite the initial reluctance in the scientific community to accept his discovery, it “fundamentally altered how chemists conceive of solid matter,” the academy said in its citation for the 10 million kronor ($1.5 million) award.

“The main lesson that I have learned over time is that a good scientist is a humble and listening scientist and not one that is sure 100 percent in what he read in the textbooks,” Shechtman, 70, told a news conference Wednesday at the Technion-Israel Institute of Technology in Haifa, Israel.

Shechtman is a professor there and at Iowa State University in Ames, Iowa. He will receive the award along with the other Nobel Prize winners at a Dec. 10 ceremony in Stockholm.

Israel has won 10 Nobel prizes, a source of great pride in the country of just 7.8 million people. Shechtman was congratulated by Israeli President Shimon Peres, who shared the Nobel Peace Prize as Israel’s foreign minister in 1994, and by Prime Minister Benjamin Netanyahu.

“Every citizen of Israel is happy today and every Jew in the world is proud,” Netanyahu said.

In chemical terms, a crystal is traditionally defined as a regular and repeating arrangement of atoms within a material. As a result of these repeats, traditional crystals can have only certain shapes.

What Shechtman found was a material that seemed to have a forbidden shape. Eventually, scientists realized it was a new kind of matter, a quasicrystal, in which the atomic patterns show a more subtle kind of repetition that allows forbidden shapes.

“His battle eventually forced scientists to reconsider their conception of the very nature of matter,” the academy said.

Nancy B. Jackson, president of the American Chemical Society, called Shechtman’s discovery “one of these great scientific discoveries that go against the rules.” When Shechtman announced it, other experts hesitated.

“People didn’t think that this kind of crystal existed,” she said. “They thought it was against the rules of nature.”

Only later did some scientists go back to their own inexplicable findings and realize they had seen quasicrystals without recognizing what they had, Jackson said.

“Anytime you have a discovery that changes the conventional wisdom that’s 200 years old, that’s something that’s really remarkable,” said Princeton University physicist Paul J. Steinhardt, who coined the term “quasicrystals” and had been doing theoretical work on them before Shechtman reported finding the real thing.

Steinhardt recalled the day when a fellow scientist showed him Shechtman’s paper in 1984, reporting the kind of result Steinhardt had predicted. “I sort of leapt in the air,” he said.

Staffan Normark, permanent secretary of the Royal Swedish Academy, said Shechtman’s discovery was one of the few Nobel Prize-winning achievements that can be dated to a single day.

On April 8, 1982, while on a sabbatical at the National Bureau of Standards in Washington, D.C. — now called the National Institute of Standards and Technology — Shechtman first observed crystals with a shape most scientists considered impossible.

It had to do with the idea that a crystal shape can be rotated by a certain amount and still look the same.

A square contains fourfold symmetry, for example: If you turn it by 90 degrees, a quarter-turn, it still looks the same. For crystals, only certain degrees of such symmetry were thought possible. Shechtman had found a crystal that could be rotated one-fifth of a full turn and still look the same, which was thought to be impossible.

“I told everyone who was ready to listen that I had material with pentagonal symmetry. People just laughed at me,” Shechtman said in a description of his work released by his university.

For months he tried to persuade his colleagues of his find, but they refused to accept it. Finally he was asked to leave his research group, and moved to another one within the National Bureau of Standards, Shechtman said.

He returned to Israel, where he found one colleague prepared to work with him on an article describing the phenomenon. The article was at first rejected, but finally published in November 1984 — to uproar in the scientific world. Double Nobel winner Linus Pauling was among those who never accepted the findings.

“He really was a great scientist, but he was wrong. It’s not the first time he was wrong,” Shechtman told reporters Wednesday.

In 1987, friends of Shechtman in France and Japan succeeded in growing crystals large enough for X-rays to verify what he had discovered with the electron microscope.

“The moment I presented that the community said, ‘OK Dani, now you are talking. Now we understand you, now we accept what you have found,'” Shechtman told reporters.

Cesar Pay Gomez, a structural chemistry expert at Uppsala University in Sweden and an adviser to the prize committee, said research on quasicrystals is ongoing “in the field of thermal-electric applications, where waste heat can be converted to electrical currents or energy.”

The Nobel Prize in chemistry announcement capped this year’s science awards.

Immune system researchers Bruce Beutler of the U.S. and Frenchman Jules Hoffmann shared the medicine prize Monday with Canadian-born Ralph Steinman, who died three days before the announcement. U.S.-born scientists Saul Perlmutter, Brian Schmidt and Adam Riess won the physics prize on Tuesday for discovering that the universe is expanding at an accelerating pace.

The Nobel Prizes are handed out every year on Dec. 10, the anniversary of award founder Alfred Nobel’s death in 1896.


Louise Nordstrom in Stockholm, Malcolm Ritter in New York and Aron Heller in Jerusalem contributed to this report.



Vindicated: Ridiculed Israeli scientist wins Nobel
By ARON HELLER – Associated Press

JERUSALEM (AP) — When Israeli scientist Dan Shechtman claimed to have stumbled upon a new crystalline chemical structure that seemed to violate the laws of nature, colleagues mocked him, insulted him and exiled him from his research group.

After years in the scientific wilderness, though, he was proved right. And on Wednesday, he received the ultimate vindication: the Nobel Prize in chemistry.

The lesson?

“A good scientist is a humble and listening scientist and not one that is sure 100 percent in what he read in the textbooks,” Shechtman said.

The shy, 70-year-old Shechtman said he never doubted his findings and considered himself merely the latest in a long line of scientists who advanced their fields by challenging the conventional wisdom and were shunned by the establishment because of it.

In 1982, Shechtman discovered what are now called “quasicrystals” — atoms arranged in patterns that seemed forbidden by nature.

“I was thrown out of my research group. They said I brought shame on them with what I was saying,” he recalled. “I never took it personally. I knew I was right and they were wrong.”

The discovery “fundamentally altered how chemists conceive of solid matter,” the Royal Swedish Academy of Sciences said in awarding the $1.5 million prize.

Since his discovery, quasicrystals have been produced in laboratories, and a Swedish company found them in one of the most durable kinds of steel, which is now used in products such as razor blades and thin needles made specifically for eye surgery, the academy said. Quasicrystals are also being studied for use in new materials that convert heat to electricity.

Shechtman is a professor at the Technion-Israel Institute of Technology in Haifa, Israel. He is the 10th Israeli Nobel winner, a great source of pride in a nation of just 7.8 million people. Shechtman fielded congratulatory calls from Israeli President Shimon Peres, who shared the Nobel Peace Prize in 1994, and Prime Minister Benjamin Netanyahu.

“Every citizen of Israel is happy today and every Jew in the world is proud,” Netanyahu said.

Staffan Normark, permanent secretary of the Royal Swedish Academy, said Shechtman’s discovery was one of the few Nobel Prize-winning achievements that can be dated to a single day.

On April 8, 1982, while on sabbatical at the National Bureau of Standards in Washington — now called the National Institute of Standards and Technology — Shechtman first observed crystals with a shape most scientists considered impossible.

The discovery had to do with the idea that a crystal shape can be rotated a certain amount and still look the same. A square has four-fold symmetry, for example: If you turn it by 90 degrees, a quarter-turn, it still looks the same. For crystals, only certain degrees of such symmetry were thought possible. Shechtman had found a crystal that could be rotated one-fifth of a full turn and still look the same.
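The rule behind "only certain degrees of such symmetry were thought possible" is a standard textbook result (the crystallographic restriction theorem), not something spelled out in the article: an n-fold rotation can map a repeating lattice onto itself only if 2·cos(2π/n) is an integer. A short sketch of that check:

```python
import math

# Crystallographic restriction (standard textbook result): an n-fold
# rotation maps a periodic lattice onto itself only if the trace of its
# 2-D rotation matrix, 2*cos(2*pi/n), is an integer.
def allowed_in_periodic_crystal(n):
    trace = 2 * math.cos(2 * math.pi / n)
    return abs(trace - round(trace)) < 1e-9

allowed = [n for n in range(1, 13) if allowed_in_periodic_crystal(n)]
print(allowed)  # [1, 2, 3, 4, 6]; five-fold symmetry fails the test
```

Shechtman's diffraction patterns showed exactly the five-fold case this rule forbids for periodic crystals; quasicrystals evade it by being ordered without being periodic.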

“I told everyone who was ready to listen that I had material with pentagonal symmetry. People just laughed at me,” he said in an account released by his university.

He was asked to leave his research group, and moved to another one within the National Bureau of Standards, Shechtman said. He eventually returned to Israel, where he found one colleague prepared to work with him on an article describing the phenomenon. The article was at first rejected but was finally published in November 1984 to an uproar in the scientific world.

In 1987, friends in France and Japan succeeded in growing crystals large enough for X-rays to verify what he had discovered with the electron microscope.

“The moment I presented that, the community said, ‘OK, Danny, now you are talking. Now we understand you. Now we accept what you have found,'” Shechtman told reporters.

Shechtman, who also teaches at Iowa State University in Ames, Iowa, said he never wavered even in the face of stiff criticism from double Nobel winner Linus Pauling, who never accepted Shechtman’s findings.

“He would stand on those platforms and declare, ‘Danny Shechtman is talking nonsense. There is no such thing as quasicrystals, only quasi-scientists,’” Shechtman said. “He really was a great scientist, but he was wrong. It’s not the first time he was wrong.”

Shechtman’s battle “eventually forced scientists to reconsider their conception of the very nature of matter,” the academy said.

Nancy B. Jackson, president of the American Chemical Society, called Shechtman’s breakthrough “one of these great scientific discoveries that go against the rules.” Only later did some scientists go back to some of their own inexplicable findings and realize they had seen quasicrystals without understanding what they were looking at, Jackson said.

“Anytime you have a discovery that changes the conventional wisdom that’s 200 years old, that’s something that’s really remarkable,” said Princeton University physicist Paul J. Steinhardt, who coined the term “quasicrystals” and had been doing theoretical work on them before Shechtman reported finding the real thing.

Steinhardt recalled the day a fellow scientist showed him Shechtman’s paper in 1984: “I sort of leapt in the air.”


Science writer Malcolm Ritter in New York and Associated Press writers Karl Ritter, Malin Rising and Louise Nordstrom in Stockholm contributed to this report.

Scientist wins Nobel for medicine days after death

STOCKHOLM (AP) — A pioneering researcher was awarded the Nobel Prize in medicine Monday, three days after dying of pancreatic cancer without ever knowing he was about to be honored for his immune system work that he had used to try to prolong his own life.

The Nobel committee said it was unaware that Canadian-born cell biologist Ralph Steinman had already died when it awarded the prize to him, American Bruce Beutler and French scientist Jules Hoffmann.

Since the committee is only supposed to consider living scientists, the Nobel Foundation held an emergency meeting Monday and said the decision on the 10 million kronor ($1.5 million) prize will remain unchanged.

“The Nobel Prize to Ralph Steinman was made in good faith, based on the assumption that the Nobel laureate was alive,” the foundation said.

Steinman, 68, died Sept. 30, according to Rockefeller University in New York. He underwent therapy based on his discovery of the immune system’s dendritic cells, for which he won the prize, the university said.

“He was diagnosed with pancreatic cancer four years ago, and his life was extended using a dendritic-cell based immunotherapy of his own design,” the university said.

Beutler and Hoffmann were cited for their discoveries in the 1990s of receptor proteins that can recognize bacteria and other microorganisms as they enter the body, and activate the first line of defense in the immune system, known as innate immunity.

Nobel committee members said the work by the three is being used to develop better vaccines, and in the long run could also help treatment of diseases linked to abnormalities in the immune system, such as rheumatoid arthritis, Type 1 diabetes, multiple sclerosis and chronic inflammatory diseases.

The work could also help efforts to make the immune system fight cancer, the committee said. A new treatment, Provenge, uses this concept to attack advanced prostate cancer.

Nobel committee member Goran Hansson told The Associated Press that hoped-for vaccines are in the pipeline.

“I am very touched. I’m thinking of all the people who worked with me, who gave everything,” Hoffmann said by telephone to a news conference in Paris. “I wasn’t sure this domain merited a Nobel.”

Beutler said he woke up in the middle of the night, glanced at his cellphone and realized he had a new email message.

“And, I squinted at it and I saw that the title line was ‘Nobel Prize,’ so I thought I should give close attention to that,” Beutler said in an interview posted on the Nobel website. “And, I opened it and it was from Goran Hansson, and it said that I had won the Nobel Prize, and so I was thrilled.”

Still, he was a “little disbelieving” until he checked his laptop, “and in a few minutes I saw my name there and so I knew it was real.”

Since 1974, the Nobel statutes have not allowed posthumous awards unless a laureate dies after the announcement but before the Dec. 10 award ceremony. That happened in 1996, when economics winner William Vickrey died a few days after the announcement.

Before the statutes were changed in 1974, two Nobel Prizes were given posthumously. In 1961, U.N. Secretary-General Dag Hammarskjold was awarded the Nobel Peace Prize less than a month after he died in a plane crash during a peace mission to Congo. Swedish poet Erik Axel Karlfeldt won the Nobel in literature in 1931, although he had died in March of that year.

“The Nobel Foundation thus believes that what has occurred is more reminiscent of the example in the statutes concerning a person who has been named as a Nobel Laureate and has died before the actual Nobel Prize Award Ceremony,” the foundation said following its meeting.

Nobel officials said the situation was unprecedented, and that Steinman’s survivors would receive his share of the prize money. It wasn’t immediately clear who would represent him at the ceremony in Stockholm.

Nobel Foundation chairman Lars Heikensten, who started his job in June, said he was stunned when he found out that Steinman was dead.

“My first thought was: ‘Wow, this is a remarkable thing to happen now that I’m involved in this for the first time. How do we handle this now?'” he told AP.

Hansson said the medicine committee didn’t know Steinman was dead when it chose him.

“It is incredibly sad news,” he said. “We can only regret that he didn’t have the chance to receive the news he had won the Nobel Prize. Our thoughts are now with his family.”

Beutler, 53, holds dual appointments at University of Texas Southwestern Medical Center in Dallas and as professor of genetics and immunology at the Scripps Research Institute in San Diego. He will become a full-time faculty member at UT Southwestern on Dec. 1.

Hoffmann, 70, headed a research laboratory in Strasbourg, France, between 1974 and 2009 and served as president of the French National Academy of Sciences in 2007-08.

Steinman had been head of Rockefeller University’s Center for Immunology and Immune Diseases.

“We are all so touched that our father’s many years of hard work are being recognized with a Nobel Prize,” Steinman’s daughter, Alexis Steinman, said in the Rockefeller University statement. “He devoted his life to his work and his family, and he would be truly honored.”

Hoffmann’s discovery came in 1996 during research on how fruit flies fight infections. Two years later, Beutler’s research on mice showed that fruit flies and mammals activate innate immunity in similar ways when attacked by germs.

Steinman’s discovery dates back to 1973, when he found a new cell type, the dendritic cell, which has a unique capacity to activate T-cells. Those cells have a key role in adaptive immunity, when antibodies and killer cells fight infections. They also develop a memory that helps the immune system mobilize its defenses next time it comes under a similar attack.

The medicine award kicked off a week of Nobel Prize announcements, and will be followed by the physics prize on Tuesday, chemistry on Wednesday, literature on Thursday and the Nobel Peace Prize on Friday. The winners of the economics award will be announced on Oct. 10.

The coveted prizes were established by wealthy Swedish industrialist Alfred Nobel — the inventor of dynamite — except for the economics award, which was created by Sweden’s central bank in 1968 in Nobel’s memory. The prizes are always handed out on Dec. 10, the anniversary of Nobel’s death in 1896.

Last year’s medicine award went to British professor Robert Edwards for fertility research that led to the first test tube baby.


Associated Press writer Malin Rising contributed to this report.

Speeding universe work wins Nobel

By Anna Ringstrom | Reuters

STOCKHOLM (Reuters) – The “astounding” discovery that the expansion of the universe is speeding up won the Nobel physics prize on Tuesday for three astronomers whose observations of exploding stars transformed our view of the world, and of how it may end.

Honouring two global teams of stargazers whose findings shook cosmology to its foundations in 1998, the Nobel Committee said Americans Saul Perlmutter, Brian Schmidt and Adam Riess showed how the universe that emerged from the Big Bang may fly apart so far, cooling as it goes, that it “will end in ice.”

Their work gave birth to the theory of dark energy, a kind of inverse gravity that causes the expansion to accelerate. Up to three quarters of the universe seems to comprise dark energy — but just what it is remains a matter of speculation, notably at facilities like the Large Hadron Collider near Geneva. Many hope an answer could reconcile apparent anomalies in physics.

The teams studied dozens of exploding stars, or supernovae, expecting to confirm theories dating back to the 1920s that the universe has expanded for 14 billion years since the Big Bang, but ever more slowly. Astonished, they found the opposite was true.

“We ended up telling the world we have this crazy result — the universe is speeding up,” the Montana-born Schmidt, based in Australia, said by telephone to the Royal Swedish Academy of Sciences, where the 2011 prizewinners were announced.

“It seemed too crazy to be right, and I think we were a little scared,” added Schmidt, 44, who led the High-z Supernova Search Team that included the Baltimore-based Riess, 41. Schmidt is at the Australian National University in Canberra.

Perlmutter, 52, from the University of California at Berkeley, said: “The chain of analysis was so long that at first we were reluctant to believe our result.

“But the more we analyzed it, the more it wouldn’t go away … It was the longest ‘Aha!’ moment ever.”

If the data continue to improve, he said, theorists may be able to understand dark energy within 10 to 15 years.


Riess told Reuters he was “stunned and incredibly honored” by the award. But he was cautious about predictions that dark energy would propel the universe ever outward until it was spent and froze. “It is what we see,” he said. “But the truth is all bets are off. The universe could still recollapse.”

Before, it was thought that gravity would eventually reverse the universe’s expansion, until a fiery collapse brought the end of the world.

Recalling how it felt to have assumptions confounded, Riess said he spent weeks thinking “I did something stupid” and looking for what he thought must be a mistake in his work: “If you tossed a ball into the air and it kept right on going up instead of falling to the ground, you’d be pretty surprised,” he said. “Well, that’s about how surprised we were.”

With the expectation that gravity would slow the expansion of the universe debunked, the fact that the opposite was true revived an idea Albert Einstein once rejected as his “biggest blunder” — that the vacuum of space might create “anti-gravity.”

“Suddenly that idea made sense,” Riess said.

He and Schmidt will share half of the 10 million Swedish crowns ($1.5 million) prize money. Perlmutter won the rest.

Having turned theory on its head, Perlmutter viewed the world’s distant future with equanimity: “It is a tough choice between ending up in the cold or ending up in a fiery blast,” he told Reuters. “I tend not to dwell too much on ultimates.”


Swedish Academy member Lars Brink told Reuters practical developments from the findings were not obvious: “This is very curiosity driven research,” he said. “It tells us something about the basic laws of nature. We are putting together pieces of what are the basic laws of nature. This is one brick.

“It is not that we are going to use it for new gadgets.”

Mark Sullivan, a physicist at the University of Oxford, said: “Their … discovery … has rewritten textbooks, and was one of the landmark breakthroughs of 20th-century physics.”

Among the exciting possible developments from the study of dark energy would be a way to reconcile the laws of physics observed at the subatomic level — quantum mechanics — with those Einstein described for the world we see.

Martin Rees, Britain’s Astronomer Royal, praised the prizewinners but criticized the Nobel Committee’s rules that a maximum of three people could share in an award: “It would have been fairer, and would send a less distorted message about how this kind of science is actually done, if the award had been made collectively to all members of the two groups,” he said.

There was no repeat of the drama in Stockholm on Monday, when the Nobel Committee, whose rules forbid posthumous awards, discovered it had just given a share of the prize for medicine to a man who had died three days earlier. In the end, the award was confirmed to Ralph Steinman, who used his own discoveries to treat his cancer but succumbed to the disease on Friday.

In keeping with many recent prizes, the Committee noted, the winners of the physics category were all relatively young.

At Johns Hopkins University in Baltimore, Riess, who was still in his 20s when the research was published, joked to a colleague that he had been quick to react to a pre-dawn call from Stockholm: “When I picked up the phone early this morning and I heard Swedish voices,” he said, “I knew it wasn’t IKEA.”

(Additional reporting by Mia Shanley, Patrick Lannin and Simon Johnson in Stockholm, Ben Hirschler and Kate Kelland in London, Ian Simpson in Baltimore, Michelle Nichols in New York and Jonathan Weber in San Francisco; Writing by Alastair Macdonald; Editing by Myra MacDonald)

Studies of Universe’s Expansion Win Physics Nobel


From left, Adam Riess, Saul Perlmutter and Brian Schmidt shared the Nobel Prize in physics awarded Tuesday.


Three astronomers won the Nobel Prize on Tuesday for discovering that the universe is apparently being blown apart by a mysterious force that cosmologists now call dark energy, a finding that has thrown the fate of the universe and indeed the nature of physics into doubt.


An exploding star known as a Type 1a supernova. The Nobel Prize winners used such supernovae to measure the expansion of the universe.

The astronomers are Saul Perlmutter, 52, of the Lawrence Berkeley National Laboratory and the University of California, Berkeley; Brian P. Schmidt, 44, of the Australian National University in Canberra, and Adam G. Riess, 41, of the Space Telescope Science Institute and Johns Hopkins University in Baltimore.

“I’m stunned,” Dr. Riess said by e-mail, after learning of his prize by reading about it on The New York Times’s Web site.

The three men led two competing teams of astronomers who were trying to use the exploding stars known as Type 1a supernovae as cosmic lighthouses to limn the expansion of the universe. The goal of both groups was to measure how fast the cosmos, which has been expanding since its fiery birth in the Big Bang 13.7 billion years ago, was slowing down, and thus to find out if its ultimate fate was to fall back together in what is called a Big Crunch or to drift apart into the darkness.

Instead, the two groups found in 1998 that the expansion of the universe was actually speeding up, a conclusion that nobody would have believed if not for the fact that both sets of scientists wound up with the same answer. It was as if, when you tossed your car keys in the air, instead of coming down, they flew faster and faster to the ceiling.

Subsequent cosmological measurements have confirmed that roughly 70 percent of the universe by mass or energy consists of this antigravitational dark energy that is pushing the galaxies apart, though astronomers and physicists have no conclusive evidence of what it is.

The most likely explanation for this bizarre behavior is a fudge factor that Albert Einstein introduced into his equations in 1917 to stabilize the universe against collapse and then abandoned as his greatest blunder.

Quantum theory predicts that empty space should exert a repulsive force, like dark energy, but one that is 10 to the 120th power times stronger than what the astronomers have measured, leaving some physicists mumbling about multiple universes. Abandoning the Einsteinian dream of a single final theory of nature, they speculate that there are a multitude of universes with different properties. We live in one, the argument goes, that is suitable for life.

“Every test we have made has come out perfectly in line with Einstein’s original cosmological constant in 1917,” Dr. Schmidt said.

If the universe continues accelerating, astronomers say, rather than coasting gently into the night, distant galaxies will eventually be moving apart so quickly that they cannot communicate with one another, and all the energy will be sucked out of the universe.

Edward Witten, a theorist at the Institute for Advanced Study, Einstein’s old stomping grounds, called dark energy “the most startling discovery in physics since I have been in the field.” Dr. Witten continued, “It was so startling, in fact, that I personally took quite a while to become convinced that it was right.”

He went on, “This discovery definitely changed the way physicists look at the universe, and we probably still haven’t fully come to grips with the implications.”

Dr. Perlmutter, who led the Supernova Cosmology Project out of Berkeley, will get half of the prize of 10 million Swedish kronor ($1.4 million). The other half will go to Dr. Schmidt, leader of the rival High-Z Supernova Search Team, and Dr. Riess, who was the lead author of the 1998 paper in The Astronomical Journal, in which the dark energy result was first published.

All three astronomers were born and raised in the United States; Dr. Schmidt is also a citizen of Australia. They will get their prizes in Stockholm on Dec. 10.

Since the fate of the universe is in question, astronomers would love to do more detailed tests using supernovas and other observations. So they were dispirited last year when NASA announced that cost overruns and delays on the James Webb Space Telescope had left no room in the budget until the next decade for an American satellite mission to investigate dark energy that Dr. Perlmutter and others had been promoting for almost a decade. Indeed on Tuesday the European Space Agency announced that it would launch a mission called Euclid to study dark energy in 2019.

Cosmic expansion was discovered by Edwin Hubble, an astronomer at the Mount Wilson Observatory in Pasadena, Calif., in 1929, but the quest for precision measurements of the universe has been hindered by a lack of reliable standard candles, objects whose distance can be inferred from their brightness or some other observable characteristic. Type 1a supernovae, which are thought to result from explosions of small stars known as white dwarfs, have long been considered uniform enough to fill the bill, as well as bright enough to be seen across the universe.

In the late 1980s Dr. Perlmutter, who had just gotten a Ph.D. in physics, devised an elaborate plan involving networks of telescopes tied together by the Internet to detect and study such supernovae and use them to measure the presumed deceleration of the universe. The Supernova Cosmology Project endured criticism from other astronomers, particularly supernova experts, who doubted that particle physicists could do it right.

Indeed, it took seven years before Dr. Perlmutter’s team began harvesting supernovae in the numbers they needed. Meanwhile, the other astronomers had formed their own team, the High-Z team, to do the same work.

“Hey, what’s the strongest force in the universe?” Robert P. Kirshner of the Harvard-Smithsonian Center for Astrophysics, a mentor to many of the astronomers on the new team, once asked a reporter for The New York Times. “It’s not gravity, it’s jealousy.”

In an interview with The Associated Press, Dr. Perlmutter described the subsequent work of the teams as “a long aha.” The presence of dark energy showed up in an unexpected faintness on the part of some distant supernovas: the universe had sped up and carried them farther away from us than conventional cosmology suggested.

As recounted by the science writer Richard Panek in his recent book, “The 4% Universe: Dark Matter, Dark Energy, and the Race to Discover the Rest of Reality,” neither team was eager to report such a strange result.

In January 1998, Dr. Riess interrupted preparations for his honeymoon to buck up his comrades. “Approach these results not with your heart or head but with your eyes,” he wrote in an e-mail. “We are observers after all!”

In the years since, the three astronomers have shared a number of awards, sometimes giving lectures in which they completed each other’s sentences. A Nobel was expected eventually.

“No more waiting!” Dr. Kirshner said Tuesday.

Swedish Poet Wins Nobel Prize for Literature


Swedish poet Tomas Transtromer at his home in Stockholm on Thursday after receiving the news that he won the 2011 Nobel Prize in Literature.

Announcing the award in Stockholm, the Swedish Academy praised Mr. Transtromer, saying that “through his condensed, translucent images, he gives us fresh access to reality.”

The assembled journalists cheered upon hearing that Mr. Transtromer, who was born in Stockholm, had won the prize.

Mr. Transtromer, 80, has written more than 15 collections of poetry, many of which have been translated into English and 60 other languages.

Critics have praised Mr. Transtromer’s poems for their accessibility, even in translation, noting his elegant descriptions of long Swedish winters, the rhythm of the seasons and the palpable, atmospheric beauty of nature.

“So much poetry, not only in this country but everywhere, is small and personal and it doesn’t look outward, it looks inward,” said Daniel Halpern, the president and publisher of Ecco, the imprint of HarperCollins that has published English translations of Mr. Transtromer’s work. “But there are some poets who write true international poetry. It’s the sensibility that runs through his poems that is so seductive. He is such a curious and open and intelligent writer.”

Neil Astley, the editor of Bloodaxe Books in Britain, called Mr. Transtromer “a metaphysical visionary poet.”

“He’s worked for much of his life as a psychologist, and the work is characterized by very strong psychological insight into humanity,” Mr. Astley said.

Mr. Transtromer was born in Stockholm in 1931. His mother was a schoolteacher and his father a journalist. He studied literature, history, religion and psychology at Stockholm University, graduating in 1956, and worked as a psychologist at a youth correctional facility.

In 1990, Mr. Transtromer suffered a stroke that left him mostly unable to speak, but he eventually began to write again.

On Thursday afternoon, the stairwell in Mr. Transtromer’s apartment building filled with journalists from all over the world seeking reaction, the Swedish news media reported.

Visibly overwhelmed, Mr. Transtromer finally appeared, accompanied by his wife, Monica. Speaking on his behalf, she said her husband was most happy that the prize was awarded for poetry. “That you happened to receive it is a great joy and happy surprise, but the fact the prize went to poetry felt very good,” she said, addressing him at a gathering that quickly moved into the vestibule of their home in Stockholm.

There was also a celebration among Swedes, many of whom have read Mr. Transtromer since his first book of poems, “17 Poems,” placed him on Sweden’s literary map when he was just 23.

“To be quite honest it was a relief because people have been hoping for this for a long time,” said Ola Larsmo, a novelist and the president of the Swedish Pen association. “Some thought the train might have left the station already because he is old and not quite well. It felt great that he was confirmed in this role of national and international poet.”

John Freeman, the editor of the literary magazine Granta, said: “He is to Sweden what Robert Frost was to America. The national character, if you can say one exists, and the landscape of Sweden are very much reflected in his work. It’s easy because of that to overlook the abiding strangeness and mysteriousness of his poems.”

But in the United States, Mr. Transtromer is a virtual unknown, even to many readers of poetry, despite the fact that he has been published in English by several widely known publishers.

Mr. Halpern said that “Selected Poems,” originally published in 2000 by Ecco, part of HarperCollins, would be rereleased within days. On Thursday morning, print copies of his books were already backordered on online retailer sites, and electronic versions were difficult to find. New Directions, an independent publisher, released “The Great Enigma,” a poetry collection, in 2006; Graywolf Press, a publisher based in Minneapolis, released “The Half-Finished Heaven” in 2001.

Jeff Seroy, a spokesman for Farrar, Straus & Giroux, part of Macmillan, said Thursday that the imprint had acquired a volume of Mr. Transtromer’s work, translated by Robin Robertson, called “The Deleted World,” originally published in 2006. Mr. Seroy said the book would be released by year’s end.

Much of Mr. Transtromer’s work, including “The Half-Finished Heaven,” was translated by his close friend and fellow poet Robert Bly. Mr. Bly has been named as one of the central people who introduced Mr. Transtromer to a small but devoted group of American readers.

The selection of a European writer for the literature Nobel — the eighth in a decade — renewed criticisms that the prize is too Eurocentric. The last American writer to win a Nobel was Toni Morrison in 1993. Philip Roth has been a perennial favorite but has not been selected.

The committee noted after the announcement on Thursday that it had been many years since a Swede had won. It last happened in 1974 when Eyvind Johnson and Harry Martinson shared the prize.

Peter Englund, the permanent secretary of the academy, said this week that the literature jury had increased the number of “scouts” it employed to scour for books in non-European languages.

And once again, the jury proved its inscrutability. In previous years, the choice of relatively unknown writers like Herta Müller of Germany has surprised Nobel watchers; in other years, winners like Harold Pinter or Orhan Pamuk have raised questions about whether the Nobel committee is overly influenced by politics.

While Mr. Transtromer has been a longtime favorite to win the Nobel, he has also won other prizes, including the Neustadt International Prize for Literature, the Bonnier Award for Poetry, the Petrarch Prize in Germany and the Bellman Prize.

The Nobel Prize comes with an honorarium of nearly $1.5 million.

2045: The Year Man Becomes Immortal

Posted: August 8, 2011 by PaanLuel Wël in Science

Time Magazine: On Feb. 15, 1965, a diffident but self-possessed high school student named Raymond Kurzweil appeared as a guest on a game show called I’ve Got a Secret. He was introduced by the host, Steve Allen, then he played a short musical composition on a piano. The idea was that Kurzweil was hiding an unusual fact and the panelists — they included a comedian and a former Miss America — had to guess what it was.


On the show (see the clip on YouTube), the beauty queen did a good job of grilling Kurzweil, but the comedian got the win: the music was composed by a computer. Kurzweil got $200.

Kurzweil then demonstrated the computer, which he built himself — a desk-size affair with loudly clacking relays, hooked up to a typewriter. The panelists were pretty blasé about it; they were more impressed by Kurzweil’s age than by anything he’d actually done. They were ready to move on to Mrs. Chester Loney of Rough and Ready, Calif., whose secret was that she’d been President Lyndon Johnson’s first-grade teacher.

But Kurzweil would spend much of the rest of his career working out what his demonstration meant. Creating a work of art is one of those activities we reserve for humans and humans only. It’s an act of self-expression; you’re not supposed to be able to do it if you don’t have a self. To see creativity, the exclusive domain of humans, usurped by a computer built by a 17-year-old is to watch a line blur that cannot be unblurred, the line between organic intelligence and artificial intelligence.

That was Kurzweil’s real secret, and back in 1965 nobody guessed it. Maybe not even him, not yet. But now, 46 years later, Kurzweil believes that we’re approaching a moment when computers will become intelligent, and not just intelligent but more intelligent than humans. When that happens, humanity — our bodies, our minds, our civilization — will be completely and irreversibly transformed. He believes that this moment is not only inevitable but imminent. According to his calculations, the end of human civilization as we know it is about 35 years away.

Computers are getting faster. Everybody knows that. Also, computers are getting faster faster — that is, the rate at which they’re getting faster is increasing.

True? True.
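As a toy illustration of that claim (every number here is invented for the sketch; none comes from the article), growth that doubles at ever-shorter intervals outruns plain exponential growth:

```python
# Toy model of "getting faster faster": speed doubles each step, and the
# wait between doublings shrinks each time. All numbers are invented for
# illustration; none comes from the article.
def growth(steps, first_interval=1.0, shrink=0.9):
    speed, elapsed, interval = 1.0, 0.0, first_interval
    for _ in range(steps):
        elapsed += interval    # wait for the next doubling
        speed *= 2.0           # the doubling itself
        interval *= shrink     # and the next wait is shorter
    return elapsed, speed

elapsed, speed = growth(10)
print(speed, elapsed)  # 1024.0 after roughly 6.5 time units, not 10
```

Ten doublings still yield a 1,024-fold speedup, but they arrive in well under ten fixed-length intervals, which is the shape of the curve Kurzweil's argument rests on.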

So if computers are getting so much faster, so incredibly fast, there might conceivably come a moment when they are capable of something comparable to human intelligence. Artificial intelligence. All that horsepower could be put in the service of emulating whatever it is our brains are doing when they create consciousness — not just doing arithmetic very quickly or composing piano music but also driving cars, writing books, making ethical decisions, appreciating fancy paintings, making witty observations at cocktail parties.

If you can swallow that idea, and Kurzweil and a lot of other very smart people can, then all bets are off. From that point on, there’s no reason to think computers would stop getting more powerful. They would keep on developing until they were far more intelligent than we are. Their rate of development would also continue to increase, because they would take over their own development from their slower-thinking human creators. Imagine a computer scientist that was itself a super-intelligent computer. It would work incredibly quickly. It could draw on huge amounts of data effortlessly. It wouldn’t even take breaks to play Farmville.

Probably. It’s impossible to predict the behavior of these smarter-than-human intelligences with which (with whom?) we might one day share the planet, because if you could, you’d be as smart as they would be. But there are a lot of theories about it. Maybe we’ll merge with them to become super-intelligent cyborgs, using computers to extend our intellectual abilities the same way that cars and planes extend our physical abilities. Maybe the artificial intelligences will help us treat the effects of old age and prolong our life spans indefinitely. Maybe we’ll scan our consciousnesses into computers and live inside them as software, forever, virtually. Maybe the computers will turn on humanity and annihilate us. The one thing all these theories have in common is the transformation of our species into something that is no longer recognizable as such to humanity circa 2011. This transformation has a name: the Singularity.

The difficult thing to keep sight of when you’re talking about the Singularity is that even though it sounds like science fiction, it isn’t, no more than a weather forecast is science fiction. It’s not a fringe idea; it’s a serious hypothesis about the future of life on Earth. There’s an intellectual gag reflex that kicks in anytime you try to swallow an idea that involves super-intelligent immortal cyborgs, but suppress it if you can, because while the Singularity appears to be, on the face of it, preposterous, it’s an idea that rewards sober, careful evaluation.

People are spending a lot of money trying to understand it. The three-year-old Singularity University, which offers inter-disciplinary courses of study for graduate students and executives, is hosted by NASA. Google was a founding sponsor; its CEO and co-founder Larry Page spoke there last year. People are attracted to the Singularity for the shock value, like an intellectual freak show, but they stay because there’s more to it than they expected. And of course, in the event that it turns out to be real, it will be the most important thing to happen to human beings since the invention of language.

The Singularity isn’t a wholly new idea, just newish. In 1965 the British mathematician I.J. Good described something he called an “intelligence explosion”:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an “intelligence explosion,” and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.
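Good’s feedback loop has a simple quantitative flavor that a toy model makes vivid. In the Python sketch below, every number is an illustrative assumption rather than anything from Good’s paper: each machine generation doubles in capability and, being smarter, completes the next redesign in half the time, so capability explodes while total elapsed time stays bounded.

```python
# Toy model of Good's "intelligence explosion". Illustrative assumptions:
# generation n is twice as capable as generation n-1, and the smarter
# machine halves the time needed to design its successor.
capability = 1.0    # arbitrary units of machine intelligence
design_time = 1.0   # years the first machine needs to redesign itself
elapsed = 0.0       # total years spent so far

for generation in range(1, 21):
    elapsed += design_time   # finish the current redesign cycle
    capability *= 2          # each generation doubles in capability...
    design_time /= 2         # ...and halves the next design cycle

print(f"after 20 generations: capability x{capability:,.0f}, "
      f"elapsed = {elapsed:.4f} years")
```

Under these made-up numbers, elapsed time converges toward 2 years (a geometric series) even as capability passes a millionfold, which captures the sense in which the explosion is a “singularity”: the runaway happens in finite time.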

The word singularity is borrowed from astrophysics: it refers to a point in space-time — for example, inside a black hole — at which the rules of ordinary physics do not apply. In the 1980s the science-fiction novelist Vernor Vinge attached it to Good’s intelligence-explosion scenario. At a NASA symposium in 1993, Vinge announced that “within 30 years, we will have the technological means to create super-human intelligence. Shortly after, the human era will be ended.”

By that time Kurzweil was thinking about the Singularity too. He’d been busy since his appearance on I’ve Got a Secret. He’d made several fortunes as an engineer and inventor; he founded and then sold his first software company while he was still at MIT. He went on to build the first print-to-speech reading machine for the blind — Stevie Wonder was customer No. 1 — and made innovations in a range of technical fields, including music synthesizers and speech recognition. He holds 39 patents and 19 honorary doctorates. In 1999 President Bill Clinton awarded him the National Medal of Technology.

But Kurzweil was also pursuing a parallel career as a futurist: he has been publishing his thoughts about the future of human and machine-kind for 20 years, most recently in The Singularity Is Near, which was a best seller when it came out in 2005. A documentary by the same name, starring Kurzweil, Tony Robbins and Alan Dershowitz, among others, was released in January. (Kurzweil is actually the subject of two current documentaries. The other one, less authorized but more informative, is called The Transcendent Man.) Bill Gates has called him “the best person I know at predicting the future of artificial intelligence.”

In real life, the transcendent man is an unimposing figure who could pass for Woody Allen’s even nerdier younger brother. Kurzweil grew up in Queens, N.Y., and you can still hear a trace of it in his voice. Now 62, he speaks with the soft, almost hypnotic calm of someone who gives 60 public lectures a year. As the Singularity’s most visible champion, he has heard all the questions and faced down the incredulity many, many times before. He’s good-natured about it. His manner is almost apologetic: I wish I could bring you less exciting news of the future, but I’ve looked at the numbers, and this is what they say, so what else can I tell you?

Kurzweil’s interest in humanity’s cyborganic destiny began about 1980 largely as a practical matter. He needed ways to measure and track the pace of technological progress. Even great inventions can fail if they arrive before their time, and he wanted to make sure that when he released his, the timing was right. “Even at that time, technology was moving quickly enough that the world was going to be different by the time you finished a project,” he says. “So it’s like skeet shooting — you can’t shoot at the target.” He knew about Moore’s law, of course, which states that the number of transistors you can put on a microchip doubles about every two years. It’s a surprisingly reliable rule of thumb. Kurzweil tried plotting a slightly different curve: the change over time in the amount of computing power, measured in MIPS (millions of instructions per second), that you can buy for $1,000.

As it turned out, Kurzweil’s numbers looked a lot like Moore’s. They doubled every couple of years. Drawn as graphs, they both made exponential curves, with their value increasing by multiples of two instead of by regular increments in a straight line. The curves held eerily steady, even when Kurzweil extended his backward through the decades of pretransistor computing technologies like relays and vacuum tubes, all the way back to 1900.
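The shape of the curves Kurzweil was plotting is easy to reproduce. In the minimal Python sketch below, the starting value, the two-year doubling period and the linear increment are stand-in assumptions, not Kurzweil’s actual data; only the contrast between the two growth regimes matters.

```python
# Exponential vs. linear growth over the same 40 years, in the spirit
# of Kurzweil's price-performance curves. All numbers are illustrative.

def exponential(start, doubling_period_years, years):
    """Value after `years` if the quantity doubles every doubling period."""
    return start * 2 ** (years / doubling_period_years)

def linear(start, increment_per_year, years):
    """Value after `years` if the quantity grows by a fixed step each year."""
    return start + increment_per_year * years

start = 1.0  # arbitrary units of computing power per $1,000
for years in (0, 10, 20, 40):
    exp_val = exponential(start, 2, years)  # doubles every 2 years
    lin_val = linear(start, 1.0, years)     # adds 1 unit per year
    print(f"{years:>2} yrs: exponential={exp_val:>12,.0f}  linear={lin_val:>5.0f}")
```

Over 40 years the exponential series has grown about a millionfold (2^20) while the linear one has merely reached 41 times its starting value, which is roughly the intuition gap between exponential reality and our linear expectations.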

Kurzweil then ran the numbers on a whole bunch of other key technological indexes — the falling cost of manufacturing transistors, the rising clock speed of microprocessors, the plummeting price of dynamic RAM. He looked even further afield at trends in biotech and beyond — the falling cost of sequencing DNA and of wireless data service and the rising numbers of Internet hosts and nanotechnology patents. He kept finding the same thing: exponentially accelerating progress. “It’s really amazing how smooth these trajectories are,” he says. “Through thick and thin, war and peace, boom times and recessions.” Kurzweil calls it the law of accelerating returns: technological progress happens exponentially, not linearly.

Then he extended the curves into the future, and the growth they predicted was so phenomenal, it created cognitive resistance in his mind. Exponential curves start slowly, then rocket skyward toward infinity. According to Kurzweil, we’re not evolved to think in terms of exponential growth. “It’s not intuitive. Our built-in predictors are linear. When we’re trying to avoid an animal, we pick the linear prediction of where it’s going to be in 20 seconds and what to do about it. That is actually hardwired in our brains.”

Here’s what the exponential curves told him. We will successfully reverse-engineer the human brain by the mid-2020s. By the end of that decade, computers will be capable of human-level intelligence. Kurzweil puts the date of the Singularity — never say he’s not conservative — at 2045. In that year, he estimates, given the vast increases in computing power and the vast reductions in the cost of same, the quantity of artificial intelligence created will be about a billion times the sum of all the human intelligence that exists today.

The Singularity isn’t just an idea. It attracts people, and those people feel a bond with one another. Together they form a movement, a subculture; Kurzweil calls it a community. Once you decide to take the Singularity seriously, you will find that you have become part of a small but intense and globally distributed hive of like-minded thinkers known as Singularitarians.

Not all of them are Kurzweilians, not by a long chalk. There’s room inside Singularitarianism for considerable diversity of opinion about what the Singularity means and when and how it will or won’t happen. But Singularitarians share a worldview. They think in terms of deep time, they believe in the power of technology to shape history, they have little interest in the conventional wisdom about anything, and they cannot believe you’re walking around living your life and watching TV as if the artificial-intelligence revolution were not about to erupt and change absolutely everything. They have no fear of sounding ridiculous; your ordinary citizen’s distaste for apparently absurd ideas is just an example of irrational bias, and Singularitarians have no truck with irrationality. When you enter their mind-space you pass through an extreme gradient in worldview, a hard ontological shear that separates Singularitarians from the common run of humanity. Expect turbulence.

In addition to the Singularity University, which Kurzweil co-founded, there’s also a Singularity Institute for Artificial Intelligence, based in San Francisco. It counts among its advisers Peter Thiel, a former CEO of PayPal and an early investor in Facebook. The institute holds an annual conference called the Singularity Summit. (Kurzweil co-founded that too.) Because of the highly interdisciplinary nature of Singularity theory, it attracts a diverse crowd. Artificial intelligence is the main event, but the sessions also cover the galloping progress of, among other fields, genetics and nanotechnology.

At the 2010 summit, which took place in August in San Francisco, there were not just computer scientists but also psychologists, neuroscientists, nanotechnologists, molecular biologists, a specialist in wearable computers, a professor of emergency medicine, an expert on cognition in gray parrots and the professional magician and debunker James “the Amazing” Randi. The atmosphere was a curious blend of Davos and UFO convention. Proponents of seasteading — the practice, so far mostly theoretical, of establishing politically autonomous floating communities in international waters — handed out pamphlets. An android chatted with visitors in one corner.

After artificial intelligence, the most talked-about topic at the 2010 summit was life extension. Biological boundaries that most people think of as permanent and inevitable Singularitarians see as merely intractable but solvable problems. Death is one of them. Old age is an illness like any other, and what do you do with illnesses? You cure them. Like a lot of Singularitarian ideas, it sounds funny at first, but the closer you get to it, the less funny it seems. It’s not just wishful thinking; there’s actual science going on here.

For example, it’s well known that one cause of the physical degeneration associated with aging involves telomeres, which are segments of DNA found at the ends of chromosomes. Every time a cell divides, its telomeres get shorter, and once a cell runs out of telomeres, it can’t reproduce anymore and dies. But there’s an enzyme called telomerase that reverses this process; it’s one of the reasons cancer cells live so long. So why not treat regular non-cancerous cells with telomerase? In November, researchers at Harvard Medical School announced in Nature that they had done just that. They administered telomerase to a group of mice suffering from age-related degeneration. The damage went away. The mice didn’t just get better; they got younger.

Aubrey de Grey is one of the world’s best-known life-extension researchers and a Singularity Summit veteran. A British biologist with a doctorate from Cambridge and a famously formidable beard, de Grey runs a foundation called SENS, or Strategies for Engineered Negligible Senescence. He views aging as a process of accumulating damage, which he has divided into seven categories, each of which he hopes to one day address using regenerative medicine. “People have begun to realize that the view of aging being something immutable — rather like the heat death of the universe — is simply ridiculous,” he says. “It’s just childish. The human body is a machine that has a bunch of functions, and it accumulates various types of damage as a side effect of the normal function of the machine. Therefore in principle that damage can be repaired periodically. This is why we have vintage cars. It’s really just a matter of paying attention. The whole of medicine consists of messing about with what looks pretty inevitable until you figure out how to make it not inevitable.”

Kurzweil takes life extension seriously too. His father, with whom he was very close, died of heart disease at 58. Kurzweil inherited his father’s genetic predisposition; he also developed Type 2 diabetes when he was 35. Working with Terry Grossman, a doctor who specializes in longevity medicine, Kurzweil has published two books on his own approach to life extension, which involves taking up to 200 pills and supplements a day. He says his diabetes is essentially cured, and although he’s 62 years old from a chronological perspective, he estimates that his biological age is about 20 years younger.


But his goal differs slightly from de Grey’s. For Kurzweil, it’s not so much about staying healthy as long as possible; it’s about staying alive until the Singularity. It’s an attempted handoff. Once hyper-intelligent artificial intelligences arise, armed with advanced nanotechnology, they’ll really be able to wrestle with the vastly complex, systemic problems associated with aging in humans. Alternatively, by then we’ll be able to transfer our minds to sturdier vessels such as computers and robots. He and many other Singularitarians take seriously the proposition that many people who are alive today will wind up being functionally immortal.

It’s an idea that’s radical and ancient at the same time. In “Sailing to Byzantium,” W.B. Yeats describes mankind’s fleshly predicament as a soul fastened to a dying animal. Why not unfasten it and fasten it to an immortal robot instead? But Kurzweil finds that life extension produces even more resistance in his audiences than his exponential growth curves. “There are people who can accept computers being more intelligent than people,” he says. “But the idea of significant changes to human longevity — that seems to be particularly controversial. People invested a lot of personal effort into certain philosophies dealing with the issue of life and death. I mean, that’s the major reason we have religion.”

Of course, a lot of people think the Singularity is nonsense — a fantasy, wishful thinking, a Silicon Valley version of the Evangelical story of the Rapture, spun by a man who earns his living making outrageous claims and backing them up with pseudoscience. Most of the serious critics focus on the question of whether a computer can truly become intelligent.

The entire field of artificial intelligence, or AI, is devoted to this question. But AI doesn’t currently produce the kind of intelligence we associate with humans or even with talking computers in movies — HAL or C3PO or Data. Actual AIs tend to be able to master only one highly specific domain, like interpreting search queries or playing chess. They operate within an extremely specific frame of reference. They don’t make conversation at parties. They’re intelligent, but only if you define intelligence in a vanishingly narrow way. The kind of intelligence Kurzweil is talking about, which is called strong AI or artificial general intelligence, doesn’t exist yet.

Why not? Obviously we’re still waiting on all that exponentially growing computing power to get here. But it’s also possible that there are things going on in our brains that can’t be duplicated electronically no matter how many MIPS you throw at them. The neurochemical architecture that generates the ephemeral chaos we know as human consciousness may just be too complex and analog to replicate in digital silicon. The biologist Dennis Bray was one of the few voices of dissent at last summer’s Singularity Summit. “Although biological components act in ways that are comparable to those in electronic circuits,” he argued, in a talk titled “What Cells Can Do That Robots Can’t,” “they are set apart by the huge number of different states they can adopt. Multiple biochemical processes create chemical modifications of protein molecules, further diversified by association with distinct structures at defined locations of a cell. The resulting combinatorial explosion of states endows living systems with an almost infinite capacity to store information regarding past and present conditions and a unique capacity to prepare for future events.” That makes the ones and zeros that computers trade in look pretty crude.

Underlying the practical challenges are a host of philosophical ones. Suppose we did create a computer that talked and acted in a way that was indistinguishable from a human being — in other words, a computer that could pass the Turing test. (Very loosely speaking, such a computer would be able to pass as human in a blind test.) Would that mean that the computer was sentient, the way a human being is? Or would it just be an extremely sophisticated but essentially mechanical automaton without the mysterious spark of consciousness — a machine with no ghost in it? And how would we know?

Even if you grant that the Singularity is plausible, you’re still staring at a thicket of unanswerable questions. If I can scan my consciousness into a computer, am I still me? What are the geopolitics and the socioeconomics of the Singularity? Who decides who gets to be immortal? Who draws the line between sentient and nonsentient? And as we approach immortality, omniscience and omnipotence, will our lives still have meaning? By beating death, will we have lost our essential humanity?

Kurzweil admits that there’s a fundamental level of risk associated with the Singularity that’s impossible to refine away, simply because we don’t know what a highly advanced artificial intelligence, finding itself a newly created inhabitant of the planet Earth, would choose to do. It might not feel like competing with us for resources. One of the goals of the Singularity Institute is to make sure not just that artificial intelligence develops but also that the AI is friendly. You don’t have to be a super-intelligent cyborg to understand that introducing a superior life-form into your own biosphere is a basic Darwinian error.

If the Singularity is coming, these questions are going to get answers whether we like it or not, and Kurzweil thinks that trying to put off the Singularity by banning technologies is not only impossible but also unethical and probably dangerous. “It would require a totalitarian system to implement such a ban,” he says. “It wouldn’t work. It would just drive these technologies underground, where the responsible scientists who we’re counting on to create the defenses would not have easy access to the tools.”

Kurzweil is an almost inhumanly patient and thorough debater. He relishes it. He’s tireless in hunting down his critics so that he can respond to them, point by point, carefully and in detail.

Take the question of whether computers can replicate the biochemical complexity of an organic brain. Kurzweil yields no ground there whatsoever. He does not see any fundamental difference between flesh and silicon that would prevent the latter from thinking. He defies biologists to come up with a neurological mechanism that could not be modeled or at least matched in power and flexibility by software running on a computer. He refuses to fall on his knees before the mystery of the human brain. “Generally speaking,” he says, “the core of a disagreement I’ll have with a critic is, they’ll say, Oh, Kurzweil is underestimating the complexity of reverse-engineering of the human brain or the complexity of biology. But I don’t believe I’m underestimating the challenge. I think they’re underestimating the power of exponential growth.”

This position doesn’t make Kurzweil an outlier, at least among Singularitarians. Plenty of people make more-extreme predictions. Since 2005 the neuroscientist Henry Markram has been running an ambitious initiative at the Brain Mind Institute of the Ecole Polytechnique Fédérale de Lausanne in Switzerland. It’s called the Blue Brain project, and it’s an attempt to create a neuron-by-neuron simulation of a mammalian brain, using IBM’s Blue Gene super-computer. So far, Markram’s team has managed to simulate one neocortical column from a rat’s brain, which contains about 10,000 neurons. Markram has said that he hopes to have a complete virtual human brain up and running in 10 years. (Even Kurzweil sniffs at this. If it worked, he points out, you’d then have to educate the brain, and who knows how long that would take?)

By definition, the future beyond the Singularity is not knowable by our linear, chemical, animal brains, but Kurzweil is teeming with theories about it. He positively flogs himself to think bigger and bigger; you can see him kicking against the confines of his aging organic hardware. “When people look at the implications of ongoing exponential growth, it gets harder and harder to accept,” he says. “So you get people who really accept, yes, things are progressing exponentially, but they fall off the horse at some point because the implications are too fantastic. I’ve tried to push myself to really look.”

In Kurzweil’s future, biotechnology and nanotechnology give us the power to manipulate our bodies and the world around us at will, at the molecular level. Progress hyperaccelerates, and every hour brings a century’s worth of scientific breakthroughs. We ditch Darwin and take charge of our own evolution. The human genome becomes just so much code to be bug-tested and optimized and, if necessary, rewritten. Indefinite life extension becomes a reality; people die only if they choose to. Death loses its sting once and for all. Kurzweil hopes to bring his dead father back to life.

We can scan our consciousnesses into computers and enter a virtual existence or swap our bodies for immortal robots and light out for the edges of space as intergalactic godlings. Within a matter of centuries, human intelligence will have re-engineered and saturated all the matter in the universe. This is, Kurzweil believes, our destiny as a species.

Or it isn’t. When the big questions get answered, a lot of the action will happen where no one can see it, deep inside the black silicon brains of the computers, which will either bloom bit by bit into conscious minds or just continue in ever more brilliant and powerful iterations of nonsentience.

But as for the minor questions, they’re already being decided all around us and in plain sight. The more you read about the Singularity, the more you start to see it peeking out at you, coyly, from unexpected directions. Five years ago we didn’t have 600 million humans carrying out their social lives over a single electronic network. Now we have Facebook. Five years ago you didn’t see people double-checking what they were saying and where they were going, even as they were saying it and going there, using handheld network-enabled digital prosthetics. Now we have iPhones. Is it an unimaginable step to take the iPhones out of our hands and put them into our skulls?

Already 30,000 patients with Parkinson’s disease have neural implants. Google is experimenting with computers that can drive cars. There are more than 2,000 robots fighting in Afghanistan alongside the human troops. This month a game show will once again figure in the history of artificial intelligence, but this time the computer will be the guest: an IBM super-computer nicknamed Watson will compete on Jeopardy! Watson runs on 90 servers and takes up an entire room, and in a practice match in January it finished ahead of two former champions, Ken Jennings and Brad Rutter. It got every question it answered right, but much more important, it didn’t need help understanding the questions (or, strictly speaking, the answers), which were phrased in plain English. Watson isn’t strong AI, but if strong AI happens, it will arrive gradually, bit by bit, and this will have been one of the bits.

A hundred years from now, Kurzweil and de Grey and the others could be the 22nd century’s answer to the Founding Fathers — except unlike the Founding Fathers, they’ll still be alive to get credit — or their ideas could look as hilariously retro and dated as Disney’s Tomorrowland. Nothing gets old as fast as the future.

But even if they’re dead wrong about the future, they’re right about the present. They’re taking the long view and looking at the big picture. You may reject every specific article of the Singularitarian charter, but you should admire Kurzweil for taking the future seriously. Singularitarianism is grounded in the idea that change is real and that humanity is in charge of its own fate and that history might not be as simple as one damn thing after another. Kurzweil likes to point out that your average cell phone is about a millionth the size of, a millionth the price of and a thousand times more powerful than the computer he had at MIT 40 years ago. Flip that forward 40 years and what does the world look like? If you really want to figure that out, you have to think very, very far outside the box. Or maybe you have to think further inside it than anyone ever has before.
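Kurzweil’s cell-phone comparison can be turned into a quick back-of-envelope calculation. The factors in the Python sketch below come straight from the passage (a millionth the price, a thousand times the power, 40 years); the implied doubling time is an inference for illustration, not a figure from the article.

```python
import math

# Back-of-envelope check of the cell-phone comparison: a millionth the
# price and a thousand times the power of a circa-1970 MIT computer.
price_factor = 1e6   # cost reduction over the period
power_factor = 1e3   # performance gain over the period
years = 40

price_performance_gain = price_factor * power_factor   # 1e9 overall
doublings = math.log2(price_performance_gain)          # ~29.9 doublings
doubling_time_months = years * 12 / doublings

print(f"overall price-performance gain: {price_performance_gain:.0e}")
print(f"implied doubling time: {doubling_time_months:.1f} months")
```

A billionfold gain in price-performance over 40 years works out to about 30 doublings, i.e. a doubling roughly every 16 months, comfortably in the neighborhood of Moore’s law.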

The growing notion that for any policy, including social policy, to be successful it must be evidence-based or evidence-informed has been shattered by Uganda’s success story of a largely well-controlled and well-managed HIV and AIDS epidemic. John Kinsman, in this groundbreaking book, which boldly departs from the usual evidence-is-supreme narrative, argues that Uganda’s spectacular ABC (abstinence, behaviour change and condoms) strategy originated from no more than a “hunch”. Intrigued by what could actually have guided AIDS control in Uganda, the author investigated a flagship research project that was set up in the early days of the epidemic to guide policy.

The research, the Masaka Intervention Trials, which sought to promote delay in sexual debut among youngsters, zero-grazing (limiting the number of sexual partners) and treatment of STDs, and which was conducted from 1994 to 1998, found that HIV incidence was identical in both intervention and control populations. Simply and bluntly put, the much-coveted trial interventions had failed remarkably. And yet, as argued by many (scientists, politicians and ordinary people alike), since there was a general trend of improvement in the HIV and AIDS situation over time, Uganda must have been doing something right, and the “right thing” was precisely these interventions. Uganda’s HIV prevalence fell from as high as 39% in some places in the late 1980s to just 6% nationwide in 2004.

Clearly these interventions must have played a key role, but the research had shown that they had not. Part of the problem was “too much epidemiology and too little social science” in the research, as pointed out by Maxine Ankrah, a sociologist. Her observation echoed what Albert Einstein had observed earlier about scientific research in general: “not everything that can be counted counts, but not everything that counts can be counted”. Kinsman argues that AIDS policy in Uganda was heavily influenced by the dominant international and national ideological contexts of the day, with pragmatism ultimately defining policy decisions. Evidence, including negative evidence, was simply interpreted to support prior policy positions.

Kinsman’s tacit conclusion is that politics and pragmatism can assume primacy over evidence and can still determine successful policy, and that evidence-based policy making, especially where evidence is equated with that from Randomised Controlled Trials, regarded by some as the gold standard, can be of limited use in a complex, socially and sexually dynamic epidemic such as HIV and AIDS.

But pragmatism and politics are not the only determinants of policy. As was the case in Uganda, two other key factors, perhaps typical of Africa (as Kinsman implies), have been critical in shaping policy. These are big men (powerful western personalities backed by powerful institutions and aid money, and powerful national and local politicians) and external aid money with strings attached to certain policies.

Thus Kinsman’s subtitle, “…the making of an African success story…”, is very telling. It is not suggesting that the failure of an elaborate and “rigorous” research programme to guide policy is typically African; that can happen in western and other developed countries too.

What Kinsman calls “African” refers to the role of “big men”, a description that western authors and critics use derogatorily of Africa’s political leaders, but which Kinsman extends to powerful characters from the west who participated as policy actors in Africa. These big men from the west used aid money to create, shape and reshape the Anti-Retroviral Therapy (ART), condom and abstinence policies in Uganda with no reference to evidence at all.

The lesson the book teaches us all is that for a socially and sexually determined epidemic such as HIV/AIDS, the linkages in collaboration, cooperation and investment of resources must be multilevel: at the global, institutional, patient-doctor and individual patient levels. The Masaka study in Uganda, like most donor-driven studies in Africa, was dominated by external actors who relegated local experts and subjects to subordinate roles.

The resultant resentment and resistance, albeit largely masked, to the well-paid and well-fed foreign researchers was only predictable. Donor-funded research in Africa has become an industry far removed from the realities of the daily suffering of local people. Thus it is common to find a world-class, ultra-modern donor-funded research unit side by side with appalling and dilapidated health care infrastructure. The perennial excuse that a lack of technical capacity is responsible for the failure of African institutions and governments to absorb and translate research into policy could be overcome if a requisite proportion of donor research funding were invested in uplifting this national “technical capacity”.

What cannot be missed from the book is the interesting but ultimately perfunctory ideological divide, with people (scientists, politicians and others) on either side arguing furiously and endlessly about the relative roles of abstinence and faithfulness versus condoms in reducing Uganda’s HIV prevalence. The debate was inconclusive, and it was prematurely and conveniently brought to a close through a widely distributed letter published in the Lancet in 2004, signed by over 200 scientists and prominent politicians, including President Museveni of Uganda. The letter advised that the debate be stopped and that it was time to move on.

Hon Dr Sam Agatre Okuonzi MP, MD, PhD is a member of Uganda’s parliament.