The Obama administration’s Department of Energy, led by Steven Chu, has taken a “portfolio” approach to easing the country into a future in which we’re less reliant on fossil fuels. Instead of betting on a single technology to solve all our problems, the DOE has been pushing a mix of renewables, efficiency measures, and nuclear power. Having licensed the first new nuclear plant in decades, the DOE has now reached agreements with companies that are trying to develop an alternative to these large facilities.
Rather than building large, gigawatt-scale reactor buildings, several companies are developing what are termed small, modular nuclear reactors, which produce a few hundred megawatts of power. These are typically designed to be sealed units that simply deliver heat, for use either directly or to generate electricity. When the fuel starts to run down, the reactors will be shipped back to a central facility for refueling. Since they will never be opened on site, many of the issues associated with large plants don’t come into play.
The new agreements, set up with Hyperion Power Generation, SMR, and NuScale Power, will give the companies access to the DOE’s Savannah River National Lab, with the intention of having them develop sites there for a test installation. Ultimately, the test installations are intended to provide data that will go into the licensing of these new designs. Chu, in announcing the agreement, stated, “We are committed to restarting the nation’s nuclear industry and advancing the next generation of these technologies.”
We’ll be running a feature on the future of nuclear power in the US early next week.
Marijuana: legalize it or not? Many states are debating whether to legalize the recreational or medical use of the drug. The arguments that marijuana is addictive and can act as a gateway to harder drugs are the top reasons legalization probably hasn’t happened yet. Daily Infographic
UAH Global Temperature Update for February 2012: -0.12 deg. C
March 2nd, 2012 by Roy W. Spencer, Ph.D.
The global average lower tropospheric temperature anomaly cooled a little more in February 2012, again not unexpected given the current La Niña conditions in the tropical Pacific Ocean.
Peter Ferrara, Contributor
About every four years, the United Nations’ Intergovernmental Panel on Climate Change (IPCC) produces a voluminous Assessment Report (AR) on the state of global warming science, such as it is. Two years after each AR, the IPCC produces an updating Interim Report.
In 2008, The Heartland Institute, headquartered in Chicago, began organizing international conferences of scientists from across the globe who want to raise and discuss intellectually troubling questions and doubts regarding the theory that human activity is causing ultimately catastrophic global warming. Six conferences have taken place to date, attracting more than 3,000 scientists, journalists, and interested citizens from all over the world.
(Full disclosure: As indicated by my nearby bio, I am a Heartland Senior Fellow, one of several affiliations I have with free-market think tanks and advocacy groups.)
In 2009, Heartland published Climate Change Reconsidered: The Report of the Nongovernmental International Panel on Climate Change (NIPCC). That 860-page careful, dispassionate, thoroughly scientific volume, produced in conjunction with the Science and Environmental Policy Project (SEPP) and the Center for the Study of Carbon Dioxide and Global Change, explored the full range of alternative views to the UN’s IPCC. Two years later, Heartland published the 418-page Climate Change Reconsidered: The 2011 Interim Report of the NIPCC, which updated the research regarding global warming and “climate change” since the 2009 volume.
Through these activities and more like them, Heartland has become the international headquarters of the scientific alternative to the UN’s IPCC, now providing full-scale rebuttals to the UN’s own massive reports. Any speaker, any authority, any journalist or bureaucrat asserting the catastrophic danger of supposed man-caused global warming needs to be asked for their response to Climate Change Reconsidered. If they have none, then they are not qualified to address the subject.
This is the essential background to understanding “Fakegate,” the strange and still-unfolding story of the decline and fall of political activist Peter Gleick, who had successfully engineered a long career posing as an objective climate scientist. Gleick, who has announced he is taking a “temporary, short-term leave of absence” as president of the Pacific Institute, also served until recently as chairman of the science integrity task force of the American Geophysical Union.
Gleick has publicly confessed that he contacted The Heartland Institute fraudulently pretending to be a member of the Board of Directors. Emails released by The Heartland Institute show that he created an email address similar to that of a board member and used it to convince a staff member to send him confidential board materials. Gleick then forwarded the documents to 15 global warming alarmist advocacy organizations and sympathetic journalists, who immediately posted them online and blogged and wrote about them.
Their expectation apparently was that the documents would be as embarrassing and damaging to the global warming skeptics as were the emails revealed in the “Climategate” scandal to the alarmist side. The Climategate revelations showed scientific leaders of the UN’s IPCC and global warming alarmist movement plotting to falsify climate data and exclude those raising doubts about their theories from scientific publications, while coordinating their message with supposedly objective mainstream journalists.
But the stolen Heartland documents exonerated, rather than embarrassed, the skeptic movement. They demonstrate only an interest at Heartland in getting the truth out on the actual objective science. They revealed little funding from oil companies and other self-interested commercial enterprises, who actually contribute heavily to global warming alarmists as protection money instead. The documents also show how poorly funded the global warming skeptics at Heartland are, managing on a shoestring to raise a shockingly successful global challenge to the heavily overfunded UN and politicized government science.
As the Wall Street Journal observed on Feb. 21, while Heartland’s budget for the NIPCC this year totals $388,000, that compares to $6.5 million for the UN’s IPCC, and $2.5 billion that President Obama’s budget commits for research into “the global changes that have resulted primarily from global over-dependence on fossil fuels.” That demonstrates how an ounce of truth can overcome a tidal wave of falsehood.
Maybe that is why Gleick or one of his co-conspirators felt compelled to go further and compose a fake memo titled “Confidential Memo: 2012 Heartland Climate Strategy.” Whoever did it understood that a document composed on his computer and distributed online would contain markings demonstrating its source and confirming the forgery, so they printed it out and scanned it to hide its digital trail. The scanned document itself, however, contained evidence that allowed even amateur sleuths to trace it back to the Pacific Institute’s offices, as explained in an article by Megan McArdle, a senior editor for The Atlantic. (McArdle, incidentally, is highly sympathetic to global warming alarmism.)
The forged cover memo, not the actual stolen documents, contains language mirroring Climategate. It discusses fabricated projects that are not activities of Heartland and references a $200,000 Koch Foundation contribution for climate change activities that doesn’t exist. The Koch Foundation confirms that it gave Heartland only $25,000 in 2011, earmarked for health care policy projects and not climate change, an amount equal to only 0.5% of Heartland’s 2011 budget. By contrast, as the Journal also observed, the budget last year for the Natural Resources Defense Council was $95.4 million, and for the World Wildlife Fund $238.5 million.
Heartland President Joe Bast said in a statement on the episode, “The stolen documents were obtained by [a then] unknown person who fraudulently assumed the identity of a Heartland board member….Identity theft and computer fraud are criminal offenses subject to imprisonment. We intend to find this person and see him or her put in prison for these crimes.”
While I am not a scientist, and write primarily on economics, tax policy and budget issues, I have been fascinated over the years by Heartland’s work on climate change. I’ve attended the Heartland global warming conferences and read through the organization’s publications on the issue. What has fascinated me is how the objective, dispassionate scientific presentations so thoroughly demolish the intellectual case for catastrophic man-caused global warming. In contrast, as the comments to this article will no doubt show, the case for catastrophic global warming is no more than appeals to authority (“the United Nations says it’s true!”) or ad hominem attacks.
The bottom line is that the temperature records are not consistent with the theory that human “greenhouse” gas emissions are the primary cause of global warming. Those records do not show temperatures rising in conjunction with such ever rising emissions as the globe increasingly industrializes. Instead, the temperature record shows an up and down pattern that follows the pattern of natural influences on global temperatures, such as cyclical sunspots and solar flares, and cycles of ocean churning from warmer to colder temperatures and back, such as the Pacific Decadal Oscillation (PDO).
Moreover, the incorruptible and objective satellite temperature records show only modest warming starting in the late 1970s, which stopped roughly 10 years ago, with more recent declines. That is consistent with temperature proxy records found in nature, such as tree rings and ice cores. But that diverges significantly from the corruptible and subjectively compiled land based records, the repeated manipulation of which has prompted several prominent climate scientists to call for an investigation. Perhaps Gleick’s skills in falsification can be found more broadly among his colleagues.
In addition, the work of the UN’s IPCC is based on numerous climate models that attempt to project temperatures decades into the future. Those models are all based on the circular assumption that the theory of man caused global warming is true. As 16 world leading climate scientists recently reported in a letter to the Wall Street Journal,
“[A]n important gauge of scientific expertise is the ability to make successful predictions. When predictions fail, we say that the theory is ‘falsified’ and we should look for the reasons for the failure. Shown in the nearby graph is the measured annual temperature of the earth since 1989, just before the first report of the Intergovernmental Panel on Climate Change (IPCC). Also shown are the projections of the likely increase of temperature, as published in the Summaries of each of the four IPCC reports, the first in the year 1990 and the last in the year 2007.
“From the graph it appears that the projections [of the models] exaggerate, substantially, the response of the earth’s temperature to CO2 which increased by about 11% from 1989 through 2011. Furthermore, when one examines the historical temperature record throughout the 20th century and into the 21st, the data strongly suggest a much lower CO2 effect than almost all models calculate.”
Seems like the models have been falsified.
The likely reason for that failure is that while the models recognize that increased CO2 itself will not produce a big, catastrophic increase in global temperatures, the models assume that the very small amount of warming caused by increased CO2 will result in much larger temperature increases caused by positive feedbacks. The real, emerging science, as the Heartland publications indicate, is that the feedbacks are more likely to be offset by negative feedbacks, resulting in a much smaller net temperature change. Scientists have pointed out that much higher CO2 concentrations deep in the earth’s history, as shown by proxy records, did not result in catastrophic temperature increases, a very powerful rebuttal to the idea today’s relatively low CO2 levels could trigger catastrophic global warming.
The results of the latest, most advanced data collection also suggest that CO2 is not responsible for the modest global warming of the late 20th century. The UN models agree with established science that if human greenhouse gas emissions were causing global warming, there should be a hot spot of higher temperatures in the troposphere above the tropics, where collected concentrations would have the greatest effect, and the warming would show up first. This is known in the literature on climate science as “the fingerprint” for man caused global warming. But data from global weather satellites and more comprehensive weather balloons show no hotspot, and no fingerprint, which means no serious global warming due to human greenhouse gas emissions. QED.
Moreover, satellites also have been measuring the energy entering the earth’s atmosphere from the sun, and the energy escaping back out to space. If the theory of man caused global warming is correct, then the energy escaping back out should be less than the energy entering, as the greenhouse gases capture some of the energy in the atmosphere. But the satellite data show negligible difference.
The real cutting edge in climate science was publicly exposed recently by one of the longtime leaders of the German environmental movement, Fritz Vahrenholt, in his new book, The Cold Sun. The book expresses the growing concern among more careful real climate scientists, rather than political scientists, that trends in solar activity portend a return to the cold, limited agricultural output, and widespread disease of the Little Ice Age, or even a more full-blown, overdue by historical standards, real ice age.
The consolation is that those threatening developments are still centuries away. In an interview with Spiegel magazine, titled “I Feel Duped on Climate Change,” Vahrenholt tells readers that the UN’s forecasts on the severity of climate change are exaggerated and supported by weak science. The American version would be Al Gore producing a movie with the title, “The Most Inconvenient Truth: I Was Wrong.”
The root of the global warming confusion is that the UN is not a disinterested party that can be trusted to compile and interpret the climate science on which the world’s policymakers can rely. The UN sees the theory of man caused catastrophic global warming as a tremendous opportunity for gaining the regulatory and taxation powers of a world government.
It is at least as self-interested on the subject as oil and gas companies. It has used its role as grand overseer of climate science to advance its own agenda. The result has been a great disservice to the scientific community and to policymakers. It fueled a global panic and mass delusion that has cost hundreds of billions or even trillions of dollars, and is likely to cost trillions more before it finally runs its course.
That is why Gleick’s Fakegate memo is actually a perfect metaphor for the entire fabrication of global warming. It and the entire Fakegate scandal provide a window, much like Climategate did, into the global warming movement, and what we see is ugly indeed. Peter Gleick’s misconduct is repeated a hundred times every day, in the same dishonest, cynical, and corrosive way, by global warming advocates around the world.
Fakegate is another reason why the U.S. should withdraw all funding and participation in the UN’s IPCC, and establish its own panel of scientists representing the full spectrum of views to study whether there is any real potential threat from man caused global warming. I nominate as the Chairman for that panel Richard Lindzen, the retiring Alfred P. Sloan Professor of Meteorology at MIT.
“DENVER (CBS4) – The Centers for Disease Control and Prevention and National Jewish Health in Colorado both have issued a warning about nasal washes after two people have died from using tap water to do their sinus rinse.
Health experts say it’s safe to use nasal washes. It’s not about the rinse, it’s about the water. They warn that a mixture from a faucet could be fatal.
Reading, writing — and sinus rinses. They’re part of the curriculum for some students at Kunsberg School at National Jewish Health. Saltwater nasal washes can help asthma and allergy sufferers.
The saline rinses are highly recommended at National Jewish for children and adults.
“I do them at home if I have a bad cold,” said Marie Fornof, Certified Infection Preventionist.
But Fornof says not to use tap water. It’s because of a brain-eating amoeba called Naegleria fowleri. It’s common in warm rivers and lakes, but if it travels up the nose to the brain it’s usually deadly….”
The Earth has a roughly 12 percent chance of experiencing an enormous megaflare erupting from the sun in the next decade. This event could potentially cause trillions of dollars’ worth of damage and take up to a decade to recover from.
Such an extreme event is considered to be relatively rare. The last gigantic solar storm, known as the Carrington Event, occurred more than 150 years ago and was the most powerful such event in recorded history.
That a rival to this event might have a greater than 10 percent chance of happening in the next 10 years was surprising to space physicist Pete Riley, senior scientist at Predictive Science in San Diego, California, who published the estimate in Space Weather on Feb. 23.
“Even if it’s off by a factor of two, that’s a much larger number than I thought,” he said.
Earth’s sun goes through an 11-year cycle of increased and decreased activity. During solar maximum, it’s dotted with many sunspots and enormous magnetic whirlwinds erupt from its surface. Occasionally, these flares burst outward from the sun, spewing a mass of charged particles out into space.
Small solar flares happen quite often whereas very large ones are infrequent, a mathematical distribution known as a power law. Riley was able to estimate the chance of an enormous solar flare by looking at historical databases and calculating the relation between the size and occurrence of solar flares.
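Riley’s approach can be sketched in a few lines: fit a power law to the historical size-frequency distribution of flares, extrapolate the tail out to Carrington-scale events, and treat occurrences as a Poisson process. The exponent, base rate, and size threshold below are purely illustrative assumptions, not Riley’s actual data:

```python
import math

# Assumed, illustrative parameters -- not from Riley's Space Weather paper.
alpha = 1.8           # assumed power-law exponent of the event-size distribution
rate_per_year = 0.5   # assumed rate of events larger than x_min, per year
x_min = 1.0           # reference event size (arbitrary units)
x_carrington = 100.0  # Carrington-scale size in the same units

# Complementary CDF of a power law: P(X >= x) = (x / x_min) ** (1 - alpha)
p_large = (x_carrington / x_min) ** (1 - alpha)

# Probability of at least one Carrington-scale event in a decade,
# assuming events arrive as a Poisson process:
lam = rate_per_year * p_large * 10
p_decade = 1 - math.exp(-lam)
print(f"P(at least one Carrington-scale event in 10 yr) ~ {p_decade:.2f}")
```

With these invented numbers the sketch happens to land near Riley’s roughly 12 percent figure; the point is the method (power-law tail plus Poisson counting), not the particular values.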
The biggest solar event ever seen was the Carrington Event, which occurred on Sept. 1, 1859. That morning, astronomer Richard Carrington watched an enormous solar flare erupt from the sun’s surface, emitting a particle stream at the Earth traveling more than 4 million miles per hour.
When they hit the Earth’s atmosphere, those particles generated the intense ghostly ribbons of light known as auroras. Though typically relegated to the most northerly and southerly parts of the planet, the atmospheric phenomenon reached as far as Cuba, Hawaii, and northern Chile. People in New York City gathered on sidewalks and rooftops to watch “the heavens … arrayed in a drapery more gorgeous than they have been for years,” as The New York Times described it.
Auroras may be beautiful, but the charged particles can wreak havoc on electrical systems. At the time of the Carrington Event, telegraph stations caught on fire, their networks experienced major outages and magnetic observatories recorded disturbances in the Earth’s field that were literally off the scale.
In today’s electrically dependent modern world, a similar scale solar storm could have catastrophic consequences. Auroras damage electrical power grids and may contribute to the erosion of oil and gas pipelines. They can disrupt GPS satellites and disturb or even completely black out radio communication on Earth.
During a geomagnetic storm in 1989, for instance, Canada’s Hydro-Quebec power grid collapsed within 90 seconds, leaving millions without power for up to nine hours.
The potential collateral damage in the U.S. of a Carrington-type solar storm might be between $1 trillion and $2 trillion in the first year alone, with full recovery taking an estimated four to 10 years, according to a 2008 report from the National Research Council.
“A longer-term outage would likely include, for example, disruption of the transportation, communication, banking, and finance systems, and government services; the breakdown of the distribution of potable water owing to pump failure; and the loss of perishable foods and medications because of lack of refrigeration,” the NRC report said.
But such possibilities likely represent only the worst-case scenario, said Robert Rutledge, lead of the forecast office at the NOAA/National Weather Service Space Weather Prediction Center. The potential dangers might be significantly less, since power companies are aware of such problems and can take action to mitigate them.
For instance, companies may store power in areas where little damage is expected or bring on additional lines to help with power overloads. This is assuming, of course, that they are given enough warning as to the time and location of a solar storm’s impact on the Earth. Satellites relatively close to Earth are required to measure the exact strength and orientation of a storm.
“It’s like being able to see a cyclone coming but not knowing the wind speed until it hits your boat 50 miles off the coast,” Rutledge said.
Image: NASA
Investing Overload

Information overload leads us into some quite nasty investing behaviours: we ignore parts of the data presented, we favour recent, vivid and easily available information over other types, and we gravitate towards well-presented arguments, even if they’re spurious, as described here by Thomas Moellers. Basically, the idea that more information is better than less when it comes to stock analysis is wrong, unless you have a well-honed mental model and proper support tools. One of the simpler ways is to build a checklist of items that you need to cross-check: the valuation fundamentals you’re interested in, the competitive position, the previous track record of the directors, or whatever.
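The checklist idea above is simple enough to sketch in code. Here is a minimal Python version; the items, thresholds, and field names are invented for the example and are not investment advice:

```python
# A minimal pre-investment checklist, as the text suggests.
# All items and thresholds are illustrative assumptions.
checklist = {
    "P/E below sector average": lambda d: d["pe"] < d["sector_pe"],
    "Debt-to-equity under 1.0": lambda d: d["debt_to_equity"] < 1.0,
    "Positive free cash flow": lambda d: d["free_cash_flow"] > 0,
    "Directors' track record reviewed": lambda d: d["directors_reviewed"],
}

def run_checklist(data):
    """Return (passed, failed) item names for one candidate stock."""
    passed = [name for name, check in checklist.items() if check(data)]
    failed = [name for name in checklist if name not in passed]
    return passed, failed

# Hypothetical candidate data:
candidate = {
    "pe": 11.0, "sector_pe": 14.0,
    "debt_to_equity": 1.4,
    "free_cash_flow": 2.5e6,
    "directors_reviewed": True,
}
passed, failed = run_checklist(candidate)
print("Passed:", passed)
print("Failed:", failed)
```

The value of the checklist is not the code but the discipline: every item gets checked every time, which is exactly the countermeasure to the availability and recency biases described above.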
He is regarded as the most famous atheist in the world but last night Professor Richard Dawkins admitted he could not be sure that God does not exist.
He told the Archbishop of Canterbury, Dr Rowan Williams, that he preferred to call himself an agnostic rather than an atheist.
The two men were taking part in a public “dialogue” at Oxford University at the end of a week which has seen bitter debate about the role of religion in public life in Britain.
Last week Baroness Warsi, the Tory party chairman, warned of a tide of “militant secularism” challenging the religious foundations of British society.
The discussion, in Sir Christopher Wren’s Sheldonian Theatre, attracted attention from around the world.
As well as being relayed to two other theatres, it was streamed live on the internet and promoted fierce debate on the Twitter social network.
For an hour and 20 minutes the two men politely discussed “The nature of human beings and the question of their ultimate origin” touching on the meaning of consciousness, the evolution of human language – and Dr Williams’s beard.
For much of the discussion the Archbishop sat quietly listening to Prof Dawkins’s explanations of human evolution.
At one point he told the professor that he was “inspired” by “elegance” of the professor’s explanation for the origins of life – and agreed with much of it.
Prof Dawkins told him: “What I can’t understand is why you can’t see the extraordinary beauty of the idea that life started from nothing – that is such a staggering, elegant, beautiful thing, why would you want to clutter it up with something so messy as a God?”
Dr Williams replied that he “entirely agreed” with the “beauty” of Prof Dawkins’s argument but added: “I’m not talking about God as an extra who you shoehorn on to that.”
There was surprise when Prof Dawkins acknowledged that he was less than 100 per cent certain of his conviction that there is no creator.
The philosopher Sir Anthony Kenny, who chaired the discussion, interjected: “Why don’t you call yourself an agnostic?” Prof Dawkins answered that he did.
An incredulous Sir Anthony replied: “You are described as the world’s most famous atheist.”
Prof Dawkins said that he was “6.9 out of seven” sure of his beliefs.
“I think the probability of a supernatural creator existing is very very low,” he added.
He also said that he believed it was highly likely that there was life on other planets.
At one point the discussion strayed onto the theoretical question of whether a traditional cut-throat razor could be described as a more complicated thing than an electric shaver.
There was laughter as the Archbishop said he would attempt an answer before adding: “Not that I know much about razors.”
During a wide-ranging discussion the Archbishop also said that he believed that human beings had evolved from non-human ancestors but were nevertheless “in the image of God”.
He also said that the explanation for the creation of the world in the Book of Genesis could not be taken literally.
“The writers of the Bible, inspired as I believe they were, they were nonetheless not inspired to do 21st Century physics,” he said.
When Prof Dawkins suggested that he believed the Pope took a rather more literal interpretation of the origins of humans, the Archbishop joked: “I will ask him some time.”
By Simon Carr
At a public meeting in the Commons, the climate scientist Professor Richard Lindzen of MIT made a number of declarations that unsettle the claim that global warming is backed by “settled science”. They’re not new, but some of them were new to me.
Over the last 150 years CO2 (or its equivalents) has doubled. This has been accompanied by a rise in temperature of seven or eight tenths of a degree centigrade.
The Intergovernmental Panel on Climate Change attributes half this increase to human activity.
Lindzen says: “Claims that the earth has been warming, that there is a Greenhouse Effect, and that man’s activity have contributed to warming are trivially true but essentially meaningless.”
He said our natural body temperature varies by eight tenths of a degree.
He showed a Boston newspaper weather graphic for a day – it had the actual temperature against a background of the highest and lowest recorded temperature for that day. The difference was as much as 60 degrees F.
When you double CO2 there’s a two per cent change in the “radiation budget”. Yet two billion years ago, the sun was 20 to 30 per cent dimmer – and the planet’s temperature was about the same.
The Al Gore graph showing CO2 and temperature rising and falling in tandem showed that the release of CO2 from the oceans was prompted by warming, not vice versa.
He gave us a slide with a series of familiar alarms – melting ice caps, disappearing icebergs, receding glaciers, rising sea levels. It was published by the US Weather Bureau in 1922.
And one further element of the consensus: there’s been no increase in temperature for 15 years.
He concluded with an exposition of science that, frankly, I didn’t follow. However, the reliability and explanatory power of climate models was satirised convincingly. And I found myself believing – or accepting the possibility – that warming would reduce rather than increase tropical storms.
He also said that the IPCC needs “positive feedback mechanisms” to justify anything above a one degree C increase in their predictions. But: “Observation points to small negative feedbacks.”
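The feedback arithmetic behind Lindzen’s point can be made concrete with the standard zero-dimensional textbook relations. The forcing formula and sensitivity values below are common approximations from the climate literature, not figures taken from his talk:

```python
import math

# Standard simplified radiative forcing for a CO2 doubling
# (Myhre et al. approximation): delta_F = 5.35 * ln(C / C0)
delta_F = 5.35 * math.log(2.0)       # ~3.7 W/m^2

# No-feedback (Planck) sensitivity parameter, roughly 0.3 K per W/m^2:
lambda_0 = 0.3
dT_no_feedback = lambda_0 * delta_F  # ~1.1 K: the "about one degree" figure

# With a net feedback fraction f, the response is amplified (f > 0)
# or damped (f < 0): dT = dT_no_feedback / (1 - f)
for f in (-0.5, 0.0, 0.5):
    print(f"f = {f:+.1f}  ->  dT = {dT_no_feedback / (1 - f):.2f} K")
```

The sketch shows why the sign of the net feedback is the whole argument: with no feedback a doubling yields about one degree, a positive feedback fraction amplifies it, and a negative one reduces it below that.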
How to explain the procession of eminent opinion leaders – some even in our own Royal Society – who advance the tenets of catastrophic global warming? “It is science in the service of politics,” he said.
If Lindzen is right, we will never be able to calculate the trillions that have been spent on the advice of “scientists in the service of politics”.
We often worry about lying awake in the middle of the night – but it could be good for you. A growing body of evidence from both science and history suggests that the eight-hour sleep may be unnatural.
In the early 1990s, psychiatrist Thomas Wehr conducted an experiment in which a group of people were plunged into darkness for 14 hours every day for a month.
It took some time for their sleep to regulate but by the fourth week the subjects had settled into a very distinct sleeping pattern. They slept first for four hours, then woke for one or two hours before falling into a second four-hour sleep.
Though sleep scientists were impressed by the study, among the general public the idea that we must sleep for eight consecutive hours persists.
In 2001, historian Roger Ekirch of Virginia Tech published a seminal paper, drawn from 16 years of research, revealing a wealth of historical evidence that humans used to sleep in two distinct chunks.
His book At Day’s Close: Night in Times Past, published four years later, unearths more than 500 references to a segmented sleeping pattern – in diaries, court records, medical books and literature, from Homer’s Odyssey to an anthropological account of modern tribes in Nigeria.
Much like the experience of Wehr’s subjects, these references describe a first sleep which began about two hours after dusk, followed by a waking period of one or two hours and then a second sleep.
“It’s not just the number of references – it is the way they refer to it, as if it was common knowledge,” Ekirch says.
During this waking period people were quite active. They often got up, went to the toilet or smoked tobacco and some even visited neighbours. Most people stayed in bed, read, wrote and often prayed. Countless prayer manuals from the late 15th Century offered special prayers for the hours in between sleeps.
And these hours weren’t entirely solitary – people often chatted to bed-fellows or had sex.
A doctor’s manual from 16th Century France even advised couples that the best time to conceive was not at the end of a long day’s labour but “after the first sleep”, when “they have more enjoyment” and “do it better”.
Ekirch found that references to the first and second sleep started to disappear during the late 17th Century. This started among the urban upper classes in northern Europe and over the course of the next 200 years filtered down to the rest of Western society.
By the 1920s the idea of a first and second sleep had receded entirely from our social consciousness.
When segmented sleep was the norm
- “He knew this, even in the horror with which he started from his first sleep, and threw up the window to dispel it by the presence of some object, beyond the room, which had not been, as it were, the witness of his dream.” Charles Dickens, Barnaby Rudge (1840)
- “Don Quixote followed nature, and being satisfied with his first sleep, did not solicit more. As for Sancho, he never wanted a second, for the first lasted him from night to morning.” Miguel de Cervantes, Don Quixote (1615)
- “And at the wakening of your first sleepe You shall have a hott drinke made, And at the wakening of your next sleepe Your sorrowes will have a slake.” Early English ballad, Old Robin of Portingale
- The Tiv tribe in Nigeria employ the terms “first sleep” and “second sleep” to refer to specific periods of the night
Ekirch attributes the initial shift to improvements in street lighting, domestic lighting and a surge in coffee houses – which were sometimes open all night. As the night became a place for legitimate activity, and as that activity increased, the length of time people could dedicate to rest dwindled.
In his new book, Evening’s Empire, historian Craig Koslofsky puts forward an account of how this happened.
“Associations with night before the 17th Century were not good,” he says. The night was a place populated by people of disrepute – criminals, prostitutes and drunks.
“Even the wealthy, who could afford candlelight, had better things to spend their money on. There was no prestige or social value associated with staying up all night.”
That changed in the wake of the Reformation and the counter-Reformation. Protestants and Catholics became accustomed to holding secret services at night, during periods of persecution. If earlier the night had belonged to reprobates, now respectable people became accustomed to exploiting the hours of darkness.
This trend migrated to the social sphere too, but only for those who could afford to live by candlelight. With the advent of street lighting, however, socialising at night began to filter down through the classes.
In 1667, Paris became the first city in the world to light its streets, using wax candles in glass lamps. It was followed by Lille in the same year and Amsterdam two years later, where a much more efficient oil-powered lamp was developed.
London didn’t join their ranks until 1684 but by the end of the century, more than 50 of Europe’s major towns and cities were lit at night.
Night became fashionable and spending hours lying in bed was considered a waste of time.
“People were becoming increasingly time-conscious and sensitive to efficiency, certainly before the 19th Century,” says Roger Ekirch. “But the industrial revolution intensified that attitude by leaps and bounds.”
Strong evidence of this shifting attitude is contained in a medical journal from 1829 which urged parents to force their children out of a pattern of first and second sleep.
“If no disease or accident there intervene, they will need no further repose than that obtained in their first sleep, which custom will have caused to terminate by itself just at the usual hour.
“And then, if they turn upon their ear to take a second nap, they will be taught to look upon it as an intemperance not at all redounding to their credit.”
Today, most people seem to have adapted quite well to the eight-hour sleep, but Ekirch believes many sleeping problems may have roots in the human body’s natural preference for segmented sleep as well as the ubiquity of artificial light.
This could be the root of a condition called sleep maintenance insomnia, where people wake during the night and have trouble getting back to sleep, he suggests.
The condition first appears in literature at the end of the 19th Century, at the same time as accounts of segmented sleep disappear.
“For most of evolution we slept a certain way,” says sleep psychologist Gregg Jacobs. “Waking up during the night is part of normal human physiology.”
The idea that we must sleep in a consolidated block could be damaging, he says, if it makes people who wake up at night anxious, as this anxiety can itself prohibit sleep and is likely to seep into waking life too.
Stages of sleep
Every 60-100 minutes we go through a cycle of four stages of sleep
- Stage 1 is a drowsy, relaxed state between being awake and sleeping – breathing slows, muscles relax, heart rate drops
- Stage 2 is slightly deeper sleep – you may feel awake and this means that, on many nights, you may be asleep and not know it
- Stage 3 and Stage 4, or Deep Sleep – it is very hard to wake up from Deep Sleep because this is when there is the lowest amount of activity in your body
- After Deep Sleep, we go back to Stage 2 for a few minutes, and then enter Dream Sleep – also called REM (rapid eye movement) sleep – which, as its name suggests, is when you dream
In a full sleep cycle, a person goes through all the stages of sleep from one to four, then back down through stages three and two, before entering dream sleep
Russell Foster, a professor of circadian [body clock] neuroscience at Oxford, shares this point of view.
“Many people wake up at night and panic,” he says. “I tell them that what they are experiencing is a throwback to the bi-modal sleep pattern.”
But the majority of doctors still fail to acknowledge that a consolidated eight-hour sleep may be unnatural.
“Over 30% of the medical problems that doctors are faced with stem directly or indirectly from sleep. But sleep has been ignored in medical training and there are very few centres where sleep is studied,” he says.
Jacobs suggests that the waking period between sleeps, when people were forced into periods of rest and relaxation, could have played an important part in the human capacity to regulate stress naturally.
In many historic accounts, Ekirch found that people used the time to meditate on their dreams.
“Today we spend less time doing those things,” says Dr Jacobs. “It’s not a coincidence that, in modern life, the number of people who report anxiety, stress, depression, alcoholism and drug abuse has gone up.”
So the next time you wake up in the middle of the night, think of your pre-industrial ancestors and relax. Lying awake could be good for you.
Tesla Motors’ lineup of all-electric vehicles — its existing Roadster, almost certainly its impending Model S, and possibly its future Model X — apparently suffer from a severe limitation that can largely destroy the value of the vehicle. If the battery is ever totally discharged, the owner is left with what Tesla describes as a “brick”: a completely immobile vehicle that cannot be started or even pushed down the street. The only known remedy is for the owner to pay Tesla approximately $40,000 to replace the entire battery. Unlike practically every other modern car problem, neither Tesla’s warranty nor typical car insurance policies provide any protection from this major financial loss.
Despite this “brick” scenario having occurred several times already, Tesla has publicly downplayed the severity of battery depletion risk to both existing owners and future buyers. Privately though, Tesla has gone to great lengths to prevent this potentially brand-destroying incident from happening more often, including possibly engaging in GPS tracking of a vehicle without the owner’s knowledge.
Read the rest here.
When Your DNA Dings Your ROI
Why are some people more prone to stupid financial behavior than others?
Several of the most common and costly mistakes that investors make appear to be encoded in our genes.
So argues a new research paper from two finance professors, Henrik Cronqvist of Claremont McKenna College and Stephan Siegel of the W.P. Carey School of Business at Arizona State University.
Cronqvist and Siegel based their study on two sets of remarkable data.
In Sweden, until recently, the government collected details about each holding in taxpayers’ investment portfolios. Cronqvist and Siegel could thus track the investment portfolios of individual Swedes, as well as any of their sales of securities, between 1999 and 2007. (None of the investors’ names were disclosed to the researchers.)
What’s more, the Swedish government enters all twins in a national register at birth. Cronqvist and Siegel identified more than 30,000 twins with investment portfolios – including more than 9,200 identical twins – and then studied how much their investing behavior varied. Bear in mind that identical twins are genetically a perfect match, while fraternal twins share similar but not identical genetic profiles; the researchers also compared twins against a random sample of nontwins as an experimental control.
The intuition is obvious: If thousands of people who are genetically identical exhibit the same behavior more strongly than thousands of nonidentical people do, then it’s plausible to attribute the variation in behavior to their genetic makeup.
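The twin-comparison logic can be sketched numerically. A classic shortcut is Falconer's formula, which estimates heritability as twice the difference between the identical-twin and fraternal-twin correlations for a trait. The correlations below are made-up illustrative values, not figures from Cronqvist and Siegel's paper (which uses a more elaborate variance-decomposition model):

```python
# Falconer's formula: heritability h^2 = 2 * (r_MZ - r_DZ).
# Identical (MZ) twins share all their genes; fraternal (DZ) twins
# share about half, so the gap between the two correlations
# reflects genetic influence.
def falconer_h2(r_mz, r_dz):
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations for a diversification-bias measure:
# MZ pairs correlate at 0.50, DZ pairs at 0.27
h2 = falconer_h2(0.50, 0.27)
print(round(h2, 2))  # about 0.46, i.e. ~46% of variation "explained"
```

A gap of this size would land near the 45.3% figure the paper reports for inadequate diversification, which is why that bias tops the list.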
Cronqvist and Siegel studied five prevalent investing mistakes or “biases”:
- Inadequate diversification (measured as a preference for investments based in Sweden)
- Excessive trading
- The reluctance to sell at a loss
- Chasing hot past performance
- Trying to get rich quick
Cronqvist and Siegel found, across the twins in their sample, that genetic variation explained between one-quarter and nearly one-half of the extent to which investors suffered from these biases. Inadequate diversification scored the highest, with genetic effects explaining 45.3% of the variation across investors. At the low end, 25.7% of the degree to which investors traded too much was explained by their genetic variation.
Of course, we aren’t just abject slaves to our double helix when we invest. As intriguing as these new data are, they still explain less than half of what makes investors tick.
Read the rest here.
(NaturalNews) The idea of cholesterol creating cardiac problems has caused obsessive cholesterol count blood testing for decades. Another outcome of this scare was obsessively avoiding fat, especially saturated fats.
The food industry responded with low and no fat foods from milk to cottage cheese and more. Processed foods promoted their low or no fat contents as though they were the healthiest foods in the freezer.
Healthy fats such as coconut oil and palm oil were spurned and replaced by very unhealthy trans-fat, processed and heated cooking oils. Relatively healthy whole butters were replaced by plastic margarines.
However, this myth of cholesterol dangers lurking in saturated fats waiting to clog your arteries and cause you to die of cardiac arrest is beginning to unravel.
Unraveling the myth of cholesterol
A meta-analysis of properly performed previous studies on heart health and saturated fats concluded there was no association between cardiac issues and saturated fats. This was published in the American Journal of Clinical Nutrition (AJCN) on January 13th, 2010. (1)
Meta-analysis is a statistical method of combining the results of multiple independent studies on a set topic, so that their pooled evidence can support or undercut a hypothesis. The AJCN meta-analysis covered studies involving 350,000 subjects who were followed for 5 to 23 years. (1)
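To show mechanically what pooling studies means, here is a minimal fixed-effect meta-analysis sketch: each study's effect estimate is weighted by the inverse of its variance, so larger, more precise studies dominate the pooled result. The effect sizes and variances below are invented for illustration and are not taken from the AJCN data:

```python
# Fixed-effect inverse-variance pooling: weight each study's effect
# estimate by 1/variance, so more precise studies count for more.
def pool_fixed_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5  # standard error of the pooled estimate
    return pooled, se

# Three hypothetical studies' log relative-risk estimates and variances
pooled, se = pool_fixed_effect([0.10, -0.05, 0.02], [0.04, 0.02, 0.05])
```

A pooled estimate near zero with a narrow standard error is the kind of result that would support the "no association" conclusion described above.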
The trend set by the saturated fat high cholesterol disinformation a few decades ago has resulted in many Americans eating less fat and showing lower blood cholesterol levels. Yet, heart disease rates have continued to rise along with diabetes, pre-diabetes and obesity. (1)
Dr. William Davis explains in his article “A Headline You Will Never See: 60 Year Old Man Dies of Cholesterol” that cholesterol doesn’t kill “any more than a bad paint job on your car could cause a fatal car accident.” (1)
He explains the cause of most heart attacks and coronary problems is atherosclerotic plaque in the coronary arteries, which can build up and rupture or clog the arteries. He goes on to describe other factors that can cause plaque ruptures, including inflammatory pneumonia.
Though there can be some cholesterol in the plaque, cholesterol itself is waxy and pliable. Cholesterol is important for brain cells, nerves and other cellular structural components. Calcium deposits (calcification) in artery interiors are much worse components of plaque. Calcium belongs in your bones, not in your arteries. Vitamin K2 helps transport calcium out of your blood and into your bones.
Dr. Davis recommends avoiding cholesterol panels for heart health concerns and opting for a measure of coronary atherosclerotic plaque.
The scam continues despite overwhelming contradictory evidence
The standard-bearer for this new class of exoplanet is called GJ 1214b, which astronomers first discovered in December 2009. New observations by NASA’s Hubble Space Telescope suggest that GJ 1214b is a watery world enshrouded by a thick, steamy atmosphere.
“GJ 1214b is like no planet we know of,” study lead author Zachory Berta of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass., said in a statement. “A huge fraction of its mass is made up of water.”
Adding to the diversity
To date, astronomers have discovered more than 700 planets beyond our solar system, with about 2,300 more “candidates” awaiting confirmation by follow-up observations.
These alien planets are a diverse bunch. Astronomers have found one planet as light and airy as Styrofoam, for example, and another as dense as iron. They’ve discovered several alien worlds that orbit two suns, like Luke Skywalker’s home planet of Tatooine in the “Star Wars” films.
But GJ 1214b, which is located 40 light-years from Earth in the constellation Ophiuchus (The Serpent Bearer), is something new altogether, researchers said.
This so-called “super-Earth” is about 2.7 times Earth’s diameter and weighs nearly seven times as much as our home planet. It orbits a red-dwarf star at a distance of 1.2 million miles (2 million kilometers), giving it an estimated surface temperature of 446 degrees Fahrenheit (230 degrees Celsius) — too hot to host life as we know it.
Hubble watched as GJ 1214b crossed in front of its host star, and the scientists were able to determine the composition of the planet’s atmosphere based on how it filtered the starlight.
“We’re using Hubble to measure the infrared color of sunset on this world,” Berta said. “The Hubble measurements really tip the balance in favor of a steamy atmosphere.”
Berta and his colleagues report their results online in the Astrophysical Journal.
A watery world
Since astronomers know GJ 1214b’s mass and size, they’re able to calculate its density, which turns out to be just 2 grams per cubic centimeter (g/cc). Earth’s density is 5.5 g/cc, while that of water is 1 g/cc.
GJ 1214b thus appears to have much more water than Earth does, and much less rock. The alien planet’s interior structure is likely quite different from that of our world.
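The density figure follows directly from the numbers quoted above. Density scales as mass divided by radius cubed, so using the article's round values of roughly 7 Earth masses, 2.7 Earth radii, and Earth's mean density of 5.5 g/cc:

```python
# Bulk density relative to Earth: density scales as mass / radius^3
earth_density = 5.5   # g/cc, Earth's mean density
mass_ratio = 7.0      # GJ 1214b's mass in Earth masses (approximate)
radius_ratio = 2.7    # GJ 1214b's radius in Earth radii (approximate)

density = earth_density * mass_ratio / radius_ratio ** 3
print(round(density, 1))  # about 2 g/cc, matching the quoted figure
```

Landing between rock (~5.5 g/cc) and water (1 g/cc) is what drives the inference that a large fraction of the planet must be water rather than rock.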
“The high temperatures and high pressures would form exotic materials like ‘hot ice’ or ‘superfluid water,’ substances that are completely alien to our everyday experience,” Berta said.
GJ 1214b probably formed farther out from its star, where water ice was plentiful, and then migrated in to its current location long ago. In the process, it would have experienced more Earth-like temperatures, but how long this benign phase lasted is unknown, researchers said.
Because GJ 1214b is so close to Earth, it’s a prime candidate for study by future instruments. NASA’s James Webb Space Telescope, which is slated to launch in 2018, may be able to get an even better look at the planet’s atmosphere, researchers said.