Category Archives: Science

Kiwi scientist helps tackle elusive ‘Element 118’

A top Kiwi scientist has helped characterise an intriguing and relatively new chemical element, which had proven a headache to study.

First synthesized as a single atom in 2002 at the Joint Institute for Nuclear Research (JINR) in Russia, oganesson, or Og, is the only noble gas that does not occur naturally and must be synthesized in experiments.

Widely known as Element 118, Og is also one of only two elements to be named after a living scientist, nuclear physicist Yuri Oganessian.

However, studying the element with the highest atomic number ever synthesized – 118 – has been no easy task.

Oganesson is radioactive and extremely unstable with a half-life of less than a millisecond, making it impossible to examine by chemical methods.
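The scale of the problem can be illustrated with the standard radioactive-decay formula, N(t) = N₀ · (1/2)^(t/t½). The 0.7 ms half-life used below is an assumed round figure for illustration, not a value reported in this article:

```python
def fraction_remaining(half_life_s: float, elapsed_s: float) -> float:
    """Fraction of a radioactive sample left after elapsed_s seconds,
    given its half-life in seconds: N(t)/N0 = (1/2) ** (t / t_half)."""
    return 0.5 ** (elapsed_s / half_life_s)

# With an assumed ~0.7 ms half-life, essentially nothing survives even
# a tenth of a second -- far too brief for any wet-chemistry experiment.
print(fraction_remaining(0.0007, 0.1))
```

Even starting from billions of atoms (far more than any accelerator has ever produced), the surviving fraction after a fraction of a second is effectively zero, which is why calculation is the only available route.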

This meant computing its electronic structure was the next best thing, which was in itself a formidable task.

Teams led by Massey University’s Distinguished Professor Peter Schwerdtfeger, of the New Zealand Institute for Advanced Study, and nuclear physicist Witold Nazarewicz of Michigan State University in the US, were able to make the calculations.

“Calculations are the only way to get at its behaviour with the tools that we currently have, and they have certainly provided some interesting findings,” Schwerdtfeger said.

The work suggests that oganesson’s electrons are not confined to distinct orbitals but are distributed evenly.

Massey University’s Distinguished Professor Peter Schwerdtfeger. Photo / Supplied

“On paper, we thought that it would have the same rare gas structure as the others in this family.

“In our calculations, however, we predict that oganesson more or less loses its shell structure and becomes a smear of electrons.”

Further, oganesson was thought to be a gas under normal conditions, but the newest research from the Massey group now predicts it to be a solid.

“Oganesson is quite different to the other rare gas atoms, as its shells are barely visible in an electron localization function plot and have been smeared to near-invisibility,” Schwerdtfeger added.

“Oganesson comes quite close to the limiting case of a Fermi gas.”

The team also calculated the structure of protons and neutrons inside the nucleus, which indicated a smeared-out structure for the neutrons as well.

The protons however retained some shell-like ordering.

Schwerdtfeger is one of the world’s leading theoretical chemists. He won New Zealand’s highest honour in science, the Rutherford Medal, in 2014, and remains the most highly cited chemist and physicist in the country.

His multi-disciplinary research has provided a deep insight into how atoms and molecules interact at the quantum level.

Santa Rosa & Northern CA Fires DEFY THE LAWS of PHYSICS (Where’d the houses go??)

For eyes that can see, there is something wrong here: the melting points of common household materials like glass (2,600℉) and stainless steel (2,800℉) are DOUBLE the temperature of house and forest fires (1,100℉). You might expect Rod Serling to pop out at any moment in this footage, because it’s like the Twilight Zone!

Monstrous 1-ton ocean sunfish caught in Russia’s far east, thrown to the bears (PHOTO)


The group of Sakhalin fishermen did not expect to find the Lovecraftian monster in their nets as they took to the sea for smaller fry on Saturday. The enormous fish, considered a delicacy in some cultures, proved to be dead weight, as it didn’t make it to shore fresh.

Russian fishermen pulled in a 1,100kg ocean sunfish that got stuck in their net in the waters near Iturup island, according to Sakhalin info. Also known as moonfish and mola, the bizarre-looking creature is the heaviest bony fish on Earth.

“There has been no such specimen that I can remember; there is the dolphinfish, also known for its size and reaching 1.5 meters, but I have never seen a sunfish weighing more than a ton here before,” said fisherman Artur Balkarov.


By the time the fascinating behemoth was brought ashore three days later, it had begun rotting. The fishermen had no choice but to take it to a dumping site where locals bring their fishing waste for wild animals, including bears.

While trade in moonfish meat is prohibited in the EU, a number of Asian countries, such as Japan and Taiwan, see it as a delicious treat and a handy ingredient in traditional medicine.

The brain’s wiring as you’ve never seen it before

Scientists at the Cardiff University Brain Research Imaging Centre have produced the world’s most detailed scans of the brain’s internal wiring. These wires, or axons, carry electrical signals that encode information around the brain.

Axons labelled green travel back to front, red run left and right, and blue go up and down. To make these images, a team of researchers and engineers used a powerful type of MRI scanner, one of only three in the world. It measures the motion of water molecules along axons. Scientists then combined this data with advanced mathematical models to infer axon orientation and density. They visualised the results with a technique called cinematic rendering, which is used in animated blockbusters to create realistic hairs and grass.
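The colour-by-direction mapping described above follows the standard diffusion-tensor convention, and the core step can be sketched as an eigen-decomposition. This is a heavy simplification of the Cardiff pipeline, and the tensor values below are invented for illustration:

```python
import numpy as np

def principal_direction_rgb(diffusion_tensor: np.ndarray) -> np.ndarray:
    """Colour-code the dominant water-diffusion direction of a 3x3 tensor:
    |x| -> red (left-right), |y| -> green (back-front), |z| -> blue (up-down)."""
    eigvals, eigvecs = np.linalg.eigh(diffusion_tensor)  # eigenvalues ascending
    principal = eigvecs[:, -1]                           # axis of fastest diffusion
    return np.abs(principal)                             # unit vector -> RGB channels

# A tensor with strong left-right diffusion should map to pure red.
tensor = np.diag([3.0, 1.0, 1.0])  # illustrative diffusivities along x, y, z
print(principal_direction_rgb(tensor))
```

In a real scan, a tensor (or a richer model) is fitted at every voxel from many diffusion-weighted images, and the resulting orientation field is what the cinematic rendering visualises.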

They also scanned the brain of a woman with multiple sclerosis and spotted an area of unusually low axon density. This finding may give clues to what causes this and other brain conditions.

Delingpole: Man-Made Climate Catastrophe Is a Myth, More Studies Confirm


From the world of science – as opposed to grant-troughing junk science – two more studies confirming that the man-made global warming scare is a myth.

One, a study by Scafetta et al., published in the International Journal of Heat and Technology, confirms that the “Pause” in global warming is real – and that “climate change” is much more likely the result of natural, cyclical fluctuations than man-made CO2 emissions.


The period from 2000 to 2016 shows a modest warming trend that the advocates of the anthropogenic global warming theory have labeled as the “pause” or “hiatus.” These labels were chosen to indicate that the observed temperature standstill period results from an unforced internal fluctuation of the climate (e.g. by heat uptake of the deep ocean) that the computer climate models are claimed to occasionally reproduce without contradicting the anthropogenic global warming theory (AGWT) paradigm. In part 1 of this work, it was shown that the statistical analysis rejects such labels with a 95% confidence because the standstill period has lasted more than the 15 year period limit provided by the AGWT advocates themselves. Anyhow, the strong warming peak observed in 2015-2016, the “hottest year on record,” gave the impression that the temperature standstill stopped in 2014.

Herein, the authors show that such a temperature peak is unrelated to anthropogenic forcing: it simply emerged from the natural fast fluctuations of the climate associated to the El Niño–Southern Oscillation (ENSO) phenomenon. By removing the ENSO signature, the authors show that the temperature trend from 2000 to 2016 clearly diverges from the general circulation model (GCM) simulations.

Thus, the GCMs models used to support the AGWT are very likely flawed. By contrast, the semi-empirical climate models proposed in 2011 and 2013 by Scafetta, which are based on a specific set of natural climatic oscillations believed to be astronomically induced plus a significantly reduced anthropogenic contribution, agree far better with the latest observations.

Note also that it says the computer-modelled predictions of climate doom relied on by all global warming alarmists to support their thesis are wrong.

The second study, by Hodgkins et al., published in the Journal of Hydrology, concerns flooding in North America and Europe.

What it shows is that, contrary to the claims often made by climate alarmists, there has been NO increase in flooding due to “global warming” or “climate change.”

Flooding events, it shows, have more to do with chance than any noticeable long term trend. It finds no link between flooding and “global warming.”


Concern over the potential impact of anthropogenic climate change on flooding has led to a proliferation of studies examining past flood trends. Many studies have analysed annual-maximum flow trends but few have quantified changes in major (25–100 year return period) floods, i.e. those that have the greatest societal impacts. Existing major-flood studies used a limited number of very large catchments affected to varying degrees by alterations such as reservoirs and urbanisation. In the current study, trends in major-flood occurrence from 1961 to 2010 and from 1931 to 2010 were assessed using a very large dataset (>1200 gauges) of diverse catchments from North America and Europe; only minimally altered catchments were used, to focus on climate-driven changes rather than changes due to catchment alterations. Trend testing of major floods was based on counting the number of exceedances of a given flood threshold within a group of gauges. Evidence for significant trends varied between groups of gauges that were defined by catchment size, location, climate, flood threshold and period of record, indicating that generalizations about flood trends across large domains or a diversity of catchment types are ungrounded. Overall, the number of significant trends in major-flood occurrence across North America and Europe was approximately the number expected due to chance alone. Changes over time in the occurrence of major floods were dominated by multidecadal variability rather than by long-term trends. There were more than three times as many significant relationships between major-flood occurrence and the Atlantic Multidecadal Oscillation than significant long-term trends.
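The exceedance-counting approach the abstract describes can be sketched in a few lines. The flow values and threshold below are invented for illustration; the study itself works with more than 1,200 gauges:

```python
def count_exceedances(annual_max_flows, threshold):
    """Number of years in which the annual maximum flow exceeds the flood threshold."""
    return sum(1 for q in annual_max_flows if q > threshold)

# For a 25-year return-period threshold, each year has roughly a 1-in-25
# chance of exceedance, so over 50 years about 2 exceedances are expected
# by chance alone -- the baseline the trend test is compared against.
gauge_flows = [120, 95, 310, 88, 140, 260, 101, 99, 97, 123]  # made-up annual maxima (m^3/s)
threshold_25yr = 250  # hypothetical 25-year flood threshold
print(count_exceedances(gauge_flows, threshold_25yr))
```

Trend testing then asks whether these counts, pooled across a group of gauges, drift over the decades more than chance would predict.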

A few take-home points from these studies.

One, they explode – yet again – the myth that there is a consensus among scientists about catastrophic man-made climate change. In fact, as I reported earlier this year, there are dozens of papers produced every year by reputable, honest scientists which call into question the great man-made climate change scare.

Two, the alarmists hate it when you point this out. After my Breitbart piece Global Warming is a Myth, Say 58 Scientific Papers in 2017, an alarmist website published a supposed expert rebuttal by leading climate scientists. The problem was, of course, that all the “experts” involved were members of the alarmist cabal who pal-review one another’s papers and who ruthlessly shut out of the debate any scientists who dare to disagree with them.

Three, the alarmists know the jig is up and have done for some time. But in the interests of damage limitation they’re trying to drip out their corrections (aka admissions of error) slowly – and on their terms – rather than allow any hated skeptics (like yours truly) the chance to crow.

This is what happened after that bombshell paper released in Nature Geoscience last month by leading climate alarmists including Oxford University’s Myles Allen. Buried beneath its misleading and dull abstract was an extraordinary admission: that their computer models had wildly overestimated the effects of carbon dioxide on global warming.

Which in turn means, of course, that the entire AGW scare (which relies above all else on those computer models) is bunk and that really – “Big Mac meal with Coke, 5 chicken select, curry dip and two large teas, thanks Myles” – it’s about time these taxpayer-funded Chicken Littles did something useful with their lives for a change.

But when journalists pointed this out, the alarmists responded by attacking the journalists, supposedly for having misrepresented their paper. Yeah right. Look guys, if a dodgy company – say Enron Inc – releases its annual report with a summary that says: “Good news. Our profits are up again and our prospects are better than ever” but on closer examination of the company accounts this turns out to be drivel, it is not the job of journalists to report that rosy executive summary, however much Enron/Global Warming Inc might prefer it.

Let’s get something absolutely clear about this global warming debate. (I may have mentioned this before but it’s worth restating). Anyone at this late stage who is still on the alarmist side of the argument is either a liar, a cheat, a crook, a scamster, an incompetent, a dullard, a time-server, a charlatan or someone so monumentally stupid that they really should be banned by law from having an opinion on any subject whatsoever.

And that’s just the scientists.

The parasitic industry profiting from all that junk-science nonsense the alarmists keep pumping into the ether is even worse.

Just one brief example. The other week, the British press was chock full of stories about this incredible advance which had been made in the offshore wind turbine industry whereby costs had fallen so markedly that suddenly those sea-based bat-chomping, bird-slicing, whale-killing eco-crucifixes were more competitive than ever before. There was barely a newspaper that didn’t fall for this “good news” propaganda story.

The story had been heavily promoted by a number of vested interests: a “coalition of companies and civil society organisations” (including Dong Energy, GE, ScottishPower Renewables, Siemens Gamesa, SSE, Vattenfall, Greenpeace, the Marine Conservation Society, and WWF).

Look at that list and marvel at the range, influence, and financial muscle of those co-conspirators. Mighty, global NGOs and vast industrial conglomerates with a combined income running into the many billions. Environmentalism is not some gentle, bunny-hugging Mom and Pop operation. It’s a ginormous, many-tentacled, spectacularly greedy and corrupt Green Blob.

And guess what? That story – repeated unquestioningly by the MSM, crowed about by the BBC – was horse shit. Actually, it was worse than that: it was fox shit, which – as anyone who has smelt it will know – is an altogether more noisome, pungent, vile substance.

Now the Global Warming Policy Foundation has reported these liars to the Advertising Standards Authority.

And Paul Homewood has done the numbers and worked out that actually, far from being a bargain, this is yet another massive taxpayer rip off.

Never forget this next time you hear anyone bleating about Trump doing something sensible like pulling out of the Paris climate accord or scrapping the Clean Power Plan. The global warming scare is the biggest scam in the history of the world. It cannot be killed off soon enough.


Yes, the US Government Has Experimented With Controlling Hurricanes


The 2017 hurricane season has wrought more damage on the Caribbean and the Gulf Coast of the United States than any season in the last decade. Hurricane Harvey smashed into the Gulf, temporarily swallowing Houston and other low-lying areas. Meanwhile, Hurricane Irma caused millions of dollars in damage to Florida, Puerto Rico, and other Caribbean islands, leaving millions without power and water.

Along with the gusts of wind, property damage, and loss of life, this hurricane season also sparked a wide range of conspiracy theories regarding the possibility that the U.S. government or some other government could be manipulating the weather to strengthen hurricanes. These theories range from the idea that planes were spraying before and during the storms in order to help them grow and/or direct them at specific targets to others who believe the High Frequency Active Auroral Research Program (HAARP), or a similar device, was used to heat up the ionosphere and “charge” the storms to cause more destruction.

There are dozens of YouTube channels where individuals focus specifically on weather manipulation and modification. They claim to have the expertise to study radar images and determine whether artificial elements were added to developing hurricanes. If you are interested in that type of research, see this. However, I will not be addressing the issue of whether or not the U.S. is currently manipulating hurricanes. I do not have the technical background to accurately report in that area. Instead, I will focus on the history of weather modification as it pertains to hurricanes. If you have limited knowledge on weather modification — or, perhaps, you even think it is a hoax — I encourage you to read on. If you are familiar with the history or science of weather modification, I also encourage you to read on, as I have included details I have not seen covered elsewhere.

The theories surrounding possible hurricane manipulation have grown to the point that the “mainstream” media has been forced to respond. In early September, an article titled “No, We Can’t Control Hurricanes from Space” attempted to debunk these theories: “The short answer is that we can’t control weather at any scale, and hurricanes are no exception,” it read. Nevertheless, if we go back to 2015, we find an article from Popular Mechanics matter-of-factly stating, “We Could Reduce the Number of Hurricanes By Injecting Particles Into the Atmosphere.” The article discusses research published in the journal Proceedings of the National Academy of Sciences that concluded sulfates could be spread into the Earth’s stratosphere to “dampen” hurricanes over the next 50 years. The scientists do not claim to be able to “steer” or direct hurricanes, but they do say they have the power to slow them down by 50 percent.

A (Brief) History of Weather Modification

Despite these modest statements, the desire to modify the weather and manipulate hurricanes stretches back at least 100 years, to people often known as “rainmakers.” The rainmakers were men who studied “pluviculture,” the art of attempting to artificially create rain, usually to fight drought. Most of these men were seen as scammers, traveling salesmen pitching fantasy ideas about creating rain to the gullible. However, one of the most successful rainmakers was Charles Hatfield. Born in 1875, Hatfield migrated to Southern California and studied pluviculture, eventually creating a secret mixture of 23 chemicals he said could induce rain. Using his secret mixture, Hatfield successfully created storms several times and began to find work creating rain.

In 1915, Hatfield began working for the San Diego city council to produce enough rain to fill the Morena Dam reservoir. Hatfield was told he would receive $10,000 once the reservoir was filled. In early January 1915, rain began pouring down over the dam, growing heavier with each day that passed. On January 20, the dam broke, causing mass flooding that led to an estimated 20 deaths. Hatfield told the press he was not to blame, stating the city should have taken precautions. The city refused to pay Hatfield unless he also accepted liability for the damage and deaths. After legal battles ensued, Hatfield was absolved of any wrongdoing when the storm was officially ruled an act of God. However, due to the ruling, Hatfield’s work was seen as a failure, and he was (mostly) relegated to forgotten pages of history.

Beginning in 1947, General Electric, the U.S. Army Corps, the U.S. Air Force, and the Office of Naval Research began attempting to modify hurricanes. The main scientist behind the research was a Nobel Prize-winning chemist named Irving Langmuir. While working as a chemist with GE, Langmuir began to hypothesize about manipulating hurricanes. In October 1947, the researchers decided to seed a hurricane with dry ice pellets. The hurricane had been drifting to the northeast into the Atlantic Ocean, but after being seeded, the hurricane grew stronger and crashed into Savannah, Georgia.

There was a public backlash and threats of lawsuits against Langmuir and the research team. Despite Langmuir claiming responsibility for affecting the storm, researchers concluded his work did not cause the change in direction. The lawsuits were dropped, but Langmuir continued to work on weather modification. It’s not hard to imagine the U.S. military and General Electric wanting to distance themselves from the destruction by calling their own project a failure. Interestingly, Wikipedia references a 1965 article from the Sun-Sentinel titled “Betsy’s Turnaround Stirs Big Question.” (Betsy was another hurricane reported to have been modified.) The article, written more than a decade later, apparently reports that a hurricane in 1947 “went whacky” and that “[t]welve years later it was admitted the storm had in fact been seeded.” Unfortunately, there is not a digital copy of the article available to verify the claims on Wikipedia.

Most reports on Project Cirrus claim the 1947 hurricane was the only attempt, but a look at records maintained by General Electric indicates there were several more tests on hurricanes. The records list Albuquerque, New Mexico; Mt. Washington, New Hampshire; Burbank, California; and several locations in New York as test sites for cloud seeding with silver iodide. Another section lists cloud seeding attempts in Honduras by Langmuir. The report stated:

“In 1948 and 1949, Langmuir visited Honduras, Guatemala, and Costa Rica to study tropical cloud formations, and particularly to learn what was being done by Joe Silverthorne, a commercial cloud seeder, in seeding clouds for the United Fruit Company. The work was being conducted for the purpose of testing out the possibility of controlling rainfall, and particularly in the hope of stopping blow-downs that result from winds associated with thunderstorms, which occasionally destroy large stands of fruit trees.”

The GE report is well worth your time and attention. It details the contracts between the U.S. military and GE, as well as other historical details regarding GE’s attempts to modify weather.

Beyond Project Cirrus, later attempts at weather modification include Project Stormfury and Operation Popeye. Project Stormfury was a U.S. government project aimed at weakening tropical cyclones by seeding them with silver iodide. From 1961 to 1971, researchers sprayed silver iodide into hurricanes, believing that freezing the storms’ supercooled water might disrupt their structure. Officially, the project was ruled a failure, but it was not the only attempt to manipulate weather in this time period.

One example of seeding a hurricane that may have actually been successful was Hurricane Betsy in 1965. As the Sun-Sentinel reported in 1965:

“Hurricane Betsy was building strength; it looked like it was aiming for South Carolina, posing no threat to South Florida. But on Saturday, Sept. 4, the storm whirled to a stop, about 350 miles east of Jacksonville. When Betsy started moving again on Sunday, she had changed directions. The storm plowed through the Bahamas Monday night, then mauled South Florida a day later.”

Officially, the U.S. government says Hurricane Betsy was designated to be seeded, but that the decision was apparently reversed at the last moment. The National Oceanic and Atmospheric Administration recalled the event on the 50th anniversary:

“Dr. Joanne Simpson, Project Director, had ordered the fleet of Navy and Weather Bureau research aircraft to deploy to Puerto Rico on August 28th.  Over the next two days, the planes monitored the storm’s slow progress toward the designated part of the ocean where they could carry out their weather modification experiments.  By August 31st, Betsy had just managed to crawl into the area as a hurricane, so a seeding experiment was scheduled for the next day.  The first aircraft had already taken off from Roosevelt Roads Naval Air Station, PR the morning of September 1st when word came from the National Hurricane Center that overnight Betsy had completed a loop in its track and was now headed southward and out of the allowed seeding area.  The seeding experiments were called off and the mission changed to a ‘dry run’, where the same patterns were flown but no silver iodide was released into the storm.  Unfortunately, no one informed the press which had been alerted to STORMFURY’s seeding intentions the previous day.”

The press and the public blamed the researchers for the 138 mph winds and destruction from Betsy. Congress was skeptical of further programs until the researchers were able to smooth things over. “I was totally unaware of the level of emotion and hostility that was directed against anything that had to do with cloud seeding,” Joanne Simpson, one-time head of Project Stormfury, told NASA. Simpson would go on to work on a cloud-seeding project called FACE (the Florida Area Cumulus Experiment).

With Hurricane Betsy and the 1947 hurricane, we have two situations where cloud-seeding was reportedly happening, and we have two disastrous outcomes. In both situations, the scientists claimed no responsibility, and no one was held accountable. Again, is it that hard to imagine a government official (or a scientist under government contract) lying about the nature of the work? Especially if that work resulted in millions of dollars in property damage and deaths?

The NOAA even acknowledges that “[s]ince no one at Project STORMFURY nor in the Weather Bureau had advised the public or the press that the actual seeding of the storm had been scrubbed, many people believed it had been carried out and the link to its odd path seemed plausible.  Although attempts to clarify the facts about STORMFURY and Betsy were made after the fact, the notion of a link persists to the present.”

Weather as a Weapon of War

Operation Popeye was a now-declassified attempt by the U.S. military to modify the weather in Southeast Asia from 1967 to 1972. The U.S. military conducted cloud-seeding operations over the Ho Chi Minh trail during the Vietnam War. Cloud-seeding typically involves planes flying overhead and spraying silver iodide into the air. The goal in Vietnam was to extend the monsoon season and flood out the enemy. It was reported that the operations were “tightly controlled” by Henry Kissinger, then serving as National Security Advisor. Operation Popeye is the first modern example (that we know of) where attempts were made to use weather as a weapon of war.

In April 1976, the New York Times wrote about the situation and the challenges weather modification created:

“Can a nation that tampers with natural balances deny responsibility for what follows? This question, together with recognition that United States policy condemns warfare aimed at civilians, prompted Senator Claiborne Pell in 1973 to introduce a resolution calling for an international treaty to prohibit environmental warfare ‘or the carrying out of any research or experimentation directed thereto.’ The Senate voted 82 to 10 to approve the resolution, which lacks force of law.”

The international treaty referred to is the Environmental Modification Convention (ENMOD), implemented and signed by the United States and other nations to halt hostile weather modification in the wake of the bad publicity. The Times noted:

“Unfortunately it is far weaker than the Senate resolution. For example, it fails to prohibit military research or development of environmental‐modification techniques, and allows all ‘peaceful’ work on such things.”

So as long as a nation claims it is conducting peaceful weather modification, it is not violating the treaty. Further, there is no international body to enforce the treaty or punish violations.

The Times also mentions the Department of Defense’s “Climate Dynamics” program, formerly known as Project Nile Blue. A 1976 report by Milton Leitenberg for the Federation of American Scientists elaborates on the origins of Nile Blue: “Beginning in 1969, ARPA, the Advanced Research Projects Agency in the U.S. Department of Defense, began funding a project called ‘Nile Blue (Climate Modification Research),’” Leitenberg wrote.

The Advanced Research Projects Agency (ARPA) was the predecessor to the Defense Advanced Research Projects Agency (DARPA), a secretive agency within the Department of Defense known for developing exotic and emerging technologies for the military. The reports listed above indicate that Project STORMFURY and Project Nile Blue were among the earliest known military operations conducted in the name of manipulating the weather, including hurricanes.

Leitenberg also noted two examples of times the U.S. has been accused of using weather modification against other nations. The first related to alleged cloud seeding over Cuba in 1969 and 1970, in an effort to destroy the sugar crop. In the second case, the director of the geographical research center of the University of Mexico implied that the United States was to blame for the effects of Hurricane Fifi over Honduras in 1974. A story from The Naples Daily News on July 15, 1975, expanded upon this claim:

“Dr. Jorge Vivo, director of the Geographic Research Center of the University of Mexico, said Monday the United States ‘artificially detoured’ the hurricane to Honduras to save Florida’s tourist industry. But Neil Frank, director of the National Hurricane Center in Miami, said Monday night U.S. officials did nothing to alter the hurricane’s path. Vivo told the newspaper El Sol de Mexico he held the United States responsible for 10,000 deaths and millions of dollars in damage caused by Fifi in the Central American nation. He said he believed U.S. weather authorities used silver iodide against Fifi as part of what he called ‘a systematic action’ to change its course.”

More recently, we have seen accusations that the CIA is manipulating the weather. In February 2015, while speaking at the annual meeting of the American Association for the Advancement of Science in San Jose, California, Professor Alan Robock discussed the possibility that the CIA is using the weather as a weapon of war. Robock has conducted research for the Intergovernmental Panel on Climate Change (IPCC) in the past. Robock said he was phoned by two men claiming to be from the CIA, asking whether it was possible for hostile governments to use geoengineering against the United States. Geoengineering is another form of weather modification encompassing a range of proposals for combating climate change.

Despite a lack of concrete evidence to back these claims, we know the military has a history of testing weather modification and has specifically mentioned using the weather as a weapon. For example, in a 1996 document entitled “Weather as a Force Multiplier: Owning the Weather by 2025,” the U.S. Air Force discussed a number of proposals for using the weather as a weapon.

Whatever view you take of these projects, the fact remains that they helped spur the movement towards using computer models to predict the weather. Quite simply, the history of computer weather prediction is intertwined with the military’s attempts to modify the weather. Weather historian James Fleming writes that the two men largely responsible for computer modeling were Vladimir Zworykin, an RCA engineer noted for his early work in television technology, and John von Neumann, a mathematician at the Institute for Advanced Study in Princeton, New Jersey. In 1945, Zworykin was promoting the idea that electronic computers could process and analyze vast amounts of meteorological data and issue accurate forecasts.

“The eventual goal to be attained is the international organization of means to study weather phenomena as global phenomena and to channel the world’s weather, as far as possible, in such a way as to minimize the damage from catastrophic disturbances, and otherwise to benefit the world to the greatest extent by improved climatic conditions where possible,” Zworykin wrote. According to Fleming, von Neumann agreed with this outlook, stating, “I agree with you completely. This would provide a basis for scientific approach[es] to influencing the weather.”

Modern Hurricane Modification

In 2005, following the destruction left by Hurricane Katrina, USA Today wrote:

“In fact, military officials and weather modification experts could be on the verge of joining forces to better gauge, react to, and possibly nullify future hostile forces churned out by Mother Nature.”

On November 10, 2005, Dr. Joseph Golden, a former manager at the National Oceanic and Atmospheric Administration (NOAA) and a veteran of Project STORMFURY, testified before the Senate Subcommittee on Disaster Prediction & Prevention, warning about the need for hurricane modification.

“After the horrendous devastation and loss of life from Hurricanes Katrina and Rita, I have been asked several times about the possibility of hurricane modification,” Golden stated. “I firmly believe that we are in a much better position, both with the science and the undergirding technology, than we were when Project STORMFURY was terminated. The need for a renewed national commitment and funding for weather modification research has become more urgent.”

Golden is also involved in the Hurricane Aerosol and Microphysics Program (HAMP). In 2010, he gave a presentation describing how the Department of Homeland Security asked NOAA to organize a workshop, held in February 2008, on possible new scientific theories and approaches to hurricane modification.

It seems likely that various agencies of the U.S. government began heavily investing in studying weather modification following the destructive hurricane seasons of 2005 and 2008. The idea that the U.S. government could be experimenting with controlling or steering hurricanes may sound like fantasy, but the fact of the matter is the government continues to invest in hurricane modification research. Is it possible that the U.S. government, under the direction of the CIA or the DOD, is working with private industries like General Electric to continue experimenting with weather modification technology? Should the public trust that government officials would fess up to secret experiments?

Japan’s Bomb in the Basement

On Thursday, a shipment of 700 kilograms of plutonium arrived in Japan after a journey by sea from the French port of Cherbourg. That’s enough material for more than 100 nuclear weapons.
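The “more than 100 weapons” figure is easy to sanity-check. A rough sketch follows; the per-warhead amounts are outside assumptions, not from the article (the IAEA treats 8 kg of plutonium as a “significant quantity,” while modern warhead designs are commonly estimated to need roughly 5–7 kg):

```python
# Rough sanity check of the "more than 100 weapons" figure.
# Per-warhead plutonium amounts below are assumptions (IAEA
# "significant quantity" is 8 kg; modern designs are commonly
# estimated at roughly 5-7 kg) -- not figures from the article.

shipment_kg = 700  # plutonium in the MOX shipment, per the article

for kg_per_warhead in (5, 6, 7, 8):
    warheads = shipment_kg // kg_per_warhead
    print(f"{kg_per_warhead} kg/warhead -> ~{warheads} weapons")
```

At 5–7 kg per warhead, 700 kg corresponds to roughly 100–140 weapons, consistent with the article’s claim.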

The plutonium – in the form of atomic fuel known as MOX, a mix of uranium and plutonium oxide – is for use in the Takahama-4 reactor, owned by Kansai Electric Power Co. and located on Wakasa Bay, in western Japan near Osaka.

There have been six shipments of such highly toxic cargo since 1999, the result of an agreement to send radioactive spent fuel from Japan for reprocessing in France and the UK, with the plutonium then shipped back as MOX fuel for use in Japan’s reactors.

Putting aside the reactor fuel issue for the moment, Japan’s plutonium program must be seen in the context of the nuclear arms proliferation dynamic that has existed for decades in Northeast Asia, but which today has taken on even greater urgency owing to North Korea’s nuclear weapon program.

Map of Japan's nuclear plants. Photo: Japan Atomic Industries Forum, 2016.

There is no question that Japan has the technical capability to build an advanced nuclear weapons arsenal.

Over the decades there have been multiple references to it taking less than six months for Japan to build an atomic weapon – a credible timeframe if, as reported more than 20 years ago, a design or designs already exist in the country.

However, to build a ‘credible’ arsenal of weapons would require several years at least.

More important than any actual timeframe are the external factors that would lead a Japanese government to move to nuclear weaponization.

This debate is stirring in Japan. In a TV Asahi program on September 6, former Defense Minister Shigeru Ishiba suggested a review was needed of Japan’s so-called three non-nuclear principles: not producing, not possessing, and not allowing nuclear weapons into Japan.

Ishiba asked: if Japan is under the US nuclear umbrella, isn’t it necessary to allow US nuclear weapons into the country to deter threats from North Korea?

It’s clear that without a peaceful resolution to the underlying security threats in the region, there is an increasing possibility that policy makers in Tokyo – backed by Washington – will decide that Japan should weaponize its plutonium stockpile.


We have not reached that point yet, but without a fundamental change in thinking and policy, Japan’s nuclear bomb in the basement may not remain there for very much longer.

But back to Japan’s plutonium stockpiles and the question of why the only country ever attacked with nuclear weapons – and one that espouses the three non-nuclear principles – has large amounts of the bomb-making material.

To answer that question requires looking back to the 1950s and a policy spearheaded by the United States but soon adopted by Japan’s Science and Technology Agency, established by Yasuhiro Nakasone, who would later become prime minister.

The policy was to build new types of nuclear power plants, so-called Fast Breeder Reactors (FBRs), worldwide that would be fueled with plutonium reprocessed from spent uranium fuel. As FBRs produce more fuel than they burn – hence the name “breeder” – they would in turn generate plutonium to fuel yet more FBRs.

The procedure was known as “closing the nuclear fuel cycle.”

While the idea seems a solution both for dealing with spent fuel and for producing more fuel for FBRs, the problem is that fast breeder reactor programs have failed worldwide, including in Japan.

Japan’s principal FBR started up in 1994 and was called Monju – named after the Buddhist deity of wisdom. However, a fire broke out at Monju 18 months after it opened, shutting the plant down for 14 years.

Monju nuclear reactor. Photo: IAEA Energy/Flickr

It restarted in May 2010, but weeks later a 3.3 metric-ton fuel exchange device fell into the reactor, shutting it down again for good. To add to the fiasco, the plant’s computers were later hacked and data stolen.

This effectively ended Japan’s FBR ambitions, though it took two decades and a total investment of more than US$10 billion for the government to finally make the wise decision to terminate Monju in December 2016.

However, Tokyo had other motives for commitment to a plutonium fuel cycle.

By the late 1960s and early 1970s, plans to build commercial light water reactors across Japan, such as at Takahama and Fukushima, faced strong opposition from local communities and activists.

To appease the opposition, the government and utilities said the new reactors would not become nuclear waste sites because the spent fuel would be shipped for reprocessing in the UK and France. This temporarily solved a major nuclear waste problem, at least for Japan.

In total, over 7,000 tons of such fuel went off to Europe in the decades up to the mid-1990s.

During that time, the plants reprocessing Japan’s spent fuel at la Hague in France and Sellafield in the UK became synonymous with accidents, nuclear waste discharges into the ocean and atmosphere, and public health concerns.

While the Japanese contracts were lucrative for the two state-owned companies that operated the Sellafield and la Hague plants – Cogema/AREVA in France and British Nuclear Fuels Limited (BNFL) in the UK – both were to become failed entities.

The Sellafield site is now managed by a UK government agency and absorbs most of the nation’s nuclear decommissioning budget, estimated at well in excess of US$100 billion.


Sellafield nuclear reprocessing site. Image: Sellafield Ltd.

In the case of AREVA, the company has debts running into the billions of euros and is due to enter final corporate restructuring later this year, with near-zero prospects for future reprocessing operations.

There is another large wrinkle in this tale.

As failures engulfed Japan’s Monju fast-breeder reactor and shut it down, the government had to figure out what to do with the thousands of kilograms of plutonium that would be returning to Japanese shores to fuel a fleet of FBRs that didn’t exist.

The answer, which brings us back to the cargo that arrived in Japan this week, was plutonium MOX fuel that could be used in existing commercial light water reactors.

The first MOX shipments in 1999 were for use in Fukushima and Takahama reactors.

However, in the case of the MOX delivered to Takahama, activists revealed that the fuel had been manufactured with falsified quality certification, leading to its return shipment to the UK.

In the case of the Fukushima plant, citizens from the prefecture, supported by evidence from Greenpeace, took the plant’s owner, Tokyo Electric Power Co. (TEPCO), to court over the quality control of the fuel.

While the citizens’ group lost the case, AREVA was instructed to release vital safety data, which it refused to do. The ensuing controversy led then Fukushima Governor Eisaku Sato to refuse to permit loading of the plutonium fuel.

It sat in the cooling pool at the Fukushima Daiichi reactor until August 2010, when TEPCO finally loaded the 32 assemblies, containing 235 kilograms of plutonium, into reactor unit 3.

This was just six months before the Fukushima plant was hit by the strongest earthquake ever recorded in Japan and flooded by a tsunami on March 11, 2011, causing meltdowns in three reactors, including unit 3.

A worker wearing a protective suit and mask works on the roof of the No.4 reactor building of Tepco's crippled Fukushima Daiichi nuclear power plant in Fukushima prefecture February 20, 2012. Reuters/Issei Kato

Without the actions of Japanese citizens and others around the world, TEPCO would almost certainly have spent the past decade through to 2011 loading many tons of plutonium MOX fuel into the Fukushima Daiichi reactors.

The meltdown of this fuel would have been far more severe and with greater onsite and offsite radiological consequences than the reality at the accident site today, which itself will take decades and tens of billions of dollars to clean up.

Worse still, tons of high temperature spent MOX fuel would have been sitting in Fukushima’s spent fuel pools.

If the Fukushima reactors had been loaded with plutonium MOX, the warning the Atomic Energy Commission gave then Prime Minister Naoto Kan in late March 2011 – that a loss of control at the plant’s spent fuel pools might require the evacuation of Tokyo – may well have become a reality.

Of the five reactors now operating in Japan, three are loaded with plutonium MOX fuel. However, the threat from Japan’s plutonium obsession could be about to get a lot worse.

Japan has built its own US$21 billion nuclear spent fuel reprocessing facility at Rokkasho-mura in Aomori prefecture, near Hokkaido. (Yes, the same Hokkaido that North Korea has recently taken to firing missiles over.)


The Rokkasho nuclear fuel reprocessing plant in Japan’s Aomori prefecture. Wikimedia Commons.

The Rokkasho story sounds more than a little similar to Monju, just more expensive.

Rokkasho was supposed to be completed in 1997, but due to multiple construction and equipment failures it has missed repeated start-up dates. It is now 20 years behind schedule, with a new opening set for 2018.

Rokkasho was built to process spent fuel and produce plutonium primarily for use in fast-breeder reactors – assuming, of course, that it eventually opens.

As pointed out, Japan’s only fast-breeder reactor, Monju, has been permanently shut down – so what happens to the 8,000 kilograms of plutonium Rokkasho was to produce each year?

The answer, it appears, lies in an atomic power plant being built at the northern tip of Aomori prefecture: the Ohma Advanced Boiling Water Reactor.

Now planned to start up in 2024, this reactor is intended to have a full MOX core, which would contain over 5 tons of plutonium, with an annual demand of around 1.7 tons.
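Taking the article’s own figures at face value – roughly 8,000 kg of plutonium separated per year at Rokkasho against about 1.7 tons per year consumed by a full-MOX Ohma core – a back-of-envelope balance shows why the stockpile problem would persist even with Ohma running:

```python
# Back-of-envelope plutonium balance using only the article's figures:
# Rokkasho was built to separate ~8,000 kg (8 tons) of plutonium per
# year, while the Ohma reactor's full-MOX core is said to demand only
# ~1.7 tons per year.
rokkasho_output_t = 8.0   # tons of plutonium per year (article figure)
ohma_demand_t = 1.7       # tons of plutonium per year (article figure)

surplus_t = rokkasho_output_t - ohma_demand_t
print(f"Annual surplus if both run as planned: ~{surplus_t:.1f} tons")
```

Even in this optimistic scenario, more than six tons of separated plutonium per year would have no designated use.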

The safety implications of what would be a unique reactor worldwide operating with a full plutonium MOX core are enormous.

That is one reason why citizens and the city of Hakodate, across the Tsugaru Strait in Hokkaido, have filed court challenges seeking to halt the Ohma plant’s construction. A court judgment is expected later this year.

Like Monju before it, the prospects for the operation of Ohma are dire and unlikely to solve Japan’s self-inflicted plutonium hangover. But that may also be the point – the strategic and national security rationale for the program remains central for a government increasingly nationalistic in tone and outlook.

Under the guise of a civil nuclear program, Japan has become a de-facto nuclear weapons state without so far having to take that next fateful step.

The MOX shipment this week is merely one further fig leaf for a plutonium and nuclear program that was always about so much more than energy.

North Korean leader Kim Jong Un provides guidance on a nuclear weapons program in this undated photo released by North Korea's Korean Central News Agency (KCNA) in Pyongyang September 3, 2017. KCNA via REUTERS

How long can the Japanese government defend such a policy? We may be about to find out in 2018, when the US-Japan Peaceful Nuclear Cooperation Agreement, which provides sanction for Japan’s program, comes up for renewal.

Given the incumbents in the Prime Minister’s office in Tokyo and in the White House, don’t expect much deep reflection (or policy reversal) on what it means for a nation in a region on the edge of major conflict to possess the largest stockpile of weapons-usable plutonium outside the declared nuclear weapons states.

Instead, ending this decades-long, multi-billion-dollar program will, as ever, depend on the dedication of the people of Japan and their allies around the world, concerned as they are with public safety and real security built on peace.

Shaun Burnie is a senior nuclear specialist with Greenpeace Germany, based in Tokyo. He is co-author of “Nuclear Proliferation in Plain Sight: Japan’s Plutonium Fuel Cycle – A Technical and Economic Failure But a Strategic Success,” Japan Focus, March 2016. He has worked on nuclear issues worldwide for more than three decades, including, since 1991, on Japan’s plutonium and nuclear policy.

Nuclear physicist Professor Frank Barnaby is formerly of the UK Atomic Weapons Establishment and was Director of the Stockholm International Peace Research Institute (SIPRI) from 1971 to 1981. Prof. Barnaby testified to the Fukushima District Court in 2000 against TEPCO’s plans for MOX use in Fukushima Daiichi 3, and is the author of multiple books on nuclear weapons design and policy.

Scientists Have Successfully Reversed Advanced Heart Failure in Mice

Studying Heart Failure

A team of researchers from the Baylor College of Medicine in Houston, Texas, has discovered a way to reverse severe heart failure, one of the leading causes of death from heart disease. The discovery, which involves “silencing” the Hippo pathway in the heart, could lead to better treatments for those at risk of heart failure – potentially ones that could eliminate the condition entirely.


Contrary to its name, heart failure (also known as congestive heart failure) is not a condition in which the heart suddenly stops beating. Instead, it’s a condition in which the heart is unable to pump enough blood and oxygen to the rest of the body. It’s most likely to occur in those who have experienced a heart attack, during which blood and oxygen cease to flow to the heart. This lack of oxygen causes part of the heart muscle to die and be replaced by scar tissue formed by cells known as fibroblasts. Over time, the heart weakens to the point of being completely unable to support the body.

To study the condition, the team from Baylor College started by creating a mouse model that could stand in for a human heart suffering from heart failure. Animal models are often used to study the same or similar diseases and conditions that affect people. In the case of heart failure, the hearts of mice and humans are remarkably alike, making the former well suited to studying the heart and testing potential treatments.

“One of the interests of my lab is to develop ways to heal heart muscle by studying pathways involved in heart development and regeneration,” explained Dr. James Martin, Vivian L. Smith Chair in Regenerative Medicine at Baylor and corresponding author of the research. “In this study, we investigated the Hippo pathway, which is known from my lab’s previous studies to prevent adult heart muscle cell proliferation and regeneration.”

Improving the Heart

The decision to hamper the Hippo pathway came as a result of the increased activity that occurs within it during heart failure. According to John Leach, the study’s first author and a graduate student in molecular physiology and biophysics at Baylor College, the team believed that if they could turn the pathway off, the heart might improve. It turns out they were right.

“Once we reproduced a severe stage of injury in the mouse heart, we inhibited the Hippo pathway,” said Leach. “After six weeks we observed that the injured hearts had recovered their pumping function to the level of the control, healthy hearts.”

The Centers for Disease Control and Prevention states that over 5.7 million people in the U.S. live with heart failure, and about 50 percent of those diagnosed are expected to die within five years. While Dr. Martin and Leach’s research yields favorable results, more work needs to be done on hindering the Hippo pathway.

Specifically, turning off the Hippo pathway has two effects: the first is an increased number of muscle cells capable of surviving inside the damaged heart, while the second is altered fibrosis. Fibrosis plays a key role in the scar tissue that forms after a heart attack. Before any tests can be done on people, the team will need a better understanding of these changes in fibrosis. Hopefully that leads to a positive discovery that can save more people sooner rather than later.

Nearly all medical studies are “totally bogus,” warns science writer who echoes the Health Ranger’s criticism of sloppy science


For many years, Mike Adams, known to millions as the Health Ranger and the founder of Natural News, has been blowing the whistle concerning the loss of scientific integrity and the destruction of the fundamental principles that undergird scientific research, all in the name of the almighty dollar. He has exposed the lies and corporate fraud behind GMO foods, the death culture propelling the war on carbon and the government corruption that props up pharmaceutical firms who kill millions with chemotherapy, toxic vaccines and dangerous medications. Adams and his team have valiantly sought the truth, regardless of trolls, paid critics and Google’s thought police. And thankfully, he’s not alone.

Richard Harris, an award-winning journalist who has “traveled to all seven continents” as the science correspondent for National Public Radio (NPR), echoes the Health Ranger’s sentiments in his new book, “Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions.” The level of scientific failure Harris describes, as reported by the New York Post, “costs taxpayers excesses of $28 billion a year.” The science writer turned industry critic says the wasted dollars, wasted time, and abject failure of non-reproducible studies mean that the data and results from any scientific peer-reviewed paper should be taken “with a grain of salt.”

For scientific research to be considered legitimate, other scientists must be able to replicate the results. But that happens in only about half of published studies. Even more troubling, up to two-thirds of what are considered “cutting-edge reports, including the discovery of new genes linked to obesity or mental illness are later disconfirmed.” That’s a fancy word for being exposed as a blatant lie. Here’s one example: thousands of studies published in prestigious scientific journals concentrated on breast cancer, but it was later discovered that the cell line used in all those studies actually consisted of melanoma cells – the wrong cells entirely. Harris lamented, “It’s impossible to know how much this sloppy use of the wrong cells has set back research into breast cancer.”

There are many reasons why modern science is in panic mode and the historically rigorous scientific method has been all but eradicated. Harris believes that all scientists carry an “unconscious bias”: they simply can’t see beyond their own theses. The brutal competition for funding is another huge issue. Government funding has dropped from 33 percent to 17 percent over the past thirty years. Post-doctoral employment is dwindling, giving researchers “a greater incentive to publish splashy counterintuitive studies,” which are often just plain wrong.

The pressure for funding also pushes scientific groups to publish papers that may state a particular hypothesis, even if facts scream otherwise. A prime example of this is the fact that the CDC still proclaims mercury in vaccines is safe, despite evidence to the contrary. All of the Monsanto studies that claim GMOs are perfectly safe are another. The current system also rewards people who are first to postulate a new area of study, even if that research doesn’t stand the test of time.

Stanford professor of medicine John Ioannidis has written about the problems of reproducing scientific results. He has examined “tens of thousands of papers” linking certain genes to diseases like depression or obesity and found that, of all those published papers, only 1.2 percent “had truly positive results.” The professor has also found research that has been cited thousands of times but turns out to be absolutely false. Ioannidis spoke in Chicago in 2016; his speech was titled “Evidence based medicine has been hijacked.”

Yes, it has. And it’s up to us to demand that science and truth are reunited.
