## Cold Fusion – NASA – LENR Part II Flight

The vision of Earth provided by NASA lunar missions is a powerful image; possibly the most potent archetypal image of our times. This image brings to mind the beauty of the biosphere, our world and life-as-we-know-it, surprisingly small against the vast starlit darkness of space.

NASA sees LENR energetics, in concert with advanced computer and flight technologies, as "the key to supersonic transports and neighbor-friendly personal fly/drive air vehicles." (NASA)

This technology could replace much earthbound transport; roads, with their inherent environmental damage, would become obsolete.

NASA realizes the fragility of our biosphere and seeks to limit atmospheric damage from aeronautics and transportation in amazing ways:

• Turbo-electric Distributed Propulsion (NASA pdf)
• The SUGAR Program (SUGAR – Subsonic Ultra Green Aeronautics Research) was initiated in 2008 as a challenge to the four contract recipients: Boeing, GE Aviation, the Massachusetts Institute of Technology, and Northrop Grumman. The goal is a deep reduction in harmful emissions from airplanes and in the noise they produce. "Hybrid electric engine technology is a clear winner because it can potentially improve performance relative to all of the NASA goals." (Boeing)
• The SUGAR Volt design utilizes electric turbofans, which are candidates for LENR electrical power. SUGAR Volt http://www.boeing.com/stories/videos/vid_06_sugarvolt.html
• NASA Green Flight Challenge – "NASA has awarded the largest prize in aviation history, created to inspire the development of more fuel-efficient aircraft and spark the start of a new electric airplane industry. The technologies demonstrated by the competitors in the CAFE Green Flight Challenge, sponsored by Google, may end up in general aviation aircraft, spawning new jobs and new industries for the 21st century." Green Flight Challenge Sponsored by Google – (Final Results 2011)
• “Faster and Greener– Pocket Airports” (NASA GFC pdf)
• Here Comes the Electric Plane http://www.youtube.com/watch?feature=fvwrel&NR=1&v=E7u-JX6AAKo
• Txchnologist The Future of Transportation – “Mapping Out the Future of Flight” (GE)

ON A FALLEN TREE ACROSS THE ROAD (To hear us talk) by Robert Frost

The tree a tempest with a crash of wood

Throws down in front of us is not to bar

Our passage to our journey’s end for good,

But just to ask us who we think we are,

Insisting always on our own way so.

She likes to halt us in our runner tracks,

And make us get down in a foot of snow

Debating what to do without an ax.

And yet who knows obstruction is in vain:

We will not be put off the final goal

We have hidden in us to attain,

Not though we have to seize earth by the pole

And, tired of aimless circling in one place,

Steer straight off after something into space.

##### Rocket Toxicity

Over 4,000 recorded space launches (Wiki) and an unknown number of missile launches have burned hundreds of millions of tons of the following propellants, oxidizers, and rocket components.

Ammonium-perchlorate, kerosene, ammonium-nitrate, hydroxyl-terminated-polybutadiene, polyurethane, aluminum, polyisocyanate, ammonium-dinitramide, acrylonitrile, iron-oxide, glass, carbon, boron, phenylenediamine-terephthaloyl-chloride, poly-paraphenylene-terephthalamide, cyclotrimethylenetrinitramine, cyclotetramethylene-tetranitramine, nitrocellulose, nitroglycerine, hexanitrohexaazaisowurtzitane, polybutadiene-acrylonitrile, unsymmetrical dimethylhydrazine, dinitrogen tetroxide, and others not accounted for.

Who knows how these recombine after combustion, with each other and with atmospheric elements?

The Space Shuttle's solid rocket boosters, which powered its initial liftoff, may have been the most polluting engines ever operated. For each kilogram of payload, the shuttle's main boosters burned 30 kilograms of fuels and oxidizers.

During 135 missions, 122,472,000 kilograms (135,002 tons) of this highly toxic fuel were burned in the solid-fuel boosters of the Space Shuttles.

Approximate Amount Burned (tons)

• 94,365   Ammonium Perchlorate
• 5,600    Powdered Aluminum
• 9,450    Iron Oxide Powder
• 16,204   Polybutadiene Acrylic Acid Acrylonitrile (PBAN)
• 2,646    Epoxy-curing Agent
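As a rough sanity check, the figures quoted above can be cross-checked against each other; this sketch uses only numbers from this article, and the itemized breakdown sums to somewhat less than the stated 135,002-ton total, so treat it as approximate.

```python
# Cross-checking the Space Shuttle solid-booster figures quoted above.
total_kg = 122_472_000   # total solid propellant burned over 135 missions
missions = 135

per_mission_kg = total_kg / missions
print(f"Solid propellant per mission: {per_mission_kg:,.0f} kg")

# At the stated 30 kg of propellant per kg of payload:
implied_payload_kg = per_mission_kg / 30
print(f"Implied payload per mission: {implied_payload_kg:,.0f} kg")

# The itemized tonnage list, summed (slightly under the 135,002-ton total):
breakdown_tons = {
    "ammonium perchlorate": 94_365,
    "powdered aluminum": 5_600,
    "iron oxide powder": 9_450,
    "PBAN binder": 16_204,
    "epoxy-curing agent": 2_646,
}
print(f"Breakdown sum: {sum(breakdown_tons.values()):,} tons")
```

The implied payload of roughly 30,000 kg per mission is at least in the neighborhood of the shuttle's published payload capacity, which lends the 30:1 ratio some plausibility.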

21st Century Timeline of U.S. Rocket Fuel Pollution Scandal (read)

Perchlorate is a powerful oxidant that has been detected in public drinking water supplies of over 11 million people at concentrations of at least 4 parts per billion (ppb). High doses of perchlorate can decrease thyroid hormone production by inhibiting the uptake of iodide by the thyroid. Thyroid hormones are critical for normal growth and development of the central nervous system of fetuses and infants.
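To put the 4 ppb detection level into perspective, a back-of-envelope intake estimate follows; the 2-liter daily water consumption figure is an assumption for illustration, not from the article.

```python
# Back-of-envelope perchlorate intake at the 4 ppb level cited above.
# In water, 4 ppb is equivalent to 4 micrograms per liter.
concentration_ug_per_L = 4.0
daily_water_L = 2.0   # assumed adult daily drinking-water intake

daily_dose_ug = concentration_ug_per_L * daily_water_L
print(f"Daily perchlorate intake: {daily_dose_ug} micrograms")
```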

A Summary of NASA and USAF Hypergolic Propellant Related Spills and Fires (pdf)

The fuel is monomethyl hydrazine (MMH), a compound related to ammonia, and the oxidizer is nitrogen tetroxide (N2O4). Both fluids are highly toxic, and are handled under the most stringent safety conditions. Hypergolic propellants are used in the core liquid propellant stages of the Titan family of launch vehicles, and on the second stage of the Delta.

The Space Shuttle orbiter uses hypergols in its Orbital Maneuvering Subsystem (OMS) for orbital insertion, major orbital maneuvers and deorbit. The Reaction Control System (RCS) uses hypergols for attitude control.

NASA is hoping to reduce launch emissions for space flight with LENR.

Cold Fusion – NASA – LENR Part Three Earthbound and Spacebound Transportation

## Exponential production using LENR, LENT, and 3D Printing

Exponential Growth is an immensely powerful concept. To help us grasp it better, let us use an ancient Indian chess legend as an example.

The king was a big chess enthusiast and had the habit of challenging wise visitors to a game of chess. One day a traveling sage was challenged by the king. To motivate his opponent the king offered any reward that the sage could name. The sage modestly asked just for a few grains of rice in the following manner: the king was to put a single grain of rice on the first chess square and double it on every consequent one.

Having lost the game, and being a man of his word, the king ordered a bag of rice to be brought to the chess board. Then he started placing rice grains according to the arrangement: 1 grain on the first square, 2 on the second, 4 on the third, 8 on the fourth, and so on.

Following the exponential growth of the rice payment, the king quickly realized that he was unable to fulfill his promise: on the twentieth square the king would have had to put 1,000,000 grains of rice; on the fortieth square, 1,000,000,000 grains of rice. And, finally, on the sixty-fourth square the king would have had to put more than 18,000,000,000,000,000,000 grains of rice, which is equal to about 210 billion tons and is allegedly sufficient to cover the whole territory of India with a meter-thick layer of rice. At ten grains of rice per square inch, the above amount requires rice fields covering twice the surface area of the Earth, oceans included. ( http://www.singularitysymposium.com/exponential-growth.html )
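The legend's arithmetic can be verified directly. The grain counts below follow from the doubling rule; the mass conversion assumes roughly 12 milligrams per grain of rice, an illustrative figure that lands near the article's ~210 billion tons.

```python
# The chessboard rice legend, computed directly.
grains_last_square = 2 ** 63       # grains on the 64th square alone
total_grains = 2 ** 64 - 1         # sum over all 64 squares

print(f"64th square: {grains_last_square:,} grains")
print(f"All squares: {total_grains:,} grains")

# Assumed mass per grain: ~12 mg (illustrative, not from the article).
grain_mass_g = 0.012
mass_tons = total_grains * grain_mass_g / 1e6   # grams -> metric tons
print(f"Approximate mass: {mass_tons:,.0f} tons")
```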

Exponential Growth is a difficult concept to imagine, but it is extremely powerful. This paper is devoted to utilizing three technologies to achieve exponential production. The conceptual framework I am constructing is not just abstract, but reality-based and achievable. In other words, this paper is a prescription for an almost unimaginably powerful result.

The three factors of production required for industrialization are capital, labor, and land. In other words, goods devoted to the production of other goods, activity that provides the goods, and raw materials.

“As opposed to traditional “subtractive” methods of carving or sculpting, 3D printing is an “additive” method of manufacturing that builds up solid objects one thin layer at a time. The basic concept is the same as an inkjet printer, only instead of spraying ink onto paper, 3D printers use liquids that solidify or set. Liquid plastic or resin are the usual materials, but there are others: for example, industrial 3D printers can make metal objects by laying down a pattern of metal powder and then fusing it with a high-powered laser or electron beam. You can 3D print in ceramic, glass, or concrete or other composite materials by depositing layers of sand or gravel and then spraying a binding agent…Truly revolutionary advances often come quietly at first, and I believe this is one of these. 3D printing as a technology is in its very early stages, but even in what it’s accomplished so far, we can glimpse the contours of the future.”
(http://bigthink.com/daylight-atheism/weekend-coffee-the-3d-printing-revolution). 3D printer technology is the quintessential “goods devoted to the production of other goods,” a virtual magic lamp.

Low energy nuclear reaction (LENR) is a clean, very cheap, and super-abundant energy technology using (for instance) nickel and hydrogen. It has been estimated that hydrogen in a nickel lattice, subjected to heat, pressure, and a vibrational wave (a so-called "Q-wave"), produces 355,000 times more heat than an equal amount of gasoline. Furthermore, hydrogen and nickel are very plentiful elements in our universe. Such plentiful energy, utilized by mechanical devices, would fuel the activity that provides the goods.
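Taking the 355,000× figure at face value (it is a claim reported here, not an established number), we can at least check what energy density it implies; the gasoline energy density is a standard reference value.

```python
# What the claimed 355,000x gasoline figure would imply for energy density.
gasoline_MJ_per_kg = 46.0        # typical energy density of gasoline
claimed_multiplier = 355_000     # the article's claim, not an established number

implied_MJ_per_kg = gasoline_MJ_per_kg * claimed_multiplier
print(f"Implied energy density: {implied_MJ_per_kg:.2e} MJ/kg")

# For scale: complete fission of U-235 releases roughly 8e7 MJ/kg,
# so the claim sits at nuclear, not chemical, energy scales.
```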

Finally, low energy nuclear transmutation is a technology to turn one element into another. In the nickel lattice of a LENR reactor, protons and electrons from hydrogen atoms collide, forming ultra-low-momentum neutrons. Those neutrons are absorbed by surrounding atoms. The atoms that have absorbed the extra neutron(s) shed heat and transform into other elements. Such a method of producing one element from another would allow the production of any needed raw material.
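The neutron-capture picture described above (essentially the Widom–Larsen proposal) can be written as a reaction chain; the nickel isotope chosen below is illustrative, not specified in the article:

```latex
\begin{align}
e^- + p &\rightarrow n + \nu_e && \text{(ultra-low-momentum neutron formation)}\\
{}^{64}\mathrm{Ni} + n &\rightarrow {}^{65}\mathrm{Ni} && \text{(neutron capture, heat shed)}\\
{}^{65}\mathrm{Ni} &\rightarrow {}^{65}\mathrm{Cu} + e^- + \bar{\nu}_e && \text{(beta decay into a new element)}
\end{align}
```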

Sounds like science fiction. To digress slightly, I was first clued in to the practicality of 3D printing when I was researching a new noble gas engine: "They recently purchased a 3-D printer in order to build the proper pipe sizes on which to wind the coils that go on their cylinders (and to play around). The 3-D printers only cost $5000. I remember back when we paid that much for a regular copy machine," he said. ( http://theeestory.ning.com/forum/topics/inteligentry-manufacturers-gearing-up-for-noble-gas-engine-roll ). The main content of the article (about an engine that runs on a "plasmic transition" process using noble gases to create the plasma) is amazing enough, but I didn't realize how far 3D printer technology had come. "President Obama's nationwide push for innovation in manufacturing reaches across agencies from the National Science Foundation to the Department of Energy, and now it's reaching all the way into the Pentagon, where $60 million is being set aside for investment in 3-D printing technologies. The DoD will fund a network of agencies, academic institutions, and companies to build on 3-D printing tech with the overarching goal of building aerospace and weapons technology faster.

Of that $60 million, half will be allotted to researchers between now and fiscal 2014, with more than half of that, some $18.8 million, being handed over in fiscal 2012 alone. That means, adjusting for the usual bureaucratic waste, there should be somewhere between many and many-many millions spent to advance 3-D printing tech this year alone under a framework that will hopefully push for the meeting of meaningful benchmarks.

Three-dimensional printing (or additive manufacturing, or rapid prototyping) is of course a fairly nascent technology that nonetheless holds great promise. While private companies like Makerbot, Stratasys, and even Hewlett-Packard have pushed the boundaries of the technology by developing less expensive printing systems accessible to more people, the industry on the whole hasn't really benefited from a huge injection of investment or a meaningful mandate from a body like the DoD, one that, when it puts its mind and money to something, can actually enable technological leaps forward.

The 3-D printing industry was already doing fine (some analysts expect it to grow to $3.1 billion by 2016) but a little help from Uncle Sam can't hurt." ( http://www.popsci.com/technology/article/2012-05/pentagon-investing-millions-advance-future-3-d-printing-tech ).

While the "overarching goal of building aerospace and weapons technology faster" is a short-term goal that is easily achievable, a much more powerful goal would be for 3D printers to produce other 3D printers, increasing production capacity exponentially. Suppose, for example, a 3D printer and toner were transported to an asteroid. The printer could start manufacturing equipment to mine for fuel and raw materials and, as production scaled up, could duplicate the production economy. One production economy, two, four…limited only by time and land. And because 3D printers transform digital data into goods, innovation and adaptation from afar can be input via telecommunications, re-directing a production economy to produce different goods (like spacecraft to transport a spare 3D printer and toner to another piece of extraterrestrial land to repeat the cycle).

This is where I go off the deep end. Remember, we started this paper with a story of Exponential Growth ("And, finally, on the sixty-fourth square the king would have had to put more than 18,000,000,000,000,000,000 grains of rice, which is equal to about 210 billion tons and is allegedly sufficient to cover the whole territory of India with a meter-thick layer of rice."):

"In astronomy and cosmology, dark matter is a type of matter hypothesized to account for a large part of the total mass in the universe. Dark matter cannot be seen directly with telescopes; evidently it neither emits nor absorbs light or other electromagnetic radiation at any significant level. Instead, its existence and properties are inferred from its gravitational effects on visible matter, radiation, and the large scale structure of the universe.
Dark matter is estimated to constitute 84% of the matter in the universe…" ( http://en.wikipedia.org/wiki/Dark_matter )

I am making the following outrageous and totally unintuitive claim: given the above scheme for exponential production, the dark matter could plausibly be spaceships filled with aliens. That is how powerful Exponential Growth is (both in terms of population and production).

To summarize: Exponential Growth is an immensely powerful concept. The three factors of production required for industrialization are capital, labor, and land. Using 3D printing, LENR, and LENT, exponential production potential can be achieved. While each of these is a fairly nascent technology, they nonetheless hold great promise. Furthermore, you can expect that tremendous R&D resources will be directed at each of these technologies in the future, and a little help from Uncle Sam can't hurt. Finally, using the concept of Exponential Growth, you can even plausibly explain such inscrutable things as dark matter. We really are limited primarily by our imagination. The future is so bright we'll have to wear shades; the only catch is we have to believe.

## Session 462 Advanced Concepts: LENR, Anti-Matter, and New Physics

[latexpage]

On Friday, March 23, I attended Session 462 Advanced Concepts: LENR, Anti-Matter, and New Physics of the Nuclear and Emerging Technologies for Space conference, one day after speaking with George H. Miley, who would be presenting A Game-Changing Power Source for Spacecraft at the session. Part 1 of the event was an account of my talk with Professor Miley. Part 2 continues with this paraphrase of the four talks included in Session 462. Unable to obtain video of the event, I based this summary on an audio recording. The Session Chair was Harry "Sonny" White, a member of the Johnson Space Center Advanced Propulsion Team, and he put together an ambitious group of speakers.
Spacecraft power and propulsion systems based on LENR, anti-matter, and quantum vacuum fluctuations are clearly being eyed by NASA. A new laboratory, Eagleworks, to research "speculative" technologies like the quantum vacuum thruster is currently being set up at the Johnson Space Center with the participation of Dr. White. Moving from chemical energy to clean and abundant cold fusion for domestic energy needs has its parallels in space technology: humans will flip into another arrangement for living on Earth and in space, as propulsion systems based on new energy drop travel times to the outermost planets to six months, and humans become able to travel to the nearest star within a lifetime.

Following the talks, each speaker got a Cold Fusion Now sticker, and I left a few on the table for others to pick up. Note that this piece has Graphic Scientific Content!

##### Y. E. Kim – Cryogenic Ignition of Deuteron Fusion in Micro/Nano-Scale Metal Particles (3006.pdf)

Y. E. Kim gave his talk on Cryogenic Ignition of Deuteron Fusion in Micro/Nano-Scale Metal Particles, which described a Bose-Einstein Condensate Nuclear Fusion theory for cold fusion and suggested experiments to test his hypothesis. Professor Kim originally rejected the claims of cold fusion, but his interest was re-kindled by the depth of experimental research over the years, as well as a Defense Analysis Report that supported the claims of low-energy nuclear reaction (LENR) scientists. His work was motivated by the 1929 discovery of A. Coehn showing that protons move through metal as ions. This fact leads to many scenarios, one of which is high-density deuterons forming a Bose-Einstein Condensate (BEC).
He acknowledges that BECs are known to form only at very low temperatures, saying, "We will see if that can happen in a metal as well." If a BEC can form inside the metallic lattice at room temperature, then Professor Kim describes how LENR could be modeled using standard physics thereafter, with no new physics required. He starts with Schrödinger's equation and uses traditional quantum theory to get his solutions. The only unknowns are $\omega$, the rate of forming condensate, and the strength of the nuclear force $S$, two parameters that can be determined by performing a proposed experiment.

Professor Kim made a distinction between physics that occurs in free space and physics that occurs in a bound environment like a metallic lattice, citing the experimental fact that high-density deuterons can form clusters in materials. Supposing that the condensate can form, then all of the deuterons will go into the ground state. "If that state happens, it's a coherent one-state, and the deuterons behave like one object. That's a big difference from free space, in that within a lattice, you can form this sort of condensate."

His theory can explain many of the experimental observations, as well as the three miracles of cold fusion, referring to the failure of experimental results to meet the expectations of conventional hot fusion theories, as cited by John Huizenga in his derisive and obsolete work Cold Fusion: Scientific Fiasco of the Century. The three miracles are: i) the lack of strong neutron emissions; ii) the mystery of how the Coulomb barrier is penetrated; and iii) the lack of strong emission of gamma rays or X-rays. For instance, the lack of gamma radiation violates conservation of momentum in free space.
But LENR does not occur in free space, and Professor Kim says, "If the entire condensate takes up the energy and shares the momentum, and you no longer have to satisfy conservation of momentum, it has to explode like a star." He notes the micro-craters that have been observed in many experiments, indicating possible tiny explosions.

He calls for three experiments to be conducted to test his hypothesis. The first experiment would determine whether or not a BEC can indeed form inside a metal at room temperature. If a BEC forms, you can then measure the velocity distribution of the deuterons with low-energy neutron scattering or high-energy X-ray scattering off the deuterium in the metal, as was done in the atomic case. As a second experiment, Professor Kim would like to know if deuterium diffusion occurs faster than proton diffusion when a condensate forms. He expects that to occur. Experiments number 1 and number 2, if confirmed, would be a new discovery.

The third experiment Professor Kim calls for is a little more 'practical'. He is proposing an experiment for the National Ignition Facility (NIF) at Livermore, where they have been cooling deuterium-tritium spheres. The spheres are targets for lasers in their attempts to induce nuclear fusion. By cooling the spheres, they can get a perfect sphere, which helps the implosion needed to induce fusion for this type of system. Says Professor Kim, "We can take advantage of that cooling system and reaction chamber already built, and produce deuterium nano-particles in a 1-cm sphere, and by applying an appropriate oscillating electromagnetic field at a low temperature, make them explode." His formula tells him that a 1-cm sphere filled with deuterium nano-particles will provide $10^{19}$ reactions per second. Designing the system to be slow-burning can provide power as rocket thrust.

In order to succeed with these experiments, Professor Kim says, "It could take 5-10 years to come up with a mature system.
But if you wanted to do it right away, you could do a Manhattan-type Project and do it in a few years." "If we succeed, this is a potentially revolutionary, disruptive technology for the world."

His excitement about LENR as both a science and a technology was palpable. He had a sense of humor too. When I asked to take his picture at the end of his talk, he laughed and said, "Sure, but I don't want to be a news media!"

##### G. H. Miley – A Game-Changing Power Source Based on Low Energy Nuclear Reactions (LENR) (3051.pdf)

George H. Miley spoke next on A Game-Changing Power Source Based on Low Energy Nuclear Reactions (LENR). After acknowledging his co-authors Xiaoling Yang and Heinz Hora, he began with a brief history of the previous experiments that motivated his current gas-loaded nano-particle research, for which he envisions many applications, one of which is a heat-producing reactor to replace plutonium-238 in RTGs.

His cells today are quite a deviation from the original Pons and Fleischmann cold fusion. It was the announcements coming out of Italy and Greece about commercial megawatt units using hydrogen and nickel nano-particles that inspired him to work on his current designs. "Since I'd been working on this since the days of Pons and Fleischmann doing low-level physics research, I decided well, I would jump into that too." But instead of megawatt plants, Professor Miley likes to do things small scale, "so we've been doing about 100 Watt studies because we can get at them and do a quick turn-around."

He related his research to Y. E. Kim's Bose-Einstein Condensate hypothesis. He believes that deuterium clusters are somewhat similar to BECs, in that they interact through multi-body reactions. "But in the early days, Heinz Hora and I were led by a different theory; we talked about Swimming Electron Layers, which were being made by putting together multi-layers of thin-films with different Fermi levels in the thin films.
You get a very large electron density at the interface, and that was to help overcome the Coulombic repulsive force between the charged particles coming together, and allow reaction rates to occur."

Dr. Miley then devised his unusual thin-film electrode design. He arranged the anode and cathode so that the electric field would run parallel to the layers, causing electro-migration of the protons through the length of the thin films. "It was general wisdom at that time that you need a high loading of the reacting species in the metal, and you also need a flow of them; it's a dynamic process," he said. "So this was to accomplish both of those, and it worked." Changing the voltage applied to the system changed the electric field over time and caused protons to flow through the metal, resulting in 20-30% excess heat in his experiments, excess heat being the measured amount of heat out minus the equivalent amount of heat put into the cells by electrolysis.

But excess heat was not the only effect Professor Miley witnessed. "I spent something like two years noting that there are a number of reactions taking place which led to transmutation of metals in the thin films themselves, so this wasn't a normal fusion experiment. In fact it's very complicated. You form these compound nuclei, some of them actually fission. You end up with a variety of metals including copper and iron and various things inside these plates."

By using CR39 and plastic tracking, his team looked for energetic charged particles generated by the cell. "We had something like a couple MeV protons and 12 MeV alpha particles coming out of this, but at a very low rate. Those reactions seem to be side reactions not accounting for heat, which is mainly accounted for by transmutation reactions in the metals.
So this is quite a bit different from the original cold fusion."

After a long time looking at the craters and pits left in the films, "It suddenly began to sink in that this reaction is occurring in local spots; it isn't uniform everywhere." "So I began thinking we have to make more of these local spots, whatever they are. It's been called 'nuclear reactive spots', but no one knew how to make them. Our way to make them was to purposefully make thin-films with voids and dislocations by working them."

"What happens is in that region where you get a very high density of hydrogen or deuterium, you get a condensation of the type that Dr. Kim was talking about in the previous talk. There may be as many as a thousand atoms there. That is quite remarkable." Superconducting Quantum Interference Device (SQUID) measurements show that this region is superconducting at temperatures below 70 degrees. Says Professor Miley, "Forget about the superconducting, that means it's darn dense."

"When we heat the samples under vacuum, driving the deuterium and hydrogen out, we find that normally, if we don't treat it this way and make the defects, you get a broad temperature range. With the defects, you get this coming out right here. You have to heat it up to a temperature corresponding to 0.6 eV or higher. Our present ones are higher." "That indicates the binding energy for the cluster; we call this grouping of metallic-density atoms a cluster. The name of the game is to get lots of those clusters."

"We decided about this time to change gears and try to do somewhat the same thing with nano-particles. The logic is, if you have plates, most of these clusters were forming in damaged spots near or around the surface. If we have nano-particles you can get them [the deuterons] all around the surface of the nano-particles.
You pack the nano-particles in, and you just get a lot more sites per unit volume than the planar configuration."

"It was during that time that the Rossi announcements were coming out, and they were getting great results with this. Prior to that though, people in Japan had very interesting results also. I think they instigated this approach, Arata, Takahashi and others. Actually we were following the Japanese work more than the Italian work."

"Now our experiments are simple, even undergraduate students can do it, which are all I have!" he laughed. A 25-cm long tube gets filled with 23 grams of nano-particles. That goes into another vacuum chamber, mainly for temperature control to limit the amount of heat transfer slipping out. "We have the cylinder of deuterium or hydrogen here. After pumping it down to a vacuum, you load it with gas. Hopefully if you do all that right, it heats up, and you're off and running."

"We're studying various types of nano-particles. That's probably the most difficult of all of this. We have four different alloys; nickel-rich alloys for hydrogen gas reactions and palladium-rich alloys for deuterium ones, and we make those ourselves."

"What happens is really intriguing here with this palladium-rich nano-particle run. It's a little confusing to see three different sets of thermocouples, but you can just ignore that, we have thermocouples in different places. When you first start loading, the temperature jumps up (we're going up to four atmospheres), and the first part of that we attribute to the deuterium going into the palladium, which is exothermic, so it's a chemical heating, but you get more out than that."

"The really intriguing part is when we desorb by reducing pressure suddenly, after a couple hundred seconds. Normally that's endothermic, the temperature should drop rapidly, but instead it goes up." "And I attribute that to, as I said, all of our systems have a flow.
We have to keep creating a flow that has a diffusion of deuterium or hydrogen into the cluster so we can transfer momentum to create the reaction."

"I think we've accounted for all the chemical reaction energy, maybe 700 Joules. We measured 1400-1500 Joules, and so calculating over that short period of time, that's 350 Watts per kilogram at 4 atmospheres." "Now that was a very short time period. If you time-integrate all that, you're getting a considerable number of joules out of what we're calling LENR reactions compared to chemical reactions, and you might say, well, that's very interesting, but we certainly want it to last longer."

"What's happening is, if you don't change the pressure, if you don't control it so to speak with pressure changes, then you don't have the flow. Once you just pressurize and stop, you stop that flow as time goes on. So this is expected."

"Incidentally, reactions are continuous. You might think that maybe the reaction stopped here and it's cooling down. It's not cooling as fast as it would if the reactions weren't continuing. But they're continuing at a low pace and aren't able to keep up. So the dashed line is what would happen if there were no reactions over that same time period."

"This depends in another way on loading. There is a lot to do with the thermal characteristics, the temperature across the bed. You want to control the temperature profile, and you want to control the pressure pulses."

"If this works, it certainly would be game-changing, not only for space but for other uses." One application Professor Miley thought would be "really neat" was the General Purpose Heat Source (GPHS) for a Radioisotope Thermoelectric Generator (RTG). There's a lot of concern from spacecraft engineers about future sources of plutonium-238.
"This would fit right in there," says Professor Miley, "and it's really neat that the energy conversion part is already there." "If you take the numbers from our experiments, this GPHS is the standard power source with 3 kilowatts. We get about 3 kilowatts from 3 kilograms, although the volume must be larger with the way we pack the nano-particles in."

"But there are many issues. One of the big ones is control over a steady state for long periods of time, like months, years and so on. Deep space probes are supposed to run for decades, and I've shown you the nano-particles after a run, so I'm not sure if I've bitten off more than I can chew when I thought of this. But it doesn't take much imagination to think of other applications that would run for a year or two years and you'd be happy, so I think that's more of what might develop." "And the second one is you don't want the nano-particles to disintegrate over that time period and become unproductive, or you have to figure out a way of exchanging them periodically."

"In summary, this gas-loading appears to be exceedingly energetic. We've put virtually no energy in. If we're right about ruling the chemical energy contributions out, then we're getting extremely high gains of excess heat. This is a very significant power source; a new type of nuclear power source." "We've become very enthusiastic about this, so we've formed a company, LENUCO, to try and commercialize this. So we're really serious about this."

At the conclusion of his talk, Professor Kim asked a question: "Since you are forming a company, you are not going to disclose how you make the nano-particles?" "I can tell you roughly how we make them. That is, we first dream up an alloy we think is good, and have someone make that alloy, and then we do a special heat treatment of these nano-particles, and then we do some stressing of them to try to form these voids.
But you’re right, the company now has a patent on this, so the details I can’t disclose yet!” Session Chair Harry “Sonny” White – Advanced Propulsion Physics: Harnessing the Quantum Vacuum 3082.pdf with P. March As a member of the Advanced Propulsion Team at Johnson Space Center, Harry “Sonny” White has been researching multiple forms of advanced electric propulsion systems with the goal of integrating them into the architecture for human space travel. The team has experimented with some of the new and emerging solar panel technology, and with high-power electric propulsion and power systems using the International Space Station as an experimental test bed and to provide re-boost. He has also looked at modular free-flyer space systems to provide a platform on which they could potentially “evaluate some of the emerging forms of propulsion technology”. The long list of prior projects has given Dr. White a good feel for some of the metrics and performance characteristics that these systems need to have in order to support human spaceflight. He made a point of saying how impressed he was by Professor Miley’s specific force numbers for his thruster outlined in a previous talk. Specific impulse (Isp), mathematically speaking, is the number of seconds over which 1 lb of fuel can provide 1 lb of force. In conceptual terms, Isp is the efficiency with which a rocket can convert chemical (or nuclear) energy to kinetic energy. In terrestrial terms, Isp can be thought of as a form of “miles per gallon” for a rocket motor, according to Dr. White in his Revolutionary Propulsion & Power for the Next Century of Space Flight Von Braun Symposium talk from October 2009. [.pdf] Dr. White is part of an effort at Johnson Space Center to implement an advanced physics propulsion laboratory named Eagleworks to “pursue advanced physics concepts emerging in the literature”.
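The "seconds" definition of Isp above can be made concrete: in SI terms it is just an effective exhaust velocity divided by standard gravity. A generic physics sketch, not from the talk:

```python
G0 = 9.80665  # standard gravity, m/s^2

def exhaust_velocity_m_s(isp_seconds: float) -> float:
    """Effective exhaust velocity implied by a specific impulse: v_e = Isp * g0."""
    return isp_seconds * G0

# A chemical engine at Isp ~ 450 s (the figure quoted later in the article)
v_e = exhaust_velocity_m_s(450)
print(f"v_e = {v_e:.0f} m/s")  # ~4413 m/s
```

The higher the Isp, the faster the exhaust, and the less propellant is needed for a given velocity change.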
“When you pursue things that fit into the category of speculative physics, you have to be very careful about what you’re doing; you have to be rigorous, do your due diligence, take your time and try to be your own worst critic. So we are setting up some facilities at Johnson Space Center that are very high fidelity systems that try to work in the realm where physics and engineering overlap”, he says. “We have a low-thrust torsion pendulum that we’re putting together, and it is going to be the backbone of some of the activities that I’m going to be talking about to you today. We also have an interferometer that I’ll be using to measure some relativistic effects; not the subject of my discussion today, but it fits in the realm of speculative physics.” “We’ve refurbished an older test article to try and exercise it in this higher-fidelity laboratory setting. This is a low-thrust torsion pendulum. We’re working to try to get the detection threshold down to the single micro-newton level. Now the thrust is much higher than that, but we want a good signal-to-noise ratio. This is in a vibration isolation facility, and you really need that type of setup; otherwise trucks driving down the street can introduce signals that you can see if you don’t have the proper vibration isolation.” The team is developing an even larger thruster. “We’ve had some experience in the 4000 micro-newton range with around 10 Watts of input power”, said Dr. White. “But we’re trying to get more experience across a broader number of input parameters to help us understand if we have a good handle on the physics and engineering.” “We’re always keeping an eye on potentially using this for propulsion systems for human spaceflight.
Some of the specific force numbers are very competitive when you’re looking at Hall thrusters, so we’re looking to see if there’re places these can be used for human spaceflight and what type of missions they would enable if this technology is successful.” “Can the properties of the quantum vacuum be used to propel a spacecraft?”, he asked, noting that it is not a new question. Arthur C. Clarke had earlier coined the term quantum ramjet drive. Clarke’s perspective was that “If vacuum fluctuations could be harnessed for propulsion, then certainly our lives would be a lot easier for human space exploration.” “When we view this question through the ‘classical muscle memory’ in engineering, the answer to that question is no, because there is no reaction mass that can be used to conserve momentum. You have to conserve momentum; you have to leave a wake.” “However, when you look at things from a quantum perspective through QED, a very successful model, it also predicts that the quantum vacuum in the lowest energy state is not empty, but rather is a sea of virtual particles and photons that pop in and out of existence, stemming from the Heisenberg Uncertainty Principle.” One of the earliest vacuum models, from Paul Dirac, actually predicted the electron’s anti-particle, the positron, in 1928; it was later confirmed by Carl Anderson in 1932. In 1948, Willis Lamb was measuring energy levels associated with the hydrogen atom, and when he realized they were slightly different from prediction, it turned out there were some contributions from the vacuum field that reconciled that issue. Another indication that the quantum vacuum can have classical measurements in a lab comes from Casimir‘s derivation of the Casimir Force in 1948. “Dr. Miley’s earlier talk mentioned Eric Allin Cornell, who was the first gentleman to actually produce a Bose-Einstein Condensate, and is now researching the Casimir-Polder Force at Rice University.
He started off a recent talk by saying that ‘If the zero point field is not real, he wouldn’t be here talking about the results he was presenting'”. “What’s the Casimir Force? Thinking from a classical perspective, if you could put two conducting plates in a vacuum chamber with some distance between the two, and you were able to produce a perfect vacuum, then as the plates get closer and closer (it actually happens the whole time, but the force doesn’t become measurable until you get extremely close), the narrowing gap precludes certain wave modes of photons and particles, which cannot appear between the plates.” “So even though you may have a perfect vacuum on the outside, from a classical perspective we think there’s no difference in the vacuum level between the two plates, but when you look at the quantum perspective, it is different; there is a negative pressure between the two plates.” “And this has been measured a number of times over the years”, continued Dr. White. “As we start to make more products that fit into this category, we’re starting to see more issues where the classical and quantum tend to overlap, and we actually have to factor that into the design process. So there are some scenarios where, at these sizes, you can also incur things like friction between surfaces that have to move relative to one another.” “So the quantum vacuum is not empty per se. Now we ask, how much energy is available in the quantum vacuum field to do something with?” The predicted energy density in the quantum vacuum is given by an integral equation. But, says Dr. White, “Although QED is one of the most successful theories, it’s also responsible for one of the worst predictions in physics. “When you compute this integral from zero to the Planck frequency, it calculates an extremely high energy density.
But when we compare that predicted energy to the observed critical density in the cosmos, $9.9 \times 10^{-27}$ kilograms per cubic meter, there’s a vast difference between the two, many many orders of magnitude.” “However, the difference between the predicted and observed values is not understood, so there are some interesting things we can learn in that area,” he added. So is there a way to utilize this sea of virtual particles and photons to transfer momentum from a spacecraft to the vacuum? There have been many ideas over the years: the vacuum sail, a type of ‘solar sail’ for the quantum vacuum; inertia control, by altering the vacuum energy density and reducing total spacecraft mass; and then the focus of Dr. White’s interest, dynamic systems that make use of the Casimir Force to generate a net force. He described the dynamic Casimir force as “resulting from Unruh radiation, whereby an accelerated observer sees the effective temperature of the surrounding vacuum increase; there’s an equation that calculates how they perceive that, so that the vacuum actually takes on a higher temperature, and appears to be a warm photon bath.” “You may have heard of Hawking radiation”, he said.
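The mismatch Dr. White describes, often called the "vacuum catastrophe", can be reproduced with the textbook zero-point estimate by cutting the mode integral off at the Planck frequency. This calculation is standard physics, not taken from the talk itself:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2

# Planck angular frequency, used as the integration cutoff
omega_planck = math.sqrt(C**5 / (HBAR * G))

# Integrating the zero-point energy (hbar*omega/2) over all modes up to the
# cutoff gives a mass density rho = hbar * omega_max^4 / (8 * pi^2 * c^5)
rho_predicted = HBAR * omega_planck**4 / (8 * math.pi**2 * C**5)

rho_observed = 9.9e-27   # kg/m^3, the critical density quoted above
orders_off = math.log10(rho_predicted / rho_observed)
print(f"predicted {rho_predicted:.1e} kg/m^3, off by ~{orders_off:.0f} orders of magnitude")
```

The prediction overshoots the observed value by roughly 120 orders of magnitude, which is why this is routinely called the worst prediction in the history of physics.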
“If you have a black hole, and a pair of virtual particles is created right on the horizon, where one particle goes inside the horizon, and one particle goes away from the horizon, then the black hole’s total mass is actually reduced by one particle, because one of the particles went in and annihilated with something inside the black hole.” “The simplest mechanism to think about this from a practical application perspective would be generating thrust by the use of vibrating mirrors, where the mirror would accelerate more in one direction than it would in the other.” The dynamic Casimir force was potentially observed in the lab in 2011, and the magnitude of thrust from a dynamic Casimir force has been derived quite a number of times in the literature, but it has been found to be very small. “So while it’s theoretically possible”, says Dr. White, “it’s very small.” “Another way to think of this is that you have to leave wakes; a submarine doesn’t carry water with it, it uses a propeller to couple with a medium. Maybe overly simplistic, but I think people can understand. I think that’s why Arthur C. Clarke talked about a quantum ramjet, just to help people draw analogies.” Are there ways we can increase the net force from this dynamic Casimir force? Dr. White summarized a few claims resulting from the work that he’s been doing at the Johnson Space Center: Claim 1 – The observed vacuum fluctuation density based on cosmology is $10^{-26}$ kilograms per cubic meter. This relationship predicts that, in the presence of conventional matter, we can increase the local vacuum fluctuation density. “What this suggests is that in the presence of a barium titanate capacitor, the vacuum field energy density is going to be in a slightly different state than what it would be otherwise. So this equation right here [see Figure 1 Equation 1], this is the free vacuum state, this is the local density of matter.
And that’s what the altered vacuum state is.” This takes the vacuum fluctuation density up from $10^{-27}$ kilograms per cubic meter to $10^{-15}$ kilograms per cubic meter. “So you might be able to do something with that, but it’s still pretty hard.” With such tiny amounts of vacuum fluctuation, how does Dr. White convince himself that this might have some validity as a power source? “Simply put”, he answered, “the reason this equation has some interest to me is that it can derive the Bohr radius from first principles. So I can go through and show that $5.29 \times 10^{-11}$ meters is a consequence of dark energy. So it’s an interesting finding. It’s either a pretty significant numerical coincidence, which does happen from time to time in physics, or it has some potential interest as a physical mechanism.” Claim 2 – The energy density of the quantum vacuum can be amplified not only by acceleration but by changing acceleration and, in turn, its subsequent derivative. This is an extension of the approach of the dynamic Casimir force. “This is the wave equation [see Figure 1 Equation 2]; it comes from the Friedmann equation, and then, using the Unruh equation, you can get this wave equation. What this wave equation says is that when you convert this from acceleration into potential, a varying energy density will also have an impact on the local vacuum fluctuation energy density.” “Why do I have confidence that this might have some validity?” “We’ve got some test data with several different test articles that we have run under several different operating conditions, and the predicted thrust was reasonably close, within a factor of 2.” Claim 3 – “The altered state of the vacuum can be modeled quasi-classically as an electron-positron virtual plasma. From my plasma physics background, we just use the tools of magnetohydrodynamics (MHD) to predict the macroscopic behavior, depending on how we implement things.
And so this is a pictorial representation of that.” “Now, you can go look at cosmological data, you can also look at things down at the microscopic level, and see if your claims can be proven or disproven without actually having to go into the lab.” “This interests me in that we have shown the magnetic pressure from the electron rotating round the hydrogen nucleus exactly equals the thermal kinetic pressure, if we claim that the altered state, based on the equation that we just talked about, can be modeled as an electron-positron plasma.” “In a test article that we ran at 2 MHz and 4 MHz, the predicted force was very close to the observed force. We’ll be building a much larger test article; we’re trying to get to the 0.1 millinewton level of thrust, and we’ll be working on that over the next year.” How does all this apply to human spaceflight? “This quantum vacuum energy is centric to nuclear systems, whether it’s nuclear reactors or nuclear thermal rockets. With the specific force that we have with this type of system, since effectively you’re pushing off the vacuum, you don’t have to have large tanks; you get to push off the vacuum, and the vacuum carries the momentum information for you, so we can have much heavier specific power systems and still accomplish pretty significant missions, because the specific force is so much higher.” “With this type of a thruster, if we could couple a 2 MW reactor to the equivalent of 2 MW of thruster capability, we could do a Jovian mission, with capture, in 138 days, and 196 days for Saturn.” R. K. Obousy – Project Icarus: Anti-Matter Catalyzed Fusion Propulsion for Interstellar Missions 3104.pdf with K. F. Long and T. Smith The last speaker was R. K. Obousy of Project Icarus, a non-profit group dedicated to designing an interstellar mission to the nearest star system, Alpha Centauri. Dr.
Obousy’s talk was outlined in three sections: the physics of interstellar travel; Project Icarus, a fusion-based interstellar starship design study; and a new project on anti-matter catalyzed fusion. He began by articulating the main problem with interstellar travel: the distances involved. Voyager I, a spacecraft launched in 1977 designed to travel to the outer planets, is now traveling at about 38,000 mph at a distance of 116 AU from Earth. At that speed, if Voyager were traveling to the nearest star, Alpha Centauri, it would take on the order of 70,000 years to get there. “If you imagine Earth on the East coast of the US in NYC and Alpha Centauri on the West coast in San Francisco, then Voyager, launched in 1977, has traveled only a single mile on that journey.” [Voyager from NASA] “What we want to accomplish is interstellar flight not in 70,000 years, but something closer to the timescale of a human lifetime, about 70 years. So we need to increase our top speed by at least a factor of one thousand.” “The problem becomes apparent when we consider one of the simplest equations in rocket physics, the Tsiolkovsky rocket equation.” The Tsiolkovsky rocket equation gives the maximum change in rocket velocity as directly proportional to the exhaust velocity $v_e$ and the natural log of the ratio of initial total mass $m_0$ to final total mass $m_f$. $\Delta v = v_e \ln\left(\frac{m_0}{m_f}\right) \hspace{10 mm} \text{Tsiolkovsky Rocket Equation}$ “When you plug in the numbers for chemical propulsion fuel, a $\Delta v$ of ten percent of the speed of light, $3 \times 10^{7}$ meters per second (which is roughly what it would take to get to the nearest star on the timescale of a human lifetime), the specific impulse of chemical rocket fuel is on the order of about 450 seconds.
When you plug in the numbers, you discover that you need more chemical rocket fuel than there is mass in the known universe. Needless to say, it’s impossible to engage in interstellar missions on the timescale of a human lifetime using chemical propellants.” “However, there are other ways to liberate energy from matter. Once you go down into the sub-structure of the atom and liberate energy from the nucleus, then you can liberate much larger amounts of energy.” “Specific energy is the theoretical maximum amount of energy per unit mass that you can extract. For chemical energy, that’s on the order of 15 million Joules per kilogram. When you jump up to fission, you jump up by a factor of almost ten million, so pound for pound, you can liberate about a million times more energy than from chemical sources. About ten times more energy when you go to fusion, and about 100 times more energy than that when you go to matter-anti-matter reactions.” “So within the known laws of physics, there are ways that you can liberate far greater amounts of energy that you can then utilize for impulse purposes.” Project Icarus is one component of Icarus Interstellar [visit], which has a number of research avenues. Project Icarus was inspired by a famous interstellar study called Project Daedalus [visit], which ran between 1973 and 1978. Project Icarus has a four-fold purpose:

1. To motivate a new generation of scientists and inspire the next generation to get into this field.
2. To generate a lot of interest in the real-term prospects of an interstellar mission.
3. To design a credible probe for a mission that we could potentially do this century.
4. To provide an assessment of the maturity of fusion-based space propulsion.

With a volunteer, international team, they want to design an unmanned probe capable of delivering useful information about another star system and any associated planetary bodies.
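The claim that a chemical rocket would need more fuel than there is mass in the universe follows directly from the rocket equation quoted above. A sketch; the universe-mass estimate of roughly $10^{53}$ kg is a common textbook figure, not from the talk:

```python
import math

G0 = 9.80665                 # standard gravity, m/s^2
DELTA_V = 3.0e7              # 10% of light speed, m/s
V_E = 450 * G0               # exhaust velocity for Isp ~ 450 s chemical fuel

# Tsiolkovsky: m0/mf = exp(delta_v / v_e). Work in log10 because the
# ratio itself is far too large to hold in a float.
log10_mass_ratio = (DELTA_V / V_E) / math.log(10)

LOG10_UNIVERSE_MASS_KG = 53  # ~1e53 kg, a common order-of-magnitude estimate
print(f"required m0/mf ~ 10^{log10_mass_ratio:.0f}, "
      f"vs ~10^{LOG10_UNIVERSE_MASS_KG} kg in the observable universe")
```

The required mass ratio comes out near $10^{2952}$, dwarfing the mass of the observable universe by thousands of orders of magnitude.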
It must use current or near-future technology, must reach its stellar destination in as fast a time as possible – not exceeding a century – and must be designed for a variety of target stars. They want to allow for deceleration in the target system as well. “We’ve got twenty research modules really encompassing the whole amalgam of what we believe you’d need to conduct an interstellar mission”, says Dr. Obousy. “Astronomical target, mission analysis, primary and secondary propulsion, fuel, navigation…the list goes on. We’ve demarcated the project into all the salient research regions. We apply academic rigor and appear in a number of peer-reviewed publications.” For the primary propulsion, they are looking at fusion, to provide continuity with Project Daedalus. Within fusion, there are a number of different ways to accomplish propulsion: inertial confinement fusion, Polywell, magnetized target fusion, and aneutronic fusion such as p-B11, which is valuable because the fusion by-products are charged particles that can be channeled by magnetic nozzles. “So let’s say a little bit about anti-matter, first predicted by Paul Dirac in 1928. It’s a very mercurial form of matter. When it touches its matter component, it annihilates with perfect efficiency according to Einstein’s equation $E = mc^2$.”

“We believe that for all known particles of matter, there corresponds an anti-particle. So for an electron, there’s an anti-electron, or positron; for a proton, there’s an anti-proton. More fundamentally, it’s at the quark level: protons consist of up and down quarks, and there are anti-up and anti-down quarks.”

“It’s not just science fiction. The positron was found in 1932, the anti-proton was discovered in 1955, and really the main issues with anti-matter are creation and storage.”

“We create incredibly small amounts of anti-matter each year, mostly in the CERN particle accelerator in Europe, about 1-10 nano-grams per year, at an estimated cost of 100 billion dollars per milligram. So it’s not cheap.”
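Taking the production and cost figures above at face value, the scale of the problem for the micro-gram quantities discussed later in the talk is easy to quantify. This is my arithmetic, not Dr. Obousy's:

```python
# Figures quoted above (taking the optimistic end of the ranges)
production_g_per_year = 10e-9      # ~10 nanograms of anti-matter per year
cost_usd_per_mg = 100e9            # ~$100 billion per milligram

# One micro-gram of anti-protons, the quantity later suggested for
# anti-matter catalyzed fusion:
target_g = 1e-6
years_needed = target_g / production_g_per_year
cost_usd = cost_usd_per_mg * (target_g / 1e-3)   # convert grams to milligrams

print(f"~{years_needed:.0f} years of production, ~${cost_usd:,.0f}")
```

At today's rates, a single micro-gram would take on the order of a century to accumulate and cost around a hundred million dollars, which is why dedicated anti-matter factories come up in the discussion.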

“However I will say that the facilities where we create anti-matter are not specifically designed to create anti-matter; they’re particle accelerators of which a nice by-product is that you get anti-particles out. So I’d have to do an in-depth research study, but I would say you could probably push that number down by a significant factor if you constructed dedicated anti-matter factories.”

“There are a number of ways to store anti-matter. Penn State University has created a trap that can store 10 billion anti-protons for about a week. Certainly we haven’t mastered this technology, but we’re at a stage where our understanding of the technology is maturing and we’re beginning to create anti-particles, and we’re beginning to store anti-particles.”

“Since anti-matter liberates such a huge amount of energy when it collides with its matter component, would it not be pertinent to study the possibilities for propulsion?”

“One of the first models was the Sänger rocket. In the Sänger rocket you collide electrons and positrons. The by-product of this is 511 keV gamma photons. The problem is that the gamma rays radiate isotropically, and what you want to do is figure out some way to collimate that thrust. Sänger had this idea for an ultra-dense electron momentum transfer device, something along those lines.”

“The other possibility is to annihilate anti-protons. When protons and anti-protons collide, you get neutral pions, which are quite short-lived, they propagate for about a micrometer before decaying into gamma rays. You also get charged pions, again quite short-lived, they decay into muons and anti-muons, and they further decay into electrons and anti-electrons and electron neutrinos and muon neutrinos, and ultimately gamma rays.”

“But during that time when they exist for that short period as charged pions, you actually get 1.88 GeV of energy out, and about 64% of that is in the form of kinetic energy of the charged pions. If you’ve got these rapidly moving charged particles, you can utilize that for thrust via magnetic nozzles.”
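The 1.88 GeV figure is just the combined rest energy of the annihilating proton and anti-proton, which can be checked directly. Standard particle-data arithmetic, not from the talk:

```python
PROTON_REST_ENERGY_MEV = 938.272   # proton (and anti-proton) rest energy, MeV

# Total energy released per proton/anti-proton annihilation
total_mev = 2 * PROTON_REST_ENERGY_MEV        # ~1877 MeV, i.e. ~1.88 GeV

# Per the talk, ~64% emerges as kinetic energy of the charged pions,
# the component a magnetic nozzle can redirect into thrust.
charged_pion_ke_mev = 0.64 * total_mev

print(f"total ~{total_mev/1000:.2f} GeV, charged-pion KE ~{charged_pion_ke_mev:.0f} MeV")
```

The usable fraction, around 1.2 GeV per annihilation carried by charged particles, is what makes magnetic-nozzle schemes attractive.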

Anti-matter energy has a lot of advantages over conventional fusion.

“The entire mass of the National Ignition Facility (NIF), which uses lasers to ignite deuterium-tritium pellets, is on the order of one hundred kilotons. It wouldn’t be feasible to transport 100 kilotons of hardware into space just to accomplish a fusion reaction. What’s great about anti-matter is that it’s an immensely efficient energy delivery packet. So an anti-proton beam offers 90 Megajoules per micro-gram.”
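The 90 MJ figure matches the rest energy ($E = mc^2$) of the anti-protons themselves; annihilation with matter nominally releases twice that, though much of it escapes as neutrinos and gamma rays. A quick check:

```python
C = 2.99792458e8          # speed of light, m/s

mass_kg = 1e-9            # one micro-gram of anti-protons
rest_energy_mj = mass_kg * C**2 / 1e6

print(f"{rest_energy_mj:.0f} MJ per micro-gram")  # ~90 MJ
```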

“Now you wouldn’t exactly power a rocket directly from matter-anti-matter annihilation because for an interstellar mission, you’d need quite a vast quantity. But what you could do is use very small quantities, on the order of about a micro-gram of anti-protons to actually deliver energy to, for example, a deuterium tritium pellet which would then fuse, and then you’d be able to utilize that for propulsive purposes.”

Dr. Obousy put up a slide containing a list of non-conventional technologies that the Project will look at to power their spacecraft to the nearest star. Cold fusion or LENR was not among them.

At the end of the talk, Professor Kim asked Dr. Obousy why cold fusion wasn’t included in his list of breakthrough technologies that could contribute to the propulsion system.

Dr. Obousy’s reply was “We haven’t decided as of yet, but that’s not something we’re actively looking at. But by all means, we certainly don’t have a complete list of all the different ways of accomplishing fusion, but perhaps we can begin a dialogue.”

Well, after he finished, and the Session was over, I moved to go up to him and tell him the good news on where he could find a possible source of electron-positron pairs creating gamma rays. But Professor Kim got to him first. After a while, it didn’t look like they were going to stop talking, so I walked up and handed both of them a Cold Fusion Now sticker.

“Did you know that the E-Cat, the first commercial cold fusion energy generator on the market may make some 511 keV gammas? You might have a source there!”

Professor Kim added “And I know why he has 511 keV gammas!”

Dr. Obousy looked surprised, albeit happily, and somewhat bemused by his sticker.

Imagine. Hot-water boilers for a new Steam Age – on Pluto!

Cold Fusion Now!