Author Archives: Nanook

Lithium Iron-Disulfide Cells Non-Rechargeable?

Those of you familiar with rechargeable lithium ion batteries may wonder how these new lithium AA batteries work, given that common rechargeable lithium cells have a nominal voltage of 3 volts or greater, depending upon chemistry.

Rechargeable lithium cells are usually lithium manganese. Lithium iron disulfide cells are nominal 1.5 volt cells, sold as a higher-capacity replacement for alkaline batteries. They are mostly primary cells (non-rechargeable); some rechargeable versions have been described, but I do not know whether any are in actual production.

There are experimental high-temperature versions of lithium iron disulfide cells being used in electric vehicles. These might be workable for commercial vehicles that operate continuously, but keeping a battery at the required temperature (around 350 degrees centigrade) while idle would be wasteful, and consumers aren’t going to wait for their car battery to warm up before they drive somewhere. There is at least one company with patents pending on an ambient-temperature rechargeable lithium iron disulfide cell.

I’ve seen numerous references on the web to secondary cells, but I have yet to find anyone actually selling a rechargeable lithium iron disulfide cell. The potential advantage of such a cell, in terms of power-to-weight ratio, would be substantial.

I’ve searched the web for exactly what makes this particular chemistry non-rechargeable, and what makes the rechargeable varieties rechargeable, and I’ve found nothing. So if you happen to know, please leave a comment. Any good references would be appreciated.

Moore’s Law and uProcessor Technology

For many years, Moore’s law, which holds that the most economical transistor density on an integrated circuit doubles every two years, held true. In fact, for much of that time it doubled every 18 months. Note that this refers to the density that is most economical per transistor, not the maximum physically possible.
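The doubling is just compound growth, so it is easy to compute directly. A minimal sketch, where the 1971 Intel 4004 starting point and the 24-month doubling period are illustrative assumptions:

```python
# Moore's law as arithmetic: doubling every N months means
# count(t) = count0 * 2**(t / N).
def transistor_count(count0, months, doubling_months=24):
    return count0 * 2 ** (months / doubling_months)

# Starting from the 2,300 transistors of the 1971 Intel 4004,
# a 24-month doubling period predicts hundreds of millions of
# transistors 35 years later.
print(round(transistor_count(2300, 35 * 12)))  # ~426 million
```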

Recently, Moore’s law has stalled. You may have noticed there hasn’t been a significant increase in CPU clock speeds for the last year or so. Manufacturers, unable to economically scale clock speeds, have gone to multi-core CPUs.

IBM has promised an 80-core CPU within five years. Parallel processing has its advantages and limitations. The human brain relies on massive parallelism to overcome the speed limitations of individual neurons, and when you consider the huge amount of information the human brain processes, it does quite well.

However, not all tasks lend themselves to parallel processing, and those that do require special programming. The human brain gets around this problem by being largely self-programming. That is, neural nets self-train: they essentially wire themselves and adjust the weightings of individual connections to produce the right results.
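As a toy illustration of weights adjusting themselves to produce the right results (not a model of real neurons), here is a single artificial "neuron" trained with the classic perceptron rule to compute logical AND; all the names are illustrative:

```python
# A toy sketch of self-adjusting connection weights: one artificial
# "neuron" learns the logical AND function by nudging its weights
# whenever its output is wrong (the classic perceptron rule).
def train_and_gate(epochs=20, lr=1):
    w = [0, 0]
    bias = 0
    samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
            err = target - out          # +1, 0, or -1
            w[0] += lr * err * x1       # strengthen or weaken each
            w[1] += lr * err * x2       # connection proportionally
            bias += lr * err
    return w, bias

w, b = train_and_gate()
outputs = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
           for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]]
print(outputs)  # → [0, 0, 0, 1]
```

No one told the neuron what the weights should be; the training loop found them from examples, which is the sense in which such networks "wire themselves."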

So far we haven’t really got this capability in silicon, except for some crude neural network chips that emulate neurons poorly, and only a very small number of them at that.

For the foreseeable future, parallel computers will need to be programmed specifically to use their parallel processors. Some tasks, like encryption key breaking, weather prediction, graphics rendering, shaped-charge explosives design, and protein folding, do lend themselves well to parallel computing. But many other tasks require an intermediate answer before the next step can proceed, and these do not parallelize well; they require a single instruction execution unit with a faster clock speed to gain performance.
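This limit is captured by Amdahl’s law: the serial fraction of a task caps the speedup no matter how many cores you add. A quick sketch:

```python
# Amdahl's law: if a fraction p of a task can run in parallel,
# the best speedup on n cores is 1 / ((1 - p) + p / n).
# The serial fraction (1 - p) caps the gain no matter how many
# cores you throw at the problem.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even on an 80-core CPU, a half-serial task barely doubles in speed:
print(round(amdahl_speedup(0.5, 80), 2))   # → 1.98
print(round(amdahl_speedup(0.95, 80), 2))  # → 16.16
```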

Heat has become the limiting factor in increasing CPU clock speeds. For years, making components smaller allowed faster speeds: power was dissipated primarily during switching, and smaller transistors had less capacitance to charge and discharge, so each switching event cost less energy and clock rates could rise without producing more heat.

As transistor sizes were reduced, a point was reached where the gates became so small that the leakage current through the gate became the predominant source of heat, and shrinking components further no longer allowed faster operating speeds.

About 80% of the dissipated power comes from this gate leakage current in CPUs made with the current 65 nanometer line width technology. The gate insulator on these transistors is just five atoms thick.

Intel will be coming out with new CPUs later this year based upon 45 nanometer line width technology. To address the gate leakage problem, Intel will substitute a hafnium-based compound for the current silicon dioxide gate insulator. In addition, they are replacing the polysilicon gate with a metal (they aren’t saying which). Combined, these two changes will greatly reduce gate leakage, and thus power demands, and will allow faster speeds as a result of the reduced heat production.

Interestingly, IBM is also coming out with a hafnium-based gate insulating material for its transition to 45nm processors. It seems difficult to believe that Intel and IBM came up with the same solution independently, yet they announced the developments independently, as competing work, on the same day: January 26, 2007. IBM will also be replacing the polysilicon gate with another metal compound.

Both Intel and IBM have working 45nm processor prototypes running Vista, MacOS and others to demonstrate the functionality of this technology.

AMD is taking a different approach, although it may ultimately incorporate hafnium-based gate insulators in later 45nm production. IBM and AMD are required to share some information as part of an earlier legal settlement.

AMD is experimenting with making the transistor gates out of nickel rather than silicon. This allows a thicker gate insulator to be used while still obtaining the same control over the current flow through the transistor.

AMD is also experimenting with “strained silicon.” A certain number of germanium atoms are embedded within the silicon lattice to stretch it, forcing the atoms farther apart. This allows electrons to move more freely, resulting in greater conductivity, higher drive currents, and thus faster speeds, up to 20-25% faster. This work is being done in conjunction with IBM.

AMD is also experimenting with adding a silicon-on-insulator layer that reduces leakage by eliminating stray charge carriers from the chip.

Lastly, AMD is looking into various multi-gate FETs that may have some advantages over conventional single-gate IGFETs.

It appears that, although Moore’s law took a temporary break, it is back on track for at least another decade or so. Ultimately these new technologies are expected to enable a move to 32nm and later 22nm line width designs. Maybe then we can have accurate weather forecasts.

Abiotic Oil

I hate to harp on this subject because I really feel that for environmental reasons, we do need to shift from hydrocarbon fuels to clean and sustainable sources of energy.

However, the oil companies are doing everything in their power to suppress that transition and to keep the price of oil artificially high through various means, such as wars in oil-producing countries and pushing the “peak oil” hype.

Since oil was first discovered, there has been debate over whether it is of biological origin. Most geologists in the United States seem to be of the opinion that it is.

Anyone with any science knowledge knows that biological entities do not create or destroy elements. You also know that hydrocarbons are composed of hydrogen and carbon. All of the hydrocarbons that exist came from hydrogen and carbon that are primordial, whether or not they were cycled through biotic systems.

We also know from lab experiments that, given carbon, water, and a few other ingredients present in the earth’s mantle, and subjected to the temperatures and pressures present there, we get a mixture of hydrocarbons that agrees with what we pump out of the ground.

We know the raw ingredients are there. We know that, subject to the conditions present in the mantle, they combine into oil. So I’m hard-pressed to understand the opposition to the abiotic oil theory, other than vested economic interests wanting to keep supplies limited and, by extension, prices high.

Another asinine argument I have heard against the abiotic oil theory is that although we see hydrocarbons on other planets and in space, this doesn’t seem to be the case for the inner rocky planets. Understandably this is the case with Mercury; it is so hot and so small that it would have had problems hanging onto its volatiles.

On Venus, while we may not be able to explore and drill for hydrocarbons, we do know that it has a CO2 atmosphere roughly 90 times as dense as Earth’s. Given that Earth’s atmosphere presently holds around 380 parts per million of CO2, we would need to burn something like 30,000 times as much hydrocarbon as we have so far to get as much CO2 into our atmosphere as Venus has; in other words, there is plenty of reason to believe there are plenty of hydrocarbons left in the Earth. In short, hydrocarbons aren’t the problem, the atmosphere is.

No doubt a portion of that carbon and hydrogen has been recycled through life forms, and that’s why we find oil when we drill into old sedimentary deposits, but that is secondary.

The Russians have proved it: by drilling through granite and basaltic capstone, below any sedimentary deposits, they have struck large quantities of oil and become the world’s second largest oil producer, and for a short while before the Yukos seizure, the world’s largest.

Now similar deposits have been tapped by the Russians in Vietnam (the White Tiger field), by the Chinese in North Korea, and by a small company in Utah here in the United States.

Now I’ve got really mixed feelings about all of this. On the one hand, we’re still tapping hydrocarbon sources, polluting the atmosphere, and altering our climate. On the other hand, as long as oil remains so highly profitable, the oil companies are going to exert huge political influence and prevent alternative technologies from being implemented. Yet I fear that if the price of oil fell to realistic levels, people would also be reluctant to develop alternatives. It seems like a catch-22.

Zero Point Energy

I had an idea for producing thrust from zero point energy. Hopefully you are familiar with the Casimir effect. You place two parallel plates very close to each other:

You get a net attractive force between the plates because wavelengths of virtual energy longer than the plate spacing are absent between them but present outside, and thus push the plates together. The shorter the distance between the plates, the more wavelengths are excluded and the stronger the force pushing them together.
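For the idealized parallel-plate case there is a known closed form for the attractive pressure, P = π²ħc / (240 d⁴); a quick calculation shows how steeply it grows as the plates get closer:

```python
import math

# Idealized Casimir pressure between two perfectly conducting
# parallel plates with separation d (meters):
#   P = pi^2 * hbar * c / (240 * d^4)
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_pressure(d):
    """Attractive pressure in pascals for plate separation d in meters."""
    return math.pi ** 2 * HBAR * C / (240 * d ** 4)

# The inverse-fourth-power dependence means halving the gap
# multiplies the pressure by 16.
print(f"{casimir_pressure(10e-9):.2e} Pa")  # 10 nm gap: roughly atmospheric
print(f"{casimir_pressure(1e-6):.2e} Pa")   # 1 micron gap: almost nothing
```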

Let’s suppose instead you don’t make the plates parallel:

This would put more force on the plates where they are close together than where they are farther apart. Given that the force is applied at an angle relative to the plates, shouldn’t that make them want to move in a net direction away from the point where they are closest together?

I’m wondering whether, if you etched a bunch of these very tiny structures (since most of the energy is contained in the shortest wavelengths) into a chip, you could get a measurable thrust.

More on Mysterious Dust

There is more to the mysterious dust. Tree pollen makes up the bulk of it, but pollution blown over from China is also a component. The contribution from China includes dust from the Gobi desert kicked up by huge dust storms there, plus soot and mercury from China’s 2,500 coal-fired power plants.

Deja Vu

I’ve heard so many bogus theories of Deja Vu that involve various brain functions: one part of the brain getting information before another, or information coming directly to one hemisphere of the brain and then arriving, delayed via the corpus callosum, from the other hemisphere.

I’ve experienced Deja Vu many times, so I decided to test these theories. I had the radio on with a live talk show in progress. All of a sudden I realized I was in a Deja Vu experience: I knew exactly what the announcer would say almost fifteen seconds in advance.

So I spoke out loud what I knew the announcer would say, and close to fifteen seconds later heard the announcer repeat exactly what I had said. This went on for almost a minute.

None of these “brain part” explanations can account for my knowing what the announcer would say in advance of his saying it, and since it was a live show, it was not possible for me to have remembered a previous broadcast.

I don’t know what the explanation of Deja Vu is, but I know what it’s not.

Lithium Battery Safety

Your cell phone, your laptop, and other modern devices with rechargeable batteries are most likely powered by lithium ion battery packs. Lithium ion batteries provide a higher energy density than competing cell technologies widely available today.

I’ve used a cell phone with a lithium ion battery for a number of years and until today I was never aware of the very serious and potentially dangerous limitation of this battery technology.

If you charge a lithium ion cell at temperatures below the freezing point of water, metallic lithium plates onto the cell’s anode. If the battery is then subjected to a sharp impact or a rapid charge rate, it can go into thermal runaway, resulting in an explosion or fire. Even if the battery does not self-destruct, the plating will still ruin it.

Do not charge your lithium ion battery if the temperature is below freezing!

Mysterious Dust

A mysterious dust coating much of the Puget Sound region has baffled many people. Some have said it had a “volcanic” quality, similar to when Mt. St. Helens erupted. That speculation had me a bit nervous, since I’ve had dreams of massive volcanic activity along the Cascades all of my life.

My eyes have been bothering me a lot the last few weeks, all gooped up, as have my nose and sinuses; that was another mystery.

Well, it turns out the mysterious white powder is tree pollen, which explains my eyes being gooped up. That’s something I usually deal with during tree pollen season, but normally slightly later in the year. I hope this also means the season ends sooner and I’m not miserable through mid-June.

Meta Materials

It has long been known that very small bits of matter, less than a few tens of thousands of atoms, can exhibit properties quite different from the bulk material. Only recently have we had the ability to create structures from materials that are substantially smaller than optical wavelengths. We call these new materials nano-materials because they involve structures measured in nanometers. Optical wavelengths run from about 800nm for deep red to 450nm for violet; nano-materials have structures that may be considerably smaller than this.

Early applications of nano-materials mostly involved random assemblages of small structures. A good example is nano-particle-based lithium ion batteries, which can withstand much higher charge and discharge rates because of the high active surface area nano-particle construction makes possible; another is new ultra-capacitors, capacitors with energy storage capacity rivaling batteries.

A newer class of material, which makes use of the ordered arrangement of nanometer-scale structures, is now making possible materials with very exotic properties. These are referred to as meta-materials because they possess properties that no natural material could.

So far the most interesting properties of meta-materials are optical. Because it is possible to create structures much smaller than the wavelength of visible light, materials can be made that appear homogeneous to the incident radiation yet contain structures that act on it like artificial atoms with very unusual properties.

Materials made up of a series of harmonically coupled resonators can exhibit a negative index of refraction: they cause light to bend the “wrong way.” With these materials it is possible to make a flat lens. What is more, it is possible to create a lens that exhibits super-resolution, the ability to resolve features smaller than the wavelength of the illuminating light.
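The “wrong way” bending follows from applying Snell’s law with a negative index. A small sketch, where the angles and index values are just illustrative:

```python
import math

# Snell's law, n1*sin(t1) = n2*sin(t2), still applies to a
# negative-index material, but a negative n2 puts the refracted
# ray on the *same* side of the normal -- the "wrong way".
def refraction_angle_deg(n1, n2, incident_deg):
    s = n1 * math.sin(math.radians(incident_deg)) / n2
    return math.degrees(math.asin(s))

# Light entering from air (n = 1) at 30 degrees:
print(round(refraction_angle_deg(1.0, 1.5, 30), 1))   # → 19.5 (ordinary glass)
print(round(refraction_angle_deg(1.0, -1.5, 30), 1))  # → -19.5 (negative index)
```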

Another feature of lenses created from these materials is that they do not have an axis like a conventional curved lens, and as a result they can have an unlimited depth of field. This has significant implications for photolithography: presently, when you expose a silicon wafer, the image is sharp near the center but becomes increasingly blurry toward the edges, so on a wafer of CPUs, those nearest the center are the highest quality, with defects increasing and quality decreasing toward the edge. With these new lenses it will be possible to expose all of the wafer in correct focus.

I believe we’re just seeing the tip of the iceberg and more exciting developments can be expected.


Black Holes in Particle Accelerators

I am uncomfortable with the plans to attempt to create black holes in particle accelerators. It is believed that, if created, they will be safe because they will rapidly evaporate via Hawking radiation.

The idea behind Hawking radiation is basically this: at the event horizon, pairs of virtual particles pop into existence such that one falls inside the horizon while the other escapes. Because the two cannot reunite within the time limit imposed by the Heisenberg uncertainty principle, the virtual particles become real; the one outside the horizon escapes, and the theory holds that the black hole decreases in mass by a corresponding amount.
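For scale, the standard (and, as noted below, untested) evaporation-time formula, t = 5120πG²M³/(ħc⁴), can be evaluated directly; the masses used here are just illustrative:

```python
import math

# Standard (untested) Hawking evaporation time for a black hole
# of mass M, ignoring particle-species corrections:
#   t = 5120 * pi * G^2 * M^3 / (hbar * c^4)
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def evaporation_time(mass_kg):
    """Evaporation time in seconds, assuming Hawking radiation is real."""
    return 5120 * math.pi * G ** 2 * mass_kg ** 3 / (HBAR * C ** 4)

# If the theory holds, even a 1 kg black hole is gone in under a
# femtosecond; an accelerator-scale one (~TeV, about 2e-24 kg)
# vanishes unimaginably faster.
print(f"{evaporation_time(1.0):.1e} s")
print(f"{evaporation_time(2e-24):.1e} s")
```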

Now, I do understand the argument that the black hole has to decrease in mass for conservation of mass-energy to hold, but I don’t understand how the virtual particle outside the event horizon becomes real, with mass, while the one inside subtracts from the mass of the black hole.

Given that this theory has never been tested, and seems somewhat dicey at best, it seems less than wise to gamble the entire existence of our planet, and everybody on it, on black hole evaporation working as predicted.

If Hawking radiation does not manifest but black holes do, then instead of evaporating, a black hole will suck up any particle within its gravitational grasp. For the first few nanoseconds that might be few, but with every particle it absorbs, its mass, and thus its gravitational pull, will increase, and very shortly thereafter we’ll have a tiny black hole orbiting the Sun where the Earth, and every living thing upon it, was a few seconds earlier.

Sometimes I wonder if alien versions of this experiment aren’t the reason the SETI radio dial is so dark. Actually though, I know it’s not. The SETI protocol is designed in such a way that it is absolutely impossible to detect and confirm an alien radio signal unless that signal is intentionally beamed at the Earth continuously for several days.

This is so because the SETI protocol requires the signal to be received repeatedly, and the distances involved are such that, even with enormous power, a signal transmitted by a distant civilization will only be received here on Earth if, by chance, a large directional antenna on their planet is aimed directly at a large directional antenna on our planet that is aimed directly back at them.

Because of the spin of both planets, the orbits of both planets around their parent stars, and the relative motion of the stars within the galaxy, there is essentially zero chance of this happening accidentally. So the fact that we have no confirmed radio signals from extraterrestrial civilizations is no real surprise and doesn’t rule out their existence.

So maybe other civilizations have survived the black hole experiment, or maybe their civilizations wisely told their scientists: if you want to try this, go try it on an uninhabited planet in the outer reaches of your solar system, not on our home planet.