There is more to the mysterious dust. Tree pollen makes up the bulk of it, but pollution blown over from China is also a component. The contribution from China includes dust from the Gobi desert kicked up by huge dust storms there, plus soot and mercury from China’s roughly 2,500 coal-fired power plants.
I’ve heard so many bogus theories about déjà vu involving various brain functions: one part of the brain getting information before another, or information arriving directly at one hemisphere and only reaching the other after a delay through the corpus callosum.
I’ve experienced déjà vu many times, so I decided to test these theories. I had the radio on; a live talk show was in progress. All of a sudden I realized I was in a déjà vu experience. I knew exactly what the announcer would say almost fifteen seconds in advance.
So, I spoke out loud what I knew the announcer would say, and close to fifteen seconds later the announcer repeated exactly what I had said. This lasted almost a minute.
None of these “brain part” explanations can account for my hearing what the announcer would say before he said it, and since it was a live show, it was not possible for me to have remembered a previous broadcast.
I don’t know what the explanation of déjà vu is, but I know what it’s not.
Your cell phone, your laptop, and other modern devices with rechargeable batteries are most likely powered by lithium ion battery packs. Lithium ion batteries provide a higher energy density than competing cell technologies widely available today.
I’ve used a cell phone with a lithium ion battery for a number of years, and until today I was never aware of a very serious and potentially dangerous limitation of this battery technology.
If you charge a lithium ion cell at temperatures below the freezing point of water, metallic lithium plates onto the cell’s anode. If the battery is then subjected to a sharp impact or a rapid charge rate, it can go into thermal runaway, resulting in an explosion or fire. Even if the battery does not go into thermal-runaway self-destruct mode, the plating will still ruin it.
Do not charge your lithium ion battery if the temperature is below freezing!
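To make the rule concrete, here is a hypothetical sketch of the temperature gate a charge controller should enforce. The names and the 45°C upper limit are illustrative assumptions, not values from any specific charger datasheet.

```python
# Hypothetical charge-controller temperature gate -- illustrative only,
# not taken from any real charger IC's datasheet.

FREEZING_C = 0.0          # lithium plating risk when charging below this
MAX_CHARGE_TEMP_C = 45.0  # assumed upper charge limit for Li-ion cells

def charging_allowed(cell_temp_c: float) -> bool:
    """Permit charging only when the cell temperature is in the safe window."""
    return FREEZING_C < cell_temp_c < MAX_CHARGE_TEMP_C

print(charging_allowed(-5.0))   # False: below freezing, plating risk
print(charging_allowed(20.0))   # True: room temperature is fine
```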
A mysterious dust coating much of the Puget Sound region has baffled many people. Some have said it had a “volcanic” quality, similar to when Mt. St. Helens erupted. That speculation had me a bit nervous, since I’ve had dreams of massive volcanic activity along the Cascades all of my life.
My eyes have been bothering me a lot the last few weeks, all gooped up, as have my nose and sinuses. That was another mystery.
Well, it turns out the mysterious white powder is tree pollen, which explains my eyes being gooped up. That’s something I usually get to deal with during tree pollen season, but that usually happens slightly later in the year. I hope this also means it will be over sooner and I won’t be miserable through mid-June.
It has long been known that very small bits of matter, less than a few tens of thousands of atoms, can exhibit properties quite different from the bulk material. Only recently have we had the ability to create structures from materials that are substantially smaller than optical wavelengths. We call these new materials nano-materials because they involve structures that are measured in nano-meters. Visible wavelengths run from about 700nm for deep red to about 400nm for violet; nano-materials have structures that may be considerably smaller than this.
Early applications of nano-materials mostly involved random assemblages of small structures. A good example is nano-particle based lithium ion batteries, which can withstand much higher charge and discharge rates because of the high active surface area that nano-particle construction makes possible. Another is new ultra-capacitors, capacitors with energy storage capacity rivaling batteries.
A newer class of material, which makes use of the ordered arrangement of nanometer-scale structures, is now making possible the production of materials with very exotic properties. These are referred to as meta-materials because they possess properties that no natural material could.
So far the most interesting properties of meta-materials are their optical properties. Because it is possible to create structures that are much smaller than the wavelength of visible light, materials can be made that appear homogeneous to the incident radiation, yet contain structures that act on that radiation like artificial atoms with very unusual properties.
Materials made up of a series of harmonically coupled resonators can exhibit a negative index of refraction; they cause light to bend the “wrong way”. With these materials it is possible to make a flat lens. What is more, it is possible to create a lens that exhibits super-resolution, the ability to resolve features smaller than the wavelength of the illuminating light.
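Snell’s law makes the “wrong way” bending easy to see: the sketch below just plugs a negative index into the usual formula. The specific indices and angle are arbitrary examples, not measured metamaterial values.

```python
import math

def refraction_angle_deg(n1: float, n2: float, incidence_deg: float) -> float:
    """Snell's law: n1*sin(t1) = n2*sin(t2); returns the refracted angle.
    A negative index flips the sign of the angle, i.e. the ray emerges on
    the same side of the normal it came in on -- the 'wrong way'."""
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    return math.degrees(math.asin(s))

# Ordinary glass (n = 1.5): ray bends toward the normal, opposite side.
print(round(refraction_angle_deg(1.0, 1.5, 30.0), 1))   # 19.5
# Negative-index material (n = -1.5): same magnitude, same side as incidence.
print(round(refraction_angle_deg(1.0, -1.5, 30.0), 1))  # -19.5
```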
Another feature of lenses created from these materials is that they have no axis like a conventional curved lens, and as a result they can have unlimited depth of field. This also has significant implications for photolithography. Presently, when you expose a silicon wafer, the image is sharp near the center but increasingly blurry toward the edges, so on a wafer of CPUs the dies nearest the center are the highest quality, with defects increasing toward the edge. With these new lenses it will be possible to expose the entire wafer in correct focus.
I believe we’re just seeing the tip of the iceberg and more exciting developments can be expected.
I am uncomfortable with the plans to attempt to create black holes in particle accelerators. It is believed that, if created, they will be safe because they will rapidly evaporate via Hawking radiation.
The idea behind Hawking radiation is basically this: at the event horizon, pairs of virtual particles pop into existence such that one goes inside the event horizon while the other goes out. Because the two cannot reunite within the time limit imposed by the Heisenberg uncertainty principle, the virtual particles become real; the one outside the horizon escapes, and the theory has it that the black hole decreases in mass by a corresponding amount.
Now, I do understand the argument that the black hole has to decrease in mass in order for conservation of mass-energy to hold, but I don’t understand how a virtual particle outside the event horizon becomes real, with mass, while the one inside subtracts from the mass of the black hole.
Given that this theory has never been tested and looks somewhat dicey at best, it seems less than wise to gamble the entire existence of our planet and everybody on it on this not happening.
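For what it’s worth, the standard theoretical lifetime formula, t = 5120·π·G²·M³/(ħc⁴), predicts an absurdly short life for an accelerator-scale hole, and that untested prediction is exactly what is being gambled on. A sketch in Python; the ~10 TeV mass is an illustrative assumption:

```python
import math

# Standard Hawking evaporation-time formula applied to a hypothetical
# TeV-scale black hole.  This is the theoretical prediction being
# questioned here, not a measured fact.
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C    = 2.998e8     # speed of light, m/s

def evaporation_time_s(mass_kg: float) -> float:
    """Predicted lifetime of a black hole evaporating by Hawking radiation."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

TEV_IN_KG = 1.783e-24              # 1 TeV/c^2 expressed in kilograms
t = evaporation_time_s(10 * TEV_IN_KG)  # an assumed ~10 TeV accelerator hole
print(f"{t:.1e} s")                # a fantastically small number, if the theory holds
```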
If Hawking radiation does not manifest, but black holes do, then instead of evaporating, a black hole will suck up any particle within its gravitational grasp. For the first few nanoseconds that might be few, but with every particle it swallows, its mass, and thus its gravitational pull, increases, and very shortly thereafter we’ll have a tiny black hole orbiting the Sun where the Earth and every living thing upon it used to be.
Sometimes I wonder if alien versions of this experiment aren’t the reason the SETI radio dial is so dark. Actually though, I know it’s not. The SETI protocol is designed in such a way that it is absolutely impossible to detect and confirm the presence of an alien radio signal unless that signal is intentionally beamed at the earth for several days continuously.
This is so because the SETI protocol requires that the signal be received repeatedly, but the distances involved are such that even with enormous power, a signal transmitted by a distant civilization will only be received here on earth if, by chance, a large directional antenna on their planet is aimed directly at a large directional antenna on our planet that is aimed directly back at them.
Because of the spin of both planets, the orbits of both planets around their parent stars, and the motion of the stars within the galaxy relative to each other, there is essentially zero chance of this happening accidentally. So the fact that we have no confirmed radio signals from extraterrestrial civilizations is no real surprise and doesn’t rule out the existence of those civilizations.
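The Friis free-space equation makes the point quantitatively. Everything below (transmitter power, antenna gains, distance) is an illustrative assumption, not a parameter of any real SETI search; the interesting part is how much the sender’s dish being aimed elsewhere costs.

```python
import math

def received_power_w(p_tx_w, gain_tx, gain_rx, wavelength_m, distance_m):
    """Friis free-space link: Pr = Pt * Gt * Gr * (lambda / (4*pi*d))^2."""
    return p_tx_w * gain_tx * gain_rx * (wavelength_m / (4 * math.pi * distance_m))**2

LIGHT_YEAR_M = 9.461e15
d = 100 * LIGHT_YEAR_M   # a hypothetical star 100 light-years away
lam = 0.21               # 21 cm hydrogen line, a traditional SETI band
dish_gain = 1e7          # ~70 dBi, assumed for a large directional dish

aimed = received_power_w(1e6, dish_gain, dish_gain, lam, d)  # both dishes aimed
isotropic = received_power_w(1e6, 1.0, dish_gain, lam, d)    # sender not aimed at us

print(f"{aimed:.1e} W")      # tiny even with both dishes perfectly aligned
print(f"{isotropic:.1e} W")  # seven orders of magnitude weaker still
```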
So maybe other civilizations have survived the black hole experiment, or maybe their civilizations wisely told their scientists, if you want to try this go try it on an uninhabited planet in the outer reaches of the solar system, not on our home planet.
See this article regarding a torsion balance being used to test the inverse square law: down to distances of 55 nanometers the inverse square law holds, and there is no indication of the existence of dark energy.
I am not surprised by these results. I’ve been of the opinion that Dark Matter, Dark Energy, Inflation, and the Cosmological Constant are all just fudge factors to try to make “the big bang theory” conform to observations. It’s like using a complicated lens to make a square peg look round so you can explain how it must have fit into the round hole.
Measurements of Voyager I and II showed an anomalous reduction in deceleration as they left the solar system. Because the Sun’s gravity acts upon them, they are expected to slowly decelerate as they leave the solar system, and they are, but not quite as quickly as expected.
The mysterious force that caused this was attributed to yet another cosmological fudge factor, dark energy.
I suspect there may be a much more mundane explanation. To understand how the velocity of these craft is measured, one has to be familiar with Doppler shift. When a train blows its whistle as it approaches you, passes, and heads away, you first hear the whistle at a higher pitch, then, as it moves past you and away, at a lower pitch. The same is true for electromagnetic radiation.
The velocity of these craft is measured by measuring the Doppler shift of their transmitters. The transmitters are designed to be extremely stable to allow Doppler shift to be measured precisely.
In order to measure the Doppler shift precisely, you have to measure the frequency of the received signal precisely. This means you have to measure how many cycles occur in a specific period of time. This in turn requires that you can very accurately measure out a segment of time. That in turn requires a highly accurate clock.
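As a rough sketch of the arithmetic involved, here is the non-relativistic Doppler relation in Python. The 8.4 GHz X-band carrier and ~17 km/s speed are illustrative assumptions typical of deep-space craft, not quoted mission specs.

```python
C = 2.998e8  # speed of light, m/s

def radial_velocity_ms(f_transmitted_hz: float, f_received_hz: float) -> float:
    """Non-relativistic Doppler: v = c * (f_tx - f_rx) / f_tx.
    Positive means the craft is receding from the receiver."""
    return C * (f_transmitted_hz - f_received_hz) / f_transmitted_hz

f_tx = 8.4e9                  # assumed X-band downlink carrier
v = 17_000.0                  # assumed radial speed, ~17 km/s
f_rx = f_tx * (1 - v / C)     # the shifted frequency seen on the ground

print(radial_velocity_ms(f_tx, f_rx))  # recovers ~17000 m/s
```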
The United States maintains a time standard and provides the reference to the world from the National Institute of Standards and Technology (NIST), formerly the National Bureau of Standards. There, highly accurate atomic clocks tick off the nanoseconds.
NASA, in order to measure frequencies accurately, as well as to time launches and other time-critical operations precisely, has to have accurate clocks, so it maintains its own time reference.
On the other side of the country, NewNet, an IRC network I founded in 1995, was trying to get a couple of servers to link up. IRC uses a protocol known as the Time Stamp protocol to determine, after a server split, who owns a particular nick or channel when there is a conflict.
In other words, let’s say you have two servers, A and B, on a network, and normally they communicate in real time. User “MyNick” connects to server A and is talking to people. Then a split occurs (server A temporarily loses communications with server B), and during that time another user logs into server B with the nick “MyNick”. Now the split is resolved, both servers resume communications, and there are two people on the network with the same nick. The servers use the Time Stamp protocol to determine which user had the nick first and force the other user to change his nick.
In order for the Time Stamp protocol to work, both servers must agree on what time it is. This requires that the servers be synchronized to some accurate external time source. If for some reason the clocks on two servers do not agree, they will not link.
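Here is a simplified sketch of timestamp-based nick collision resolution. The record fields and function are illustrative, not actual ircd code; the point is that the comparison is only meaningful if both servers’ clocks agree.

```python
from dataclasses import dataclass

# Illustrative model of IRC timestamp-based nick conflict resolution --
# not the actual data structures of any real ircd.
@dataclass
class NickRecord:
    nick: str
    server: str
    timestamp: int   # Unix time at which the nick was claimed

def resolve_collision(a: NickRecord, b: NickRecord) -> NickRecord:
    """After a split heals, the earlier timestamp keeps the nick; the later
    claimant is forced to change.  This only works if both servers' clocks
    agree -- a skewed clock silently hands nicks to the wrong user."""
    return a if a.timestamp <= b.timestamp else b

older = NickRecord("MyNick", "serverA", 1_000_000)
newer = NickRecord("MyNick", "serverB", 1_000_050)
print(resolve_collision(older, newer).server)  # serverA keeps the nick
```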
So here I am working with another site trying to link their server to mine and they will not connect. The admin of the other server says to me, “Your clock must be off because mine is sync’d to NASA”. I say to him, “Mine is sync’d to NIST so it must be yours that is off.”
We both test our servers and find that our servers’ clocks do indeed agree with the sources we synced them to. Then I checked NASA’s time server against NIST’s: there was a five-second difference.
Now, getting back to Voyager I and II: if the clock you are using to measure the received frequencies of their transmitters is inaccurate, then those frequency measurements will be inaccurate. And if the frequency measurements are inaccurate, then the velocities calculated from the measured Doppler shift will be inaccurate too.
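One caveat worth making explicit: a constant offset between two clocks, like the five seconds above, doesn’t by itself skew a frequency count, but a clock running fast or slow by some fraction does, and that fraction maps directly into apparent velocity. A sketch, with an assumed rate error:

```python
C = 2.998e8  # speed of light, m/s

def apparent_velocity_error_ms(fractional_rate_error: float) -> float:
    """If the reference clock runs fast or slow by some fraction, every
    frequency counted against it is off by that same fraction, and the
    Doppler velocity is off by that fraction of c."""
    return C * fractional_rate_error

# An assumed clock-rate error of 1 part in 10^9 fakes ~0.3 m/s of
# radial velocity -- illustrative numbers, not a claim about NASA's clocks.
print(apparent_velocity_error_ms(1e-9))
```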
Maybe it’s just me, but before I would introduce a whole new fudge factor into cosmology, I’d check the clocks. Apparently that hasn’t occurred to NASA, however, and I guess that explains why shuttles explode and less than half the Mars missions actually make it to Mars.
There is something I haven’t been able to get regarding Hawking radiation, and I think the scientific community, by trying to create black holes in a particle accelerator, might be making a serious mistake. Hawking radiation isn’t a proven phenomenon, it is only theoretical, and if it turns out not to exist, or not to work as expected, then we’re potentially in a world of hurt.
Here is what I don’t get about it. Hawking radiation works like this: a virtual particle pair forms right at the event horizon. One particle in the pair is sucked into the black hole; the other escapes and becomes real. This results in radiation from the black hole and, theory has it, decreases the mass of the black hole.
There seem to me to be some assumptions here that I’m not so comfortable with. The first is that if a virtual particle in a virtual particle/anti-particle pair can’t reunite with its mate, it becomes real. The second is that the other virtual particle, being sucked into the black hole, will somehow decrease the mass of the black hole.
I understand the thinking: for conservation of mass-energy to be upheld, since radiation is “leaving” the black hole, the hole must decrease in mass. But nothing actually “left” the black hole; what left was a virtual particle created at the event horizon.
So, scientists are counting on the tiny black holes they hope to create in a particle accelerator to evaporate via Hawking radiation. But what if one of these assumptions is wrong? What if the holes don’t shrink, or Hawking radiation doesn’t happen at all? Then they’re going to keep sucking up surrounding matter and growing in mass until the whole planet is sucked in and we’re all dead. This sounds like a pretty high-risk scientific experiment to me; the kind of experiment that, if we have to do it, we should be doing somewhere in deep space.
I don’t know if it’s true or not, but I’ve read that when scientists set off the first hydrogen bomb, they weren’t entirely sure it wouldn’t detonate all the hydrogen in the water vapor of the atmosphere. We’re all here, so apparently it didn’t. I’m all for the advancement of science, but I question the wisdom of risking the entire human race on these types of experiments.
I have been advocating building advanced nuclear reactors capable of burning actinides to eliminate long term radioactive waste rather than attempting to store it in a national repository for 20,000 years.
Now, a new problem emerges. It has been found that the radiation given off by radioactive waste breaks down the minerals containing it much faster than originally believed. This makes long-term storage essentially not viable. Glassification of waste won’t work for the long term either, because the alpha particles emitted by decaying radioactive elements break the glass down in just a few hundred years.
Follow this link for an article that details this process. This means that the radioactive waste will not be contained at the site for more than a few hundred years. If we store radioactive waste instead of destroying it, we are creating a huge mess for future generations.
Building reactors to fission actinides will not only eliminate long-term radioactive waste, leaving only fission products that will be safe in several hundred years rather than 20,000, it will also yield 20-30x as much energy as the original Uranium or Plutonium did when it was first used in a fission reactor. Given our energy situation, that is another good reason to build them.
If we don’t build them, then commercial reactors will continue to fission Uranium and generate Plutonium, which is the most problematic element in nuclear waste. Other actinides, elements heavier than Uranium, are also long-term waste problems. Plutonium-239 can be used as fuel in conventional thermal reactors (where neutrons are moderated to thermal speeds), but most of the other actinides can only be fissioned by fast neutrons, which is why special reactors are needed to destroy them.
Now, to be sure, there are safety issues associated with fast-flux reactors, and in general maintaining stability is more difficult than with a thermal reactor. This is because in a thermal reactor, neutrons have to be moderated (slowed) before they can be absorbed by another nucleus and cause a fission, so there is a built-in delay that limits how fast the reaction can ramp up. That built-in delay is not present in a fast-flux design.
There are other methods of providing stability in a fast-flux design. Fast-flux reactors generally have negative temperature coefficients: as the temperature rises, the reaction rate slows. This provides negative feedback on the reaction rate, stabilizing it.
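As a toy illustration of that negative feedback, here is a made-up two-variable model in Python. None of the coefficients correspond to a real reactor; it only shows how a reactivity that falls with temperature drives power and temperature to a stable equilibrium instead of running away.

```python
# Toy model: reactivity falls linearly as temperature rises (a negative
# temperature coefficient), so power self-stabilizes.  All coefficients
# are invented for illustration -- this is not real reactor kinetics.
def simulate(steps: int = 20_000, dt: float = 0.001):
    power, temp = 1.0, 300.0   # arbitrary power units, kelvin
    alpha = 5.0                # how strongly reactivity drives power growth
    coeff = -0.01              # negative temperature coefficient, 1/K
    t_ref = 300.0              # reference temperature
    heat, cool = 1.0, 1.0      # heating from power, cooling to the sink
    for _ in range(steps):
        reactivity = 0.2 + coeff * (temp - t_ref)       # falls as temp rises
        power += alpha * reactivity * power * dt        # power follows reactivity
        temp += (heat * power - cool * (temp - t_ref)) * dt
    return power, temp

power, temp = simulate()
# Settles near the point where reactivity crosses zero: temp ~ t_ref + 20 K.
print(round(temp, 1))
```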
Beyond that, we can mitigate the remaining risks to a large degree by building a fast-flux reactor farm in the Yucca Mountain facility intended for waste storage.
The bigger problems aren’t safety, they’re economic. It is simply cheaper to build conventional thermal reactors and keep running Uranium through them in a single pass, generating huge amounts of waste. Uranium is cheap, and since the waste isn’t being dealt with, nobody is presently bearing that expense in real terms.
If we are realistic about expense, we must build these reactors and burn these actinide wastes because there simply is no safe way to store them long term.