Unified Field Theory and Gravitational Anomaly

By , September 23, 2009 9:58 PM

Unified Field Theory

In the middle of the 1800s, the first successful (classical) field theory was developed by James Clerk Maxwell.
In 1820, Hans Christian Ørsted discovered that electric currents exert forces on magnets, and in 1831 Michael Faraday observed that time-varying magnetic fields can induce electric currents.

Until then, electricity and magnetism had been thought of as unrelated phenomena.

First field theory – 1800s – James Clerk Maxwell’s theory of electromagnetism.
Second field theory – early 20th century – Albert Einstein’s general theory of relativity, dealing with gravitation.

The term unified field theory was coined by Einstein, who was attempting to prove that
electromagnetism and gravity were different manifestations of a single fundamental field: hence the Unified Field Theory.

http://www.youtube.com/user/noonscience

When quantum theory entered the picture, the puzzle became more complex.

The theory of relativity explains the nature and behavior of all phenomena on the macroscopic level (things that are visible to the naked eye), while quantum theory explains the nature and behavior of all phenomena on the microscopic (atomic and subatomic) level.

Perplexingly, however, the two theories are incompatible. Unconvinced that nature would prescribe totally different modes of behavior for phenomena that were simply scaled differently, Einstein sought a theory that would reconcile the two apparently irreconcilable theories that form the basis of modern physics.

Although electromagnetism and the strong and weak nuclear forces have long been explained by a single theory known as the standard model, gravitation does not fit into the equation.

Gauge

Gravitational interaction: a long-range attractive interaction that acts on all particles with mass. The postulated exchange particle has been named the graviton.
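
Newton’s classical inverse-square law is the pre-relativistic description of this interaction, and a quick worked example helps anchor the definition. This is a minimal Python sketch; the Earth and Moon figures used are standard reference values.

# Newton’s law of universal gravitation: F = G * m1 * m2 / r^2
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravitational_force(m1_kg, m2_kg, r_m):
    """Attractive force in newtons between two point masses r_m metres apart."""
    return G * m1_kg * m2_kg / r_m ** 2

# Example: Earth (5.972e24 kg) and Moon (7.348e22 kg), about 3.844e8 m apart.
print(gravitational_force(5.972e24, 7.348e22, 3.844e8))  # roughly 1.98e20 N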

Gravitational Anomaly


Nokia Sync for Google Calendar and Google Contacts

By , September 22, 2009 3:01 AM

So you want to automatically sync your Nokia Symbian S60 with Google Calendar and Contacts.

On the Nokia Phone

  1. Open the web browser and go to:
  2. In the browser, go to Downloads and select Save to “Install Folder”.
  3. Then go to Application manager and install it.
  4. It will say “You have no Mail for Exchange profile. Create profile?” Answer Yes.

On the Nokia Phone

Enter Email: xxxxx@gmail.com
Domain: google
Username: xxxxx@gmail.com
Password: your password
Access Point: the access point you want it to use automatically (note: this may incur data charges)

The phone will show “Retrieving Exchange Server Name…”.

If it warns “Website has sent a certificate with a different website name… accept permanently?”, answer Yes.

Now set “Exchange Server” to m.google.com.
It may be a good idea to turn off sync while roaming.
Leave everything else unchanged.

In the Sync Options, set only Calendar and Contacts: say No to Tasks and No to Email.

Go to Google Calendar.
Go to your Dashboard > Service Settings > Mobile > enable Google Sync.
If you can’t find that setting, don’t be surprised; you may no longer need this step.

On the Nokia phone, run Sync.

And good luck!

Space navigation

By , September 15, 2009 11:06 PM

Spacecraft travel very long distances. They have inertia, which means they will keep going along their current path unless something changes it. If that path is off by even a tiny bit, they will keep following it, drifting further and further off course, until they are far from where they are meant to be.

Course Correction

Location

In order to know where a ship is, NASA needs to know two things: how far it is from Earth and its direction in space. Generally, NASA uses the downlink, the radio signal from a spacecraft to a radio telescope in the Deep Space Network (DSN), to tell where it is. The distance between Earth and the ship is measured by sending up a radio signal from Earth with a time code on it. The spacecraft “bounces” the signal back, and people on the ground can see how long it took to travel from Earth to the ship and back. Since all radio waves travel at the speed of light, scientists can use the round-trip time to figure out the distance the signal traveled. The angle the radio telescope is pointing when it receives the signal gives the direction of the ship.

A more precise method uses two radio telescopes. When a ship is in space, it sends a signal back to Earth. Three times a day, this signal can be received by two different DSN radio telescopes at once, which can compare how far the ship is from each of them. They also take the distance to a known object in space that doesn’t change its apparent location, like a pulsar (a pulsing star), and from the three reference points (two telescopes and a pulsar) they can use a technique called triangulation to get the ship’s location.

Three Point Camera (TPC): XYZ-axis cameras with star-mapping charts

Some spacecraft, like DS1, can use asteroids and other objects in space to figure out where they are. Using a process called Optical Navigation (OpNav), pictures are taken of particular asteroids. The asteroids’ locations relative to the spacecraft are used to determine position, and that position is compared to where the ship should be; at that point the ship can perform a course correction. OpNav needs at least three objects to compare and uses triangulation to figure out a ship’s location.
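
The distance calculation above is simple enough to sketch. Here is a minimal Python example of the round-trip ranging arithmetic; the round-trip time is made up for illustration, and this is not DSN software, just the underlying formula.

# Distance from round-trip light time: the signal travels out and back,
# so the one-way distance is half the round-trip time multiplied by c.
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(round_trip_s):
    """One-way distance in metres from a measured round-trip time in seconds."""
    return C * round_trip_s / 2.0

# Example: a 2,400 s round trip puts the spacecraft about 3.6e11 m away,
# roughly 2.4 times the Earth-Sun distance.
print(range_from_round_trip(2400.0))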

Propulsion Systems

By , September 14, 2009 3:23 AM


Rad Hardening

By , September 14, 2009 3:23 AM

When your computer behaves erratically, mauls your data, or just “crashes” completely, it can be frustrating. But for an astronaut trusting a computer to run navigation and life-support systems, computer glitches could be fatal.

Unfortunately, the radiation that pervades space can trigger such glitches. When high-speed particles, such as cosmic rays, collide with the microscopic circuitry of computer chips, they can cause chips to make errors. If those errors send the spacecraft flying off in the wrong direction or disrupt the life-support system, it could be bad news.

To ensure safety, most space missions use radiation-hardened computer chips. “Rad-hard” chips are unlike ordinary chips in many ways. For example, they contain extra transistors that take more energy to switch on and off. Cosmic rays can’t trigger them so easily. Rad-hard chips continue to do accurate calculations when ordinary chips might “glitch.”
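
Rad-hard chips achieve this at the transistor level, but the same redundancy idea can be illustrated in software. The Python sketch below shows triple modular redundancy (TMR), a related fault-tolerance technique in which a result is computed three times and a bitwise majority vote outvotes any single upset; it illustrates the principle, not how rad-hard silicon is actually built.

def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority of three redundant integer results."""
    return (a & b) | (a & c) | (b & c)

# Example: the second copy suffers a single-bit upset (bit 3 flipped),
# but the vote still recovers the correct value.
good = 0b1010
upset = good ^ 0b1000  # simulate a cosmic-ray bit flip
print(bin(majority_vote(good, upset, good)))  # prints 0b1010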

NASA relies almost exclusively on these extra-durable chips to make computers space-worthy. But these custom-made chips have some downsides: They’re expensive, power hungry, and slow — as much as 10 times slower than an equivalent CPU in a modern consumer desktop PC.

With NASA sending people back to the Moon and on to Mars (see the Vision for Space Exploration), mission planners would love to give their spacecraft more computing horsepower.

Having more computing power onboard would help spacecraft conserve one of their most limited resources: bandwidth. The bandwidth available for beaming data back to Earth is often a bottleneck, with transmission speeds even slower than old dial-up modems. If the reams of raw data gathered by the spacecraft’s sensors could be “crunched” onboard, scientists could beam back just the results, which would take much less bandwidth.
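
As a toy illustration of that trade-off, the Python sketch below reduces a made-up stream of sensor readings to a handful of statistics and compares the downlink sizes; the data and numbers are invented purely for illustration.

import struct

# Fake raw sensor readings; a real instrument would produce far more.
raw_samples = [20.0 + 0.001 * i for i in range(100_000)]
raw_bytes = len(raw_samples) * struct.calcsize("f")  # ~400,000 bytes as 32-bit floats

# "Crunch" onboard: keep only a small summary to transmit.
summary = {
    "n": len(raw_samples),
    "min": min(raw_samples),
    "max": max(raw_samples),
    "mean": sum(raw_samples) / len(raw_samples),
}
summary_bytes = struct.calcsize("Ifff")  # one int and three floats, 16 bytes

print(f"raw downlink: {raw_bytes} bytes; summary downlink: {summary_bytes} bytes")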

Objects, particularly spacecraft structures, antennas, solar arrays, and other spacecraft equipment, are shielded against damage from momentary exposure to high-energy electromagnetic radiation, whether high-energy optical (laser) radiation or nuclear radiation, by a radiation barrier or shield constructed of fibrous silica refractory composite material like that used for the heat-shield tiles on the Space Shuttle.

Major radiation damage sources

Typical sources of exposure of electronics to ionizing radiation are solar wind and the Van Allen radiation belts for satellites, nuclear reactors in power plants for sensors and control circuits, residual radiation from isotopes in chip packaging materials, cosmic radiation for both high-altitude airplanes and satellites, and nuclear explosions for potentially all military and civilian electronics.

  • Cosmic rays come from all directions and consist of approximately 85% protons, 14% alpha particles, and 1% heavy ions, together with ultraviolet radiation and X-rays. Most effects are caused by particles with energies between 10^8 and 2×10^10 eV, though some particles have energies up to 10^20 eV. The atmosphere filters most of these, so they are primarily a concern for high-altitude applications like stratospheric jets and satellites.
  • Solar particle events come from the direction of the Sun and consist of a large flux of high-energy (several GeV) protons and heavy ions, again accompanied by UV and X-ray radiation. They cause a range of problems for satellites, from radiation damage to loss of altitude: they heat the upper regions of the atmosphere, causing it to expand and decelerate low-orbit satellites through drag.
  • Van Allen radiation belts contain electrons (up to about 10 MeV) and protons (up to hundreds of MeV) trapped in the geomagnetic field. The particle flux in the regions farther from the Earth can vary wildly depending on the actual conditions of the Sun and the magnetosphere. Due to their position, they pose a concern for satellites.
  • Secondary particles result from interaction of other kinds of radiation with structures around the electronic devices.
  • Chip packaging materials were an insidious source of radiation that was found to be causing soft errors in new DRAM chips in the 1970s. Traces of radioactive elements in the packaging of the chips were producing alpha particles, which were then occasionally discharging some of the capacitors used to store the DRAM data bits. These effects have been reduced today by using purer packaging materials and by employing error-correcting codes to detect and often correct DRAM errors (a minimal sketch of the idea follows this list).
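
Here is a minimal Python sketch of that error-correction idea, using the classic Hamming(7,4) code. Real DRAM ECC uses wider codes over whole memory words, but the principle of locating and flipping back a single corrupted bit is the same.

def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit codeword (positions 1-7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # parity over positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(code):
    """Locate and flip back a single corrupted bit; returns the fixed codeword."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # checks positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit, 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

word = hamming74_encode([1, 0, 1, 1])
word[4] ^= 1  # simulate an alpha-particle flip of one stored bit
print(hamming74_correct(word) == hamming74_encode([1, 0, 1, 1]))  # True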

Thomas Challenger