Follow eufisica

Showing posts with label 2012. Show all posts

Monday, December 31, 2012

"The Face of Creation" - Higgs remix

Time to say goodbye to 2012 with the biggest scientific news of the year.
I wish you a happy 2013.

Sunday, December 30, 2012

The Universe - Brian Cox Lecture


In this lecture, Brian Cox explains how the Universe was created.

Monday, December 24, 2012

Merry Xmas

Now it is time for Christmas... with Physics:
Credit: blueglass.com
Don't forget to follow me on Facebook or Google+ @ +eufisica
Best of the best to all of you


Monday, December 17, 2012

Pour Toutatis

NASA Radar Images Asteroid Toutatis

This 64-frame movie of asteroid Toutatis was generated from data collected by NASA's Goldstone Solar System Radar on Dec. 12 and 13, 2012. In the movie clips, the asteroid's rotation appears faster than it occurs in nature.
Credit and download @NASA

Building the next collider - by Nature Video

The Large Hadron Collider (LHC) is great for discovering particles, but it isn't very precise.
So scientists are thinking about precision, and that can be achieved with a linear collider.
The International Linear Collider (ILC) could be the next collider. The collisions will be between electrons and positrons. The only problem is the global financial crisis.

Friday, December 7, 2012

Suit to walk on Mars

Scientists are working on a suit for walking on Mars.
There is a lot of information about a manned mission to Mars on Wikipedia.
Here is the video from NBCNews about the suit:



Why Are So Many Sun Grazing Comets Being Discovered? | NASA Space Science



A comet is an icy small Solar System body (SSSB) that, when close enough to the Sun, displays a visible coma (a thin, fuzzy, temporary atmosphere) and sometimes also a tail. These phenomena are both due to the effects of solar radiation and the solar wind upon the nucleus of the comet. Comet nuclei range from a few hundred meters to tens of kilometers across and are composed of loose collections of ice, dust, and small rocky particles. Comets have been observed since ancient times and have traditionally been considered bad omens.

Source: Wikipedia

Thursday, December 6, 2012

GRAIL's Gravity Tour of the Moon

Variations in the lunar gravity field
This image shows the variations in the lunar gravity field as measured by NASA's Gravity Recovery and Interior Laboratory (GRAIL) during the primary mapping mission from March to May 2012.
Credit: 
NASA/JPL-Caltech/MIT/GSFC

The gravitational field of the Moon has been determined by tracking radio signals emitted by orbiting spacecraft. The principle used depends on the Doppler effect, whereby the line-of-sight spacecraft acceleration can be measured from small shifts in the frequency of the radio signal, together with the measurement of the distance from the spacecraft to a station on Earth. Since the gravitational field of the Moon affects the orbit of a spacecraft, these tracking data can be inverted to recover gravity anomalies. However, because of the Moon's synchronous rotation it is not possible to track spacecraft much beyond the limbs of the Moon, so the far-side gravity field has only been poorly characterized. The gravitational acceleration at the surface of the Moon is 1.6249 m/s², about 16.6% of that at Earth's surface (roughly 1/6 of Earth gravity). Over the entire surface, the gravity variation is about 0.0253 m/s² (1.6% of the surface acceleration). Because weight depends directly on gravitational acceleration, objects on the Moon weigh only about 1/6 of what they weigh on Earth.
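The 1/6 figure quoted above is easy to check numerically. This short sketch (the 70 kg example mass is my own, not from the post) computes the ratio of the surface gravities and the corresponding weight of an object on each body:

```python
# Quick check of the lunar surface-gravity figures quoted above.
G_MOON = 1.6249      # m/s^2, lunar surface gravity (from the text)
G_EARTH = 9.80665    # m/s^2, standard gravity on Earth

ratio = G_MOON / G_EARTH  # fraction of Earth gravity

def weight_newtons(mass_kg, g):
    """Weight (in newtons) of a mass under surface gravity g."""
    return mass_kg * g

print(f"g_moon / g_earth = {ratio:.3f}  (~1/6)")
print(f"A 70 kg astronaut weighs {weight_newtons(70, G_MOON):.0f} N on the Moon "
      f"vs {weight_newtons(70, G_EARTH):.0f} N on Earth")
```

The ratio comes out to about 0.166, i.e. the familiar "one sixth of Earth gravity".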
Gravity acceleration at the surface of the Moon in m/s2. Near side on the left, far side on the right. Map from Lunar Gravity Model 2011
The major characteristic of the Moon's gravitational field is the presence of mascons, which are large positive gravity anomalies associated with some of the giant impact basins. These anomalies greatly influence the orbit of spacecraft about the Moon, and an accurate gravitational model is necessary in the planning of both manned and unmanned missions. They were initially discovered by the analysis of Lunar Orbiter tracking data, since navigation tests prior to the Apollo program experienced positioning errors much larger than mission specifications.
The origin of mascons are in part due to the presence of dense mare basaltic lava flows that fill some of the impact basins. However, lava flows by themselves cannot explain the entirety of the gravitational variations, and uplift of the crust-mantle interface is required as well. Based on Lunar Prospector gravitational models, it has been suggested that some mascons exist that do not show evidence for mare basaltic volcanism. The huge expanse of mare basaltic volcanism associated with Oceanus Procellarum does not possess a positive gravity anomaly.
Source: Wikipedia



This movie shows the variations in the lunar gravity field as measured by NASA's Gravity Recovery and Interior Laboratory (GRAIL) during the primary mapping mission from March to May 2012. Very precise microwave measurements between two spacecraft, named Ebb and Flow, were used to map gravity with high precision and high spatial resolution. The field shown resolves blocks on the surface of about 12 miles (20 kilometers), and the measurements are three to five orders of magnitude improved over previous data. Red corresponds to mass excesses and blue corresponds to mass deficiencies. The map shows more small-scale detail on the far side of the Moon compared to the near side because the far side has many more small craters. Image credit: NASA/JPL-Caltech/MIT/GSFC

Monday, December 3, 2012

NASA's Voyager 1 spacecraft has entered a new region

Voyager 1 encounters new region in deep space, NASA says
Credit: NASA/JPL-Caltech/The Johns Hopkins University Applied Physics Laboratory
(Phys.org)—NASA's Voyager 1 spacecraft has entered a new region at the far reaches of our solar system that scientists feel is the final area the spacecraft has to cross before reaching interstellar space.

Read more at: http://phys.org/news/2012-12-voyager-encounters-region-deep-space.html#jCp

Friday, November 30, 2012

New type of heating for ITER


In a tokamak, blanket modules coat the inside of the chamber and directly face the hot plasma. In ITER, certain modules will be used to test tritium breeding concepts. Photo: Tore Supra tokamak, CEA Cadarache.
(Phys.org)—Tests of the heating that is to bring the plasma of the ITER international fusion test reactor to a temperature of many million degrees can go ahead from today: after three years of construction, the Max Planck Institute for Plasma Physics (IPP) at Garching bei München has officially commissioned the ELISE test rig – the world's largest device of its kind and part of a four-million-euro research contract with the "Fusion for Energy" European ITER agency. The centerpiece of the device is an innovative high-frequency ion source developed at IPP. On the ELISE test rig it will now be adapted to the demanding requirements of ITER.
ITER (originally an acronym of International Thermonuclear Experimental Reactor) is an international nuclear fusion research and engineering project, which is currently building the world's largest and most advanced experimental tokamak nuclear fusion reactor at the Cadarache facility in the south of France.
More info here: www.iter.org
Read more articles here: ITER

Friday, November 16, 2012

Heisenberg's Uncertainty Principle


Image from the article
"Quantum computers could overturn Heisenberg’s uncertainty principle".
Credit: io9
In quantum mechanics, the uncertainty principle is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle, such as position x and momentum p, can be known simultaneously. The more precisely the position of some particle is determined, the less precisely its momentum can be known, and vice versa. The original heuristic argument that such a limit should exist was given by Werner Heisenberg in 1927, after whom it is sometimes named, as the Heisenberg principle. A more formal inequality relating the standard deviation of position σx and the standard deviation of momentum σp was derived by Earle Hesse Kennard later that year (and independently by Hermann Weyl in 1928),
σx σp ≥ ħ/2,
where ħ is the reduced Planck constant.
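To put a number on the Kennard inequality, here is a minimal sketch (the electron and the 1 Å confinement scale are my own illustrative choices, not from the article) that computes the smallest momentum spread allowed when a particle is confined to an atomic-scale region:

```python
# Numerical illustration of σx·σp ≥ ħ/2: confine an electron to roughly
# one ångström (about an atomic diameter) and find the minimum momentum
# spread the inequality forces on it.
HBAR = 1.054571817e-34   # reduced Planck constant, J·s
M_E = 9.1093837015e-31   # electron mass, kg

def min_momentum_spread(sigma_x):
    """Smallest σp allowed by the Kennard inequality for a given σx."""
    return HBAR / (2 * sigma_x)

sigma_x = 1e-10                        # ~1 ångström, m
sigma_p = min_momentum_spread(sigma_x)
print(f"σp ≥ {sigma_p:.3e} kg·m/s")
print(f"velocity spread ≥ {sigma_p / M_E:.3e} m/s")
```

The resulting velocity spread is on the order of hundreds of kilometers per second, which is why electrons confined in atoms cannot be thought of as sitting still.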
Graphical interpretation of the Uncertainty Principle. Credit: HyperPhysics
The first formulation of the uncertainty principle is due to Heisenberg. In its present form it is an epistemological principle, since it limits what we can know about the electron. From "elementary formulae of the Compton effect" Heisenberg estimated the 'imprecisions' to be of the order
δpδq ∼ h               (1)

The first mathematically exact formulation of the uncertainty relations is due to Kennard. He proved in 1927 the theorem that for all normalized state vectors |ψ> the following inequality holds:
Δψp Δψq ≥ ℏ/2       (2)

where, Δψp and Δψq are standard deviations of position and momentum in the state vector |ψ>.
Since the above inequalities have the virtue of being exact and general, in contrast to Heisenberg's original semi-quantitative formulation, it is tempting to regard them as the exact counterpart of Heisenberg's relation (1). Indeed, such was Heisenberg's own view. In his Chicago Lectures (Heisenberg 1930, pp. 15-19), he presented Kennard's derivation of relation (2) and claimed that "this proof does not differ at all in mathematical content" from the semi-quantitative argument he had presented earlier, the only difference being that now "the proof is carried through exactly".
So the above inequalities can be regarded as showing that the formalism is consistent with Heisenberg's empirical principle.
This situation is similar to that arising in other theories of principle where one often finds that, next to an empirical principle, the formalism also provides a corresponding theorem. And similarly, this situation should not, by itself, cast doubt on the question whether Heisenberg's relation can be regarded as a principle of quantum mechanics.
There is a second notable difference between (1) and (2). Heisenberg did not give a general definition for the ‘uncertainties’ δp and δq. The most definite remark he made about them was that they could be taken as "something like the mean error". In the discussions of thought experiments, he and Bohr would always quantify uncertainties on a case-to-case basis by choosing some parameters which happened to be relevant to the experiment at hand.

Sunday, November 11, 2012

Length vs Speed

I don't know who made this graphic, but it's a great way to compare length vs speed and the Physics that can be applied.

Gradient Sun

The images in this video show an unfiltered image of the Sun next to one that has been processed using a gradient filter. Note how the coronal loops are sharp and defined, making them all the easier to study. On the other hand, gradients also make great art. Watch the video to see how the sharp loops on the Sun next to the fuzzier areas in the lower solar atmosphere provide a dazzling show.

Thursday, November 8, 2012

Discovery of X-Rays


Today we celebrate the discovery of the X-ray, an important part of the electromagnetic spectrum.
I have published several posts on this blog about radiation, with a focus on X-rays. Now, let's find out who discovered this radiation and how.

Wilhelm Conrad Röntgen (27 March 1845 – 10 February 1923) was a German physicist who, on 8 November 1895, produced and detected electromagnetic radiation in a wavelength range today known as X-rays or Röntgen rays, an achievement that earned him the first Nobel Prize in Physics in 1901. His experiments involved passing an electric current through gases at extremely low pressure. On 8 November 1895, while experimenting, he observed that certain rays were emitted as the current passed through the discharge tube. His experiment, carried out in a totally dark room with a well-covered discharge tube, resulted in the emission of rays that illuminated a screen coated with barium platinocyanide. The screen became fluorescent even though it was placed two meters away from the discharge tube.


Early X-ray images

He continued his experiments using photographic plates to capture images of various objects of varying thickness placed in the path of the rays. He generated the very first "roentgenogram" by developing the image of his wife's hand and analyzed the variable transparency shown by her bones, flesh and wedding ring. Based on his subsequent research and experiments, he declared that X-ray beams are produced by the impact of cathode rays on material objects.

Ultra efficient solar cell

Ben-Gurion University develops side-illuminated ultra-efficient solar cell designs

Researchers at Ben-Gurion University of the Negev (BGU) have developed a radically new design for a concentrator solar cell that, when irradiated from the side, generates solar conversion efficiencies which rival, and may eventually surpass, the most ultra-efficient photovoltaics.

New explanation for polar wandering

Modelling palaeomagnetically inferred TPW during the Neoproterozoic.
(a) Rodinian palaeogeographic configuration before the pair of large-amplitude TPW events (green line with 1σ error ellipses), with a total duration of about 15 Myr; (b) schematic showing the results of two numerical simulations. Credit: (c) Nature 491, 244–248. doi:10.1038/nature11571

(Phys.org)—Researchers using computer simulations and modeling have come up with two possible explanations for the phenomenon known as true polar wandering. The team, led by Jessica Creveling of Harvard University, suggests in their paper published in the journal Nature that dramatic shifts in the Earth's surface over millions of years, followed by a return to the previous state, can be explained by bulging at the equator and the elasticity of the planet's outer shell.

Friday, November 2, 2012

NASA | Atomic Interferometry

An atom interferometer is an interferometer that exploits the wave character of atoms. Interferometers are often used to make high-precision comparisons of distances. This can be used to constrain fundamental constants like the gravitational constant, or possibly to detect gravitational waves.
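The "wave character of atoms" comes from the de Broglie relation λ = h / (m·v): the slower and colder the atom, the longer its wavelength, which is what makes atom interferometry practical. A minimal sketch, where rubidium-87 at 1 cm/s is my own illustrative choice of a common cold-atom species, not something specified in the video:

```python
# de Broglie wavelength of a slow atom, illustrating why cold atoms
# behave as waves on optically accessible length scales.
H = 6.62607015e-34       # Planck constant, J·s
AMU = 1.66053906660e-27  # atomic mass unit, kg

def de_broglie_wavelength(mass_kg, speed_m_s):
    """de Broglie wavelength λ = h / (m·v) of a particle."""
    return H / (mass_kg * speed_m_s)

m_rb87 = 87 * AMU                           # rubidium-87, a common cold-atom species
lam = de_broglie_wavelength(m_rb87, 0.01)   # atom moving at 1 cm/s
print(f"λ ≈ {lam * 1e9:.0f} nm")
```

At 1 cm/s the wavelength comes out to a few hundred nanometers, comparable to visible light, so standard interferometric techniques can resolve the atomic matter waves.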
