
Where Did We Come From?


The question of creation has stirred the hearts of human beings since prehistoric times. As civilizations and cultures emerged around the globe, each took its turn explaining its own beginnings and those of the world around it. Many begin their tale from darkness and void.

According to the writer Xu Zheng, the ancient Chinese had one such myth–the story of Pangu. A formless chaos gave rise to a cosmic egg that contained two opposing principles, Yin and Yang. When these two came into balance, Pangu awoke and began creating the world. He separated the two, with Yin forming the Earth and Yang the heavens above. Each day as Pangu grew, he pushed the sky (Yang) higher while the earth (Yin) beneath his feet grew wider. After 18,000 years had elapsed, Pangu, now confident that the earth and sky were firmly fixed, lay down to rest. His body became the mountains, his blood the rivers, his breath the wind, his voice the thunder, and his eyes the sun and moon.

Likewise, the Ancient Greeks begin with a world in empty darkness, populated only by the black-winged bird, Nyx. After a time Nyx laid a golden egg, which hatched to reveal Eros, the god of love. The shell of the egg separated into two halves. One part ascended to become the sky and the other became the Earth. Eros named the sky Uranus and the Earth Gaia. Under Eros’ influence, Uranus and Gaia fell in love and produced children. Two of these children, the Titans Kronos and Rhea, in turn produced offspring of their own, one of whom was the famed Zeus. Fearful that his infant children would one day usurp his position, Kronos swallowed them whole. Only Zeus managed to escape this horrible fate and upon growing to manhood tricked Kronos into releasing his brothers and sisters. Once together, they waged war against Kronos. Eventually, Zeus and his siblings triumphed, bringing life to Gaia and populating Uranus with stars. Zeus instructed his son Prometheus to create human beings and his son Epimetheus to create the animals. From this cast of characters, the Ancient Greeks created a mythology of interwoven stories that still captivates readers today.

The Ancient Egyptians of Heliopolis also believed the world began in chaos but a chaos consisting of turbulent waters called Nu. Eventually, the waters receded to expose dry land. Once land appeared, Re-Atum willed himself into being. As he was alone, he mated with his shadow and produced two offspring: Shu, his son, and Tefnut, his daughter. Shu represented the air and life, while Tefnut represented rain and the principle of order. Originally, Re-Atum was separated from his children in the great chaos. However, once re-united he created the world, with his tears of joy creating humankind. Shu and Tefnut bore Geb, the earth god, and Nut, goddess of the sky. The union of these two, in turn, produced Osiris, Isis, Seth, and Nephthys. Re-Atum took on many names as the sun-god began his ascent across the sky. When he reached midday, he became Ra and when he set in the west, Horus.

The followers of the Prophet Abraham placed such importance on explaining creation that they dedicated the first book of their religious text, the Torah, to its answer. Many know this as the first book of the Christian Bible, Genesis. Like the Greek and Chinese myths, it describes a duality of creation:

1 In the beginning God created the heaven and the earth.

And just as with the Egyptians, waters represented the initial chaotic void, followed by the emergence of the sun.

2 And the earth was without form, and void; and darkness was upon the face of the deep. And the Spirit of God moved upon the face of the waters.

3 And God said, Let there be light: and there was light.

4 And God saw the light, that it was good: and God divided the light from the darkness.

5 And God called the light Day, and the darkness he called Night. And the evening and the morning were the first day.

Genesis recounts this as all happening on the first day of creation, with the following five days bringing the creation of land, vegetation, the seasons, the stars, and animals, culminating with man and woman. Finally, the story concludes with:

…and he rested on the seventh day from all his work.

While these stories captivate with their picturesque narratives and perhaps even offer an intuitive inkling as to the actual origin of the universe, they tell us more about the values and beliefs of the myth makers than they offer a factual description of how things actually came to be. However, bit by bit, humanity has gained better insights into the origin of our universe, allowing us to see it as grander and ever more ancient than previously imagined. We now live in a most fortuitous time. In what is most likely the greatest testament to humankind’s curiosity and mental prowess, the scientific community has managed to formulate and then validate–through a variety of innovative and sometimes quite sophisticated experiments–enough interlocking theories that they can now be assembled like so many jigsaw puzzle pieces. Together, they allow us finally to peer back to the dawn of creation and see the sequential processes that resulted in the genesis of our world and the human species.

While centuries of astronomical observations laid the foundation for our current understanding, the “smoking gun” as to the origin of creation was discovered in 1965. For in that year, Arno Penzias and Robert Wilson, researchers at Bell Laboratories, observed a low-level background “noise” signal in their microwave receiver. No matter how many times they cleaned, recalibrated, or repositioned their equipment, the signal persisted. It finally occurred to them that maybe there wasn’t anything wrong with the equipment. Perhaps this ubiquitous signal represented something that was actually there. But what event would cause such a signal throughout the entire universe?

In 1927, Georges Lemaître, a Roman Catholic priest who also happened to be an astronomer and professor of physics, proposed a revolutionary theory to explain the recession of galactic objects (spiral nebulae). Using Einstein’s theory of general relativity, he suggested that the universe did not always exist as it does today but began with a violent explosion from a “primeval atom.” This theory later came to be known as the Big Bang, ironically named by its most prominent skeptic, Dr. Fred Hoyle.

Opponents of the Big Bang, such as Hoyle, dismissed the theory–partly based on its bizarre and far-reaching consequences and partly due to its inherent parallel to religious creation mythology. Hoyle instead proposed an alternate explanation, known as the Steady State theory. It envisioned a universe that always existed and always would exist in more or less the same condition as we observe it today.

In 1948, George Gamow, a Russian physicist and cosmologist, predicted that if the universe did start as Lemaître proposed, some remnant of this initial explosion should still exist today as Cosmic Microwave Background Radiation (CMB). In much the same way as a fireplace retains warmth long after the fire has burned out and the cinders stop glowing, so should the Big Bang leave such a trace of its fiery past.

After some calculation and consultation within the scientific community, Penzias and Wilson finally concluded that the radiation detected by their instruments was the very radiation proposed by Gamow–the afterglow of the Big Bang. In addition to earning them the Nobel Prize, their discovery supplied the conclusive proof that Lemaître had it right and not Hoyle. The universe had a definite beginning and a fantastically bizarre one at that.
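A back-of-the-envelope check shows why this afterglow turns up in the microwave band. The text does not quote the radiation’s temperature; using the measured present-day value of about 2.725 K (an input supplied here, not from the text), Wien’s displacement law gives the wavelength at which the afterglow peaks:

```latex
% Wien's displacement law: peak wavelength of a blackbody at temperature T
% (T = 2.725 K is the measured CMB temperature, an assumed input here)
\lambda_{\text{max}} = \frac{b}{T}
  = \frac{2.898 \times 10^{-3}\ \text{m·K}}{2.725\ \text{K}}
  \approx 1.1 \times 10^{-3}\ \text{m} \approx 1\ \text{mm}
```

A peak near one millimeter lies squarely in the microwave region, which is why the signal appeared in Penzias and Wilson’s microwave receiver rather than in an optical telescope.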

Modern day astrophysicists, such as Dr. George Smoot of Lawrence Berkeley National Laboratory, have further refined this understanding by taking extremely accurate measurements of this background radiation using sophisticated microwave sensors aboard satellites. Measurements by instruments such as COBE (Cosmic Background Explorer) and its successor WMAP (Wilkinson Microwave Anisotropy Probe) have pushed the boundaries of experimental physics and revealed minute variations in the cosmic background radiation. These variations trace back to quantum mechanical fluctuations in the microscopic proto-universe and have exposed the skeletal frame upon which all observable matter clings, allowing scientists to finally piece together the story of the universe in amazing detail.

According to the Big Bang, the universe started out infinitely small and extremely hot. In this realm, our everyday notions based on the classical laws of physics did not yet apply, as these laws had yet to manifest themselves. However, at this earliest of times, it appears that the rules of quantum mechanics still existed intact, directing the operation of the proto-universe.

Quantum mechanics is the branch of physics developed in the early part of the twentieth century that focuses on the characteristics of the extremely small: protons, electrons, and other subatomic particles. Its fundamental postulates are that matter and energy exist only in discrete units and that their properties can never be exactly measured, only determined within certain limits. This may seem strange, but quantum mechanics has proven to be one of the most successful theories ever developed and has provided engineers the mathematical tools to design our modern day computers and communication networks. It is more than a bit ironic that something as enormous as the universe would be associated with quantum properties, for the operation of the macroscopic cosmos typically falls into the realm of the classical and relativistic physics discovered by Newton and Einstein.
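To give that discreteness a concrete face (a standard textbook illustration, not drawn from the text itself), consider the energy carried by a single packet of green light. The wavelength of 530 nm is an assumed, illustrative value:

```latex
% Energy of one photon of green light via Planck's relation E = h\nu
% (wavelength 530 nm is an assumed illustrative value)
E = h\nu = \frac{hc}{\lambda}
  = \frac{(6.626 \times 10^{-34}\ \text{J·s})(3.0 \times 10^{8}\ \text{m/s})}
         {5.3 \times 10^{-7}\ \text{m}}
  \approx 3.8 \times 10^{-19}\ \text{J}
```

Light of that color is delivered only in whole multiples of this minuscule packet, never in fractions of it–this is the "discrete units" the postulate refers to.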

Scientists have long worked to reconcile relativistic descriptions of the macroscopic cosmos with the quantum mechanical operations of subatomic structures as formulated by the quantum pioneers Niels Bohr, Werner Heisenberg, and Erwin Schrödinger. To this day the struggle persists, with proponents of string theory attempting to bridge the gap. However, it does appear that at the earliest of times quantum mechanical principles remained intact, making quantum mechanics the more fundamental description of the universe.

This offers some interesting possibilities, in that quantum mechanics allows for a reality in which the seemingly impossible can happen–but only within certain limits. According to quantum theory, an object (such as a particle) does not have an exact position or amount of energy. Instead, these values are determined by a probability distribution function. This notion violates our common sense. Some of the more practical among us would undoubtedly try to design an “experiment” that precisely identifies these variables. However, quantum mechanical principles would always frustrate their efforts due to restrictions imposed by the famous uncertainty principle developed by Heisenberg, for it places a fundamental limit on the measurement accuracy of any experiment. In order to precisely know an object’s position (x), we must sacrifice accuracy in the measurement of its momentum (p):

Δx·Δp ≥ h/4π, where h is Planck’s constant (6.626 × 10⁻³⁴ joule-sec)

We see a similar limitation with respect to the measurement of a particle’s energy (ΔE) and the time interval (Δt) over which it is measured:

ΔE·Δt ≥ h/4π

Since h is such a small number, these uncertainties are quite negligible in our worldly and celestial dealings. However, in the world of the minuscule, they cannot be so easily dismissed. The field of quantum mechanics offers more than just a limit on how accurately we can measure in a small unit of time, for it actually allows particles to take on exaggerated values within that window and behave accordingly. One example of this is the effect known as quantum mechanical tunneling in semiconductors. Here an electron essentially passes “through” a normally insulating material, even though it appears to have insufficient energy to do so. However, quantum principles enable it to take on an exaggerated energy value long enough to “jump” through the gap, provided the material is thin enough (i.e., the transit time through is short enough).
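Two rough examples make this contrast between the minuscule and the everyday concrete. The masses and position uncertainties below are assumed, purely illustrative values, not figures from the text:

```latex
% Minimum momentum uncertainty for an electron confined to an atom
% (\Delta x = 10^{-10} m, roughly one atomic diameter, is an assumed value)
\Delta p \geq \frac{h}{4\pi\,\Delta x}
  = \frac{6.626 \times 10^{-34}\ \text{J·s}}{4\pi \times 10^{-10}\ \text{m}}
  \approx 5.3 \times 10^{-25}\ \text{kg·m/s}
  \;\Rightarrow\;
  \Delta v = \frac{\Delta p}{m_e} \approx 5.8 \times 10^{5}\ \text{m/s}

% The same bound for a 1-gram dust grain located to within 1 micron
\Delta p \geq \frac{6.626 \times 10^{-34}}{4\pi \times 10^{-6}}
  \approx 5.3 \times 10^{-29}\ \text{kg·m/s}
  \;\Rightarrow\;
  \Delta v \approx 5.3 \times 10^{-26}\ \text{m/s}
```

For the electron, the velocity uncertainty runs to hundreds of kilometers per second; for the dust grain, it is immeasurably small. This is why quantum effects dominate the atomic world yet never intrude on everyday experience.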

Moreover, this ability to essentially “borrow” energy extends to the degree that particles can appear out of the very vacuum of space. Einstein showed through his famous mass-energy equivalence formula (E = mc²) that energy can be converted to mass and vice versa. Given enough of a fluctuation, sufficient energy is present to actually coalesce into matter. These “virtual” particles a short time later “repay” this energy and disappear again. One would think that this would rarely happen due to the considerable increase in energy required; quite to the contrary, it happens routinely. We can confirm the presence of these virtual particles because while they exist, they produce minute perturbations (perhaps one part in a million) in the energy of the atoms or electrons they encounter in their brief lives. In a Nobel Prize-winning effort, Willis Lamb measured this perturbed energy state in the hydrogen atom in 1947.
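An order-of-magnitude estimate shows just how fleeting these virtual particles must be. Assuming, for illustration, an electron-positron pair (the lightest matter-antimatter pair, a choice made here rather than in the text), creating the pair requires ΔE = 2mec², and the energy-time relation above caps how long that loan can last:

```latex
% Energy borrowed to create an electron-positron pair
\Delta E = 2 m_e c^2 = 2 \times (0.511\ \text{MeV})
  \approx 1.6 \times 10^{-13}\ \text{J}

% Maximum lifetime allowed by \Delta E\,\Delta t \geq h/4\pi
\Delta t \sim \frac{h}{4\pi\,\Delta E}
  = \frac{6.626 \times 10^{-34}\ \text{J·s}}{4\pi \times 1.6 \times 10^{-13}\ \text{J}}
  \approx 3 \times 10^{-22}\ \text{s}
```

Within that sliver of time the pair must annihilate and repay the borrowed energy, yet, as Lamb’s measurement showed, it still leaves a detectable fingerprint on nearby atoms.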

The seeming magic of quantum mechanics does not end there, for it even allows for the creation of matter from a state of zero net energy. The trick involves first splitting the “nothing” into both positive and negative energy. Quantum theory then allows the positive energy to be converted into particles and antiparticles, with the negative energy existing as the gravity between them. In the case of our universe, it can be shown that the negative energy from gravity balances out the positive energy from its mass. Essentially, quantum mechanics allows for a “bubble” of spacetime to spontaneously come into creation.
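A crude sketch of this balance (a standard heuristic in cosmology, not a calculation given in the text) compares the positive rest energy of a mass M with the negative gravitational self-energy of that mass spread over a radius R. The figure of 10⁵³ kg for the observable universe is an assumed round number:

```latex
% Positive energy: rest energy of the mass
E_{+} \sim M c^2

% Negative energy: gravitational self-energy of mass M within radius R
E_{-} \sim -\frac{G M^{2}}{R}

% The two cancel when R \sim GM/c^2; for M \sim 10^{53} kg (assumed):
R \sim \frac{G M}{c^{2}}
  = \frac{(6.67 \times 10^{-11})(10^{53})}{(3 \times 10^{8})^{2}}
  \approx 7 \times 10^{25}\ \text{m}
```

That radius is comparable to the scale of the observable universe itself, which is the rough sense in which the universe’s positive and negative energies can sum to nothing.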

To some scientists, this would constitute an essentially random element to the universe’s origin. The universe “may” or “may not” have come to be, its existence dictated entirely by chance. Many find this reliance on chance disturbing and diligently seek a definitive mechanism to drive the creation process. While these two positions initially seem at odds, a properly adopted super-cosmological perspective may eliminate the contradiction. Our notion of the limits imposed by “random events” or “finite probability events” is intricately linked to our notion of time. As we inhabit a reality so dramatically framed by the ever ticking clock, we make a significant distinction between an event that will directly result from an action (i.e., a high likelihood of occurrence) and one that has only the smallest probability of actualization (i.e., not likely).

Einstein bestowed on us the inseparable link between space and time with his concept of merged spacetime. From this, we can conclude that prior to the Big Bang, since space did not exist–neither did time. While this notion will strike most as quite bizarre, it is nonetheless the case. Under these conditions, just the possibility that an event could occur, no matter how unlikely, might be enough for actualization. The “seed” for the universe would be its mere possibility according to quantum mechanics.

Whatever the causal mechanism, using data from WMAP and other scientific observations, cosmologists have determined that the universe began approximately 14 billion years ago–give or take a few hundred million years. Starting as a singularity smaller than the size of an atom, the universe “cracked” open (metaphorically not too different from the cosmic eggs of ancient myth) and rapidly expanded in spacetime. Our newborn universe consisted of a dense soup of particles and antiparticles, their antimatter counterparts. These immediately annihilated each other in a tremendous burst of energy. However, due to an ever so slight imbalance (roughly one part in a billion), a residual amount of matter survived. While the amount of observable matter seems immense, it accounts for no more than 5% of the universe’s composition. Under the widely accepted double dark theory, astrophysicists believe that another 27% exists as “dark matter” and the vast majority (68%) as “dark energy.” Dark matter is simply matter that does not emit electromagnetic radiation (i.e., light or radio waves) but still exerts gravitational attraction. We can, therefore, measure it by its effect on light passing by. However, the exact nature of dark energy is still to be determined.

Considering the immense age of the universe, an astounding amount of transformation took place within the first few minutes. Modern physics has allowed us to see as far back as 10⁻⁴³ seconds, known as the Grand Unification Epoch. At this point, all of the forces of nature were unified as one. However, this period would be short-lived. In the smallest fraction of the first second of time (ending at 10⁻³⁴ sec), the proto-universe experienced a period of rapid inflation–increasing in size by a factor of more than ten thousand trillion trillion (10²⁸). From this point onward, the universe expanded at a still considerable but much slower rate, as predicted by the Big Bang. This rapid expansion produced a more or less homogeneous universe but one in which the ever so slight quantum fluctuations were now preserved on a grand scale. These fluctuations are primarily responsible for where matter, and hence the galaxies, would form.
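To give that factor of 10²⁸ some scale, take, purely for illustration, a starting patch the size of an atomic nucleus (about 10⁻¹⁵ m, an assumed value; the text itself only supplies the expansion factor):

```latex
% A nucleus-sized region stretched by the inflation factor of 10^{28}
10^{-15}\ \text{m} \times 10^{28} = 10^{13}\ \text{m}
  \approx \frac{10^{13}\ \text{m}}{1.5 \times 10^{11}\ \text{m/AU}}
  \approx 70\ \text{AU}
```

A region once far smaller than an atom would have swelled to roughly the size of the solar system, all within that first unimaginably small fraction of a second.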

Coincident with this expansion, the universe began to cool. This cooling allowed three of the fundamental forces of nature (strong nuclear, weak nuclear, and electromagnetic) to decouple from their unified state and direct the assemblage of smaller subatomic particles. After only three minutes, protons and neutrons could stably bind to form atomic nuclei.

It would take an additional 380,000 years or so for the temperature to cool sufficiently for the formation of stable atoms (mainly hydrogen and helium) and a billion years for this early matter to clump together under the force of gravity. Astrophysicists Martin Rees and Joe Silk have proposed a compelling theory that enough mass existed at the center of these gas clouds to form a super-massive black hole. Recent observations confirm that these super-massive black holes do exist at the centers of most galaxies. Matter falling into such a black hole produced a quasar, the shockwave from which triggered the surrounding gas to coalesce into the stars that populate the galaxies. These early stars converted hydrogen, through the process of nuclear fusion, into the nuclei of the first 26 elements of the periodic table, up through iron. Some of the larger of these stars ended their lives in a violently explosive death, known as a supernova. In addition to providing a fantastic visual spectacle, supernova explosions both created the heavier elements and redistributed this newly formed matter back into the universe. It would become the building material for both future planets and life itself.

This drama of star birth and destruction continued until approximately 6-10 billion years ago, when a cluster of stars formed at the outer edge of one of the spiral arms of the galaxy we call the Milky Way. Now spread over an arc tens of thousands of light years long, this cluster initially contained thousands of stars, all within a diameter of only ten light years. The larger of these exploded, seeding our general vicinity with an abundance of heavy elements, including a number of radioactive isotopes that would later play a crucial role in the advancement of life on earth.

Around 4.57 billion years ago, one particular accretion of gases compacted to such a degree that it ignited the nuclear fusion reaction necessary to allow a medium sized star, our sun, to glow bright. The radiation pressure from this ignition, accompanied by the prevailing solar wind, swept the lighter elements to the outer reaches of the solar system, leaving only four rocky-cored survivors close in. The third member of this family, the Earth, was to eventually become our beautiful and life-sustaining home.

During its early history, numerous asteroids and comets regularly collided with Earth, keeping the surface a searing bath of molten rock. Around 4.5 billion years ago, it is believed that the Earth experienced a collision of unprecedented magnitude. A companion body roughly the size of Mars, named Theia, hit the Earth off center. It ejected a large portion of Earth’s mantle into space and most probably caused the planet’s axis to tilt to the 23° responsible for Earth’s yearly seasons. The combined ejected material would become our Moon. Other celestial bodies also visited the early Earth, bringing additional bulk material, complex carbon chains, and much-needed water to the planet. It took another one hundred million years or so for things to cool down and rocks to solidify. As the temperature continued to cool, starting around 4.2 billion years ago, the water vapor condensed as rain. This first storm must have continued for a considerable time, hundreds of thousands or possibly even a million years, covering the planet and forming a single great ocean. The turbulent waters of our ancient myths did, in fact, exist in Earth’s early history.

Between 4.1 and 3.7 billion years ago, during a period known as the “Late Heavy Bombardment,” a barrage of asteroidal and/or cometary materials invaded the inner solar system–plowing through the Earth’s virgin ocean, vaporizing it, and re-melting much of its crust. One can readily see direct evidence of Earth’s turbulent past by observing the numerous, uneroded craters pockmarking the moon.

After a time, the crust once again solidified and Earth regained its global ocean. The only land was small basaltic islands, created through volcanism. The Earth appeared quite different in these early days; the sky shone pinkish-orange from abundant quantities of carbon compounds (carbon dioxide, carbon monoxide, and methane) and the seas reddish-brown due to their high iron content. The erosive seas made short work of these soft, volcanic basalt outcrops. The planet would probably look much the same today were it not for particular chemical reactions that were to follow.

In the early 1950s, chemists Stanley Miller and Harold Urey showed that by heating and then passing an electric arc through a mixture of water vapor, methane, ammonia, and hydrogen, they were able to create a range of organic compounds. These included amino acids, the fundamental building blocks of all proteins. Initially, it was believed that these conditions mirrored Earth’s early atmosphere, with photonic energy from the sun driving the process. However, Robert Ballard, the famed oceanographer and discoverer of the Titanic, found in the 1970s complex life forms at depths of nearly 8,000 feet below the surface–too deep for sunlight to penetrate. Instead, these creatures lived off the nutrients and heat from undersea volcanic vents. Based on the iron-sulfur world theory, we have now come to understand that the conditions for the initial generation of life on Earth most likely occurred near these hydrothermal undersea vents–instead of in the atmosphere. Furthermore, it is now believed that the various minerals present at these vents acted as catalysts, helping jump-start the life process. More recent discoveries have also found amino acids in meteorites. This would indicate that these basic building blocks of life need not have a terrestrial origin and may pervade the cosmos.

Next, through processes only partially understood, these amino acids formed more complex proteins and other molecules, some eventually becoming capable of self-replication. Miraculously, some of these managed to survive the numerous environmental stresses experienced by the early earth and went on to fill the seas. Around 3.5 billion years ago, the first “organized” life forms of bacteria and algae appeared.

Life continued in this limited fashion for another 2.5 billion years. During that time these simple life forms had a profound effect on the planet. Through the process of photosynthesis, early cyanobacteria began extracting carbon from the abundant supply of CO2 in the atmosphere and replacing it with oxygen. This oxygen reacted with the iron-rich oceans, precipitating out iron oxide (rust), and bonded with hydrogen to form additional water vapor. Eventually, the abundance of free oxygen would “poison” the prevalent anaerobic bacteria, causing a dramatic shift to those organisms that could coexist in this new world. Furthermore, superheating of volcanic basalt in the presence of water produces much harder and lighter crystalline rocks, such as granite. This material would both better resist the erosive forces of nature and essentially “float” on the earth’s mantle, forming the basis for our modern continents. Sometimes united into one, sometimes split apart as separate land masses, the land would now finally stand apart from the sea.

The next major breakthrough occurred approximately one billion years ago, when collections of cells began to cooperate and differentiate. This coincided with the formation of the first supercontinent, Rodinia. However, the promising future of planet Earth would next face its greatest challenge, for during the Cryogenian period (850-630 million years ago) the planet would experience a number of devastating ice ages. Some of these were so severe that it is believed glaciers covered much of the oceans and extended down to the equator.

Through a combination of effects the earth eventually warmed, once again returning to the predominantly blue planet we know. The now favorable environmental conditions–coupled with a substantial rise in sea level–provided a wide open niche for those organisms that survived the previous periods of glaciation to flourish. This dramatic proliferation of diverse life forms, named the Cambrian explosion, started between 540 and 500 million years ago. It populated the oceans, giving rise to shelled creatures and then the first vertebrates. One branch of vertebrates evolved into the fish that still fill our modern oceans. One such species, the Coelacanth, even survives to this day, 360 million years since it first appeared on the scene.

While the oceans teemed with creatures large and small, life had not yet ventured onto dry land. Then, between 445 and 365 million years ago, the first land plants appeared. Initially, these were simple algae, found near the water’s edge. Next came spore-bearing vascular plants such as Rhynia, a simple plant with a few bifurcated stems, each bearing a spherical spore pod (sporangium). These would give rise to ferns and, later, to the first seed-bearing plants. It would take almost an additional 100 million years for the first trees, Archaeopteris, to appear. These looked not unlike Christmas trees and would eventually cover the earth in vast forests.

Vegetation on dry land presented an entirely new ecosystem to host an additional explosion of animal life. As these land-based plants continued to extract carbon from the atmosphere, the oxygen level correspondingly increased. In fact, it would reach levels that far exceeded those of the present day. These elevated levels allowed the first early animal colonists, arachnids and insects, to grow to gigantic proportions. Next, out of the sea came the amphibians, which in turn gave rise to the reptiles. One branch, the dinosaurs, would reign supreme for 150 million years. They dominated every continent, ranging in size from that of a small chicken (Hesperonychus elizabethae) to the giant sauropod Brachiosaurus (80 feet long and weighing 50 tons). The age of the dinosaurs lasted until 65 million years ago; then, they abruptly vanished. Most scientists now believe that the impact of a large asteroid off the Yucatan peninsula caused their demise. Known as the K-T (or K-Pg) extinction event, the impact resulted in worldwide devastation and the loss of 75% of the world’s species. However, not all of the dinosaurs perished. One group of small feathered theropods, typified by Deinonychus antirrhopus, sported decorative plumage. In their relative Archaeopteryx, this plumage had already evolved into something substantially more useful–wings for flight. This distinctive survival advantage allowed the feathered line to thrive, leading the way to the many thousands of bird species that fill the skies today.

While the mass extinction ended the dinosaurs’ reign, it cleared the way for another group of animals to develop and prosper, the mammals. These animals differed from their reptilian ancestors in that they were warm-blooded, cared for their offspring, and even internally produced food for their young (i.e., milk). One such group, the primates, showed particular promise. These small, tree-dwelling creatures (akin to modern day lemurs) developed complex social interactions, emotions, and the ability to use sticks as primitive tools.

Recent fossil evidence from the species Darwinius masillae (the Ida fossil) suggests that around 47 million years ago a branch of these early primates split off from their ancestors and developed into the anthropoids (now designated by the classification Haplorhini), though Ida’s exact position in the family tree remains debated. This group would give rise to the monkeys and the great apes, including chimpanzees and orangutans, and finally Homo sapiens sapiens (modern humans). The myriad of fossils uncovered in Africa show that Darwin clearly had it right: human beings did evolve from an ancient “monkey.” Many of the finer details are yet to be worked out. However, decades of painstaking excavations, accompanied by meticulous bone fragment examinations and comprehensive physiological modeling, have allowed anthropologists for the first time to construct a fairly complete outline of this progression.

Anthropologists now believe that the seven-million-year-old fossil of Sahelanthropus tchadensis (known as Toumai) represents the first human ancestor, for it exhibited a continuous brow ridge and short canine teeth–traits shared only with the genus Homo. However, while one can debate whether any particular fossil is a direct link to modern humans or a distant cousin (for it now appears that a number of hominid variations coexisted at times, with overlaps of more advanced and archaic species), the chronological sequence of changes observed in these creatures leads directly to those traits that distinguish humanity from his Haplorhini brethren.

The first major step along this road occurred with just that, a step. The six-million-year-old fossil of Orrorin tugenensis (dubbed Millennium Man, presumably due to its initial discovery in the year 2000) indicated that it walked upright. This represented a significant shift from our primate ancestors, who resided primarily in the trees, and was most likely a response to the changing African environment, from that of dense forest to forest patches surrounded by savannah. Walking upright allowed Millennium Man to use his arms and hands to carry more food further, most likely back to his mate, and would have presented a clear survival advantage. The fossil evidence from later forms, such as the 5.7 million-year-old Ardipithecus kadabba and 4.4 million-year-old Ardipithecus ramidus (nicknamed Ardi), continues to show evolution in the lower frame, resulting in more efficient bipedalism. However, these hominids still spent a considerable amount of time in the forest canopy and from the waist up looked more ape than human.

The next major advance occurred with the genus Australopithecus, which lived between four and two million years ago. From the hundreds of fossils discovered, anthropologists have identified seven distinct species–the most publicized being the fossil nicknamed “Lucy” (from the species Australopithecus afarensis). Although Australopithecine brains were not much larger than those of Ardipithecus or chimpanzees (380-530 cm³, versus roughly 300-350 cm³ for Ardipithecus and 380-400 cm³ for chimpanzees), they did show a marked difference in structure from these earlier forms. First, their rounded shape more closely resembled that of a human brain. Second, cranial casts show that the lunate sulcus, a groove in the occipital lobe of the brain of primates, moved further toward the back of the skull. This would imply that the frontal lobe, that portion of the brain responsible for cognitive reasoning, occupied a proportionally larger section. Essentially, Australopithecus brains were being reapportioned toward thinking. Additionally, refinements to the hips and legs allowed these creatures to walk more efficiently and over greater distances. Coincident with this, the size of the hair follicles began to shrink over most areas of their bodies. This process would eventually lead to hominids losing the thick, matted fur coats donned by their ancestors, allowing them to better regulate body heat through the process of sweating. These creatures ably survived the slowly drying African landscape for more than two million years.

Then, around 2.5 million years ago, the environment of the African Rift Valley began to experience dramatic, cyclical swings. For the next 200,000 years, the climate fluctuated between drought and torrential rains. This changing landscape put pressures on all those who called this region home, including the various species of hominids; many would not survive. However, out of this turmoil would arise Homo habilis (skillful man), the first line to be considered true humans.

Analysis of Homo habilis remains shows that their brains continued to grow and their facial features softened. Although still small compared to a modern human, their brains showed about a 50% size increase compared to the Australopithecines. Moreover, a bulge in the section known as Broca’s area indicated that this hominid could have been capable of rudimentary speech. Another seemingly minor change would have a profound impact; a small bone in their thumb considerably broadened, allowing them to grasp objects with precision just as modern humans can. This physiological development relates to the most profound discovery associated with Homo habilis, as excavation sites of their remains are littered with crafted stone tools. These tools allowed them to scrape meat off bones and then crush the bones to extract the marrow. A meat diet provided the necessary protein to allow their brains to continue to grow.

Around 1.5 million years ago, archeologists note the arrival of a new species, Homo ergaster. Homo ergaster possessed a more recognizably human face, with a more rounded skull and projecting nose. They bore a muscular but essentially human build, with short arms and long legs (scientists estimate their height at 6 feet or more) and probably had no more hair than modern humans. Their appearance would have resembled what most people typically think of when they evoke the term “cave man.” Homo ergaster was also the first to have what we would recognize as a human voice and probably expressed themselves with rudimentary speech. With an average brain size of 700-900 cm³ (twice that of their australopithecine ancestors), they could now contemplate more complex tools and were the first to develop the Acheulean hand ax, a tool worked on both faces. In addition to tool making, this species possessed another breakthrough technology–fire. Fossil sites show the charcoal remains usually associated with campfires. In addition to producing light and offering protection, fire offered the opportunity to cook one’s food. This produced a more easily digestible meal, further satisfying the brain’s ever increasing need for protein. Anthropologists also believe that the communal settings of a hunt and campfire, coupled with vocal capability, set the stage for increased social interaction and cooperation. These traits allowed Homo ergaster to survive the perils of life against larger, stronger, and swifter adversaries. Furthermore, this became a forward propelling cycle. A larger brain enabled more complex tool making and hunting strategies, which in turn required more cerebral development to communicate and teach this knowledge to others and the young. Increased mental capacity, in turn, prompted even more sophisticated planning, which required even more cerebral capacity, and so forth. Around 300,000 years ago, after countless millennia of this cycle, brain size had grown to 900-1,100 cm³, close to that of modern humans.

Homo ergaster not only fared well on the African continent but also spread to Europe and Asia, becoming known as Homo erectus (upright man). Over the next million years the eastward lineage would make it as far as China (Peking Man) and Java (Java Man). There is even evidence of populations in Europe. Currently, archeologists seem split on whether Homo erectus is a direct line to modern humans or a side branch that eventually died out 300,000 years ago. However, they do agree that modern humans originated from the line that remained in Africa.

Scientists originally grouped the transitional specimens between Homo ergaster/erectus and modern humans into the catch-all category of Archaic Homo sapiens. However, the current consensus designates a new species, Homo heidelbergensis (emerging around 800,000 years ago), as the bridge. Homo heidelbergensis had the largest braincase of any hominid to date (1,100-1,400 cm³), rivaling that of modern humans. Furthermore, larger frontal and parietal lobes show a continued reorganization of the brain toward abstract reasoning and spatial representation.

Finally, between 500,000 and 300,000 years ago, we see the next major transition, the advent of Homo sapiens (man the wise). Mitochondrial DNA studies coupled with fossil evidence suggest that our subspecies, Homo sapiens sapiens, originated in Africa around 200,000 years ago. In addition to problem solving and abstract reasoning, Homo sapiens had the ability to communicate complex ideas using language and engaged in extensive social interaction. Moreover, Homo sapiens buried their dead and there is evidence of sophisticated mortuary customs. These often included the use of ochre, a red pigment made from hematite. These practices show the beginning of symbolic thought.

By 90,000 years ago, evidence of the line can be found in the Middle East. By 40,000 years ago, Homo sapiens had spread to all parts of the Old World. However, initially, he was not alone.

In Europe, a fellow descendant of a previous migration of Homo heidelbergensis, Homo sapiens neanderthalensis (Neanderthal man), had already established a presence. Modern humans and Neanderthals coexisted until about 30,000 years ago, when all traces of Neanderthal disappear from the fossil record. The debate over the cause for their disappearance continues to this day, but it appears most likely that Homo sapiens, equipped with better hunting techniques, a leaner body physique, and a more varied diet, simply out-reproduced his Neanderthal cousin.

Unlike Homo sapiens, who dined opportunistically on game, fish, and grains, fossil analysis indicates that Neanderthals consumed almost exclusively meat. Moreover, the bulky, dense frame of the Neanderthal required more than twice the daily caloric intake of his Homo sapiens cousin. Finally, Homo sapiens had developed more efficient hunting techniques that allowed them to capture more prey with less effort. These factors allowed more of their offspring to survive, effectively displacing Neanderthals from the evolutionary landscape. However, a trace of the Neanderthals remains to this day, for recent analyses show that modern Europeans and Asians carry between one and four percent Neanderthal DNA in their genetic code. What exactly happened to the Neanderthals may forever remain a mystery, but from that point onward, as the sole surviving hominid, Homo sapiens sapiens alone would inherit the earth.

When humanity first became self-aware is unclear. Around 50,000 years ago (corresponding to a short temperate period in the midst of the last ice age) migrations of modern humans arrived in Europe. By this time people had already learned to weave clothes and make extensive use of tools. Moreover, these tools were not mere utilitarian objects but carried elaborate decoration. These Paleolithic tribes knew how to paint, sculpt, make music, and possibly even track the seasons with primitive calendars. They were the first artists. Most strikingly, they performed burial rituals, dressing their dead in elaborately decorated garments. This clearly indicates that the burial ritual was more than just a method of disposing of a body; it must also have held deep meaning. In all human societies of historical record, adornment of the dead strongly indicates a belief in an afterlife. Especially in Paleolithic times, such objects required scarce raw materials and considerable time to craft. The participants in the burial ritual would not have disposed of such valuable items in a grave unless they strongly believed that the deceased would benefit from them and that some part of their essence would survive death.

Although initially nomadic, around 10,000 years ago Paleolithic peoples would make two important discoveries that would forever shape the way we live: agriculture and the domestication of animals. However, humankind’s benefit from agriculture was considerably aided by a most fortunate accident of nature. At the end of the Ice Age, a relatively anemic strain of wild wheat crossbred with a goat grass to form a fertile hybrid, emmer (Triticum dicoccon). Emmer was much plumper than wild wheat, with seeds that naturally scattered in the wind. The cultivated emmer grain again crossbred with another wild goat grass to produce a yet larger fertile hybrid, what we call bread wheat (Triticum aestivum). Bread wheat produces even plumper, more densely packed seed kernels and therefore generates significantly higher nutritional content in a small land area compared to other types of grains. However, due to their large size, these kernels cannot propagate naturally and rely entirely on human cultivation. Humanity and bread wheat were now inextricably linked together through agriculture.
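The genetics behind these two crossings helps explain why each hybrid grew plumper: every cross stacked a complete extra set of chromosomes onto the last. The chromosome counts and genome letters below follow the standard account from plant genetics and are not given in the text itself:

```latex
% Each letter pair denotes a complete genome; 2n counts chromosomes
\underbrace{AA}_{\text{wild wheat},\ 2n=14}
  \times \underbrace{BB}_{\text{goat grass},\ 2n=14}
  \longrightarrow \underbrace{AABB}_{\text{emmer},\ 2n=28}

\underbrace{AABB}_{\text{emmer},\ 2n=28}
  \times \underbrace{DD}_{\text{goat grass},\ 2n=14}
  \longrightarrow \underbrace{AABBDD}_{\text{bread wheat},\ 2n=42}
```

With three full genomes aboard, bread wheat packs its kernels with far more starch and protein, but those heavy kernels cannot scatter on their own, which is why the plant came to depend entirely on human hands to sow it.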

Locations situated near rivers would allow for the irrigation of crops and provide water for the livestock. These small settlements would become the blueprint for our modern cities and the urban lifestyle we know today. By 6000 BCE, settled communities could be found in Mesopotamia, Egypt’s Nile Valley, and China’s Yellow River Valley. In addition to providing protection and centers for trade, cities could now sustain the cultural beliefs of their inhabitants and, with the invention of writing, preserve them for future generations. From these writings, we came to see how each society answered the major questions of religion: the creation of the world, what follows death, and the nature of God.

After 10,000 years of searching, we finally have a new and more complete story of creation, a story that spans nearly 14 billion years and encompasses the breadth of human knowledge. However, can that knowledge now also offer insight into the second question? Is there an ultimate creator of the universe, and what is its nature?

