I visited virtual Hawaii from a hotel in Times Square

By Adi Robertson

What does a ‘Total Recall’ vacation look like in 2014?

If virtual reality is as successful as its proponents hope, hotel chain Marriott is playing a dangerous game. As of earlier this month, it’s touting what it calls the “first-ever virtual travel experience.” Without ever leaving your home city, you can visit tourist hotspots thousands of miles away. You can have a beach all to yourself or stand at the very top of a tower without fear of falling. In other words, Marriott has decided to dip its toes into virtual reality with a demo that seems to defeat the entire purpose of hotels.

The day I step into its Times Square branch, that feels like a good thing, because it’s a terrible time to be stuck in a real-world hotel. Compared to the crystal-clear late-summer sky, the artificial lights inside are dim and yellow. They throw shadows around the hotel’s aggressively patterned carpet, its confusing strings of escalators and staircases, and the octagonal black-and-white pod that visitors are lining up to get inside — the one surrounded by black bands with the hashtag #GetTeleported. “Are you ready to get teleported?” the attendant asks. I say yes and sign one of the waivers that are gradually becoming standard procedure among businesses using VR, absolving Marriott of liability for dizziness, nausea, and any other possible side effects. Then she escorts me into the pod and fits me with a Rift headset and earphones, and I prepare for takeoff.

For a total of maybe five minutes, the travel pod takes you from this stultifying reality to a Hawaiian beach and London’s Tower 42. More specifically, it sends you to a classy virtual hotel lobby, where you fall through portals into London and Hawaii — even in VR, apparently, you need to book accommodations. While you could theoretically boot up the demo on any computer, Marriott’s pod is special because it’s “4D,” incorporating fans, synthetic scents, and a platform that tilts and rolls to give you a sense of real motion. That’s actually a big difference. You could play a beach simulator on any Oculus Rift, but the 4D pod can separate you from reality in a way that a headset alone can’t. In Hawaii, a warm, vaguely tropical smell hides the ordinary hotel scent. In London, cold wind whips your hair back. When you fall through the portal, your body actually shifts forward. It’s all obviously artificial, but it’s close enough to effectively suggest a change of scenery.

“Your brain is just a collection of signals that it gets from your sensory organs.”

Imaginary vacations were a fixture of pop culture long before the virtual reality boom. Ian Cleary, a VP for Relevent — which helped develop both the travel pod and the Game of Thrones Oculus Rift experience — says this project was inspired partly by Total Recall. “Your brain is just a collection of signals that it gets from your sensory organs. Whether those inputs are real or fake, at the end of the day it kind of doesn’t matter. If we can replicate them faithfully enough through these mechanisms, your brain believes that you went to these places and did these things.”

It sounds great, right? Step into a booth and enjoy the ocean spray on an Indonesian beach or the view from the top of the Burj Khalifa. Take a quick trip instead of having to arrange a leave of absence at work. Feel like you’ve genuinely been somewhere you could never afford to visit. Be more comfortable saving your money instead of springing for that trip to the Bahamas…

Wait, says Marriott. Not so fast.

“People have such a love for travel, there’s really no replacing that cultural experience of being in the space, meeting the people, experiencing the food,” says Marriott marketing VP Michael Dail. Cleary points out that current technology still struggles to render, say, human figures that won’t break the illusion of presence. “The technology may improve to the point someday where it can totally surpass that real experience,” he says. “I don’t think we’re quite there yet.” And, indeed, they’re not.

Marriott Teleporter

If you want to imagine VR that can replace actual travel, there are hints of it in Marriott’s synthetic world — at the very least, it provides a calming kind of sensory deprivation, a temporary escape from whatever you’ve been doing. The single existing “teleporter” is making an eight-city tour of the US until the end of the year. If VR entertainment takes off, the company could offer headsets in its rooms. If Marriott decides to keep working on virtual vacations, Dail says future experiments could take visitors to Costa Rican forests or Mt. Kilimanjaro — “people might say well, I don’t have the means or the funds to go there, but I would love to experience it through virtual travel.” He suggests that travelers could take pictures of their experience and post them on Facebook or Instagram to “brag a little,” although it’s hard to imagine a headset vacation provoking admiration and envy.

Either way, in this rudimentary state, the teleporter doesn’t so much sate your desire to go somewhere as whet your appetite. Which is, of course, just what a hotel company would want: virtual free samples for the travel industry. “I don’t ever see us getting” to purely virtual vacations, says Mike Wood, whose visual effects company Framestore also worked on the Game of Thrones project and Marriott’s teleporter. “But I see it as a way of showing someone just how breathtaking that place in Tibet or Hawaii might be, and [it’s] gonna make you think, ‘Wow, I really want to go and see that.'”

Virtual free samples for the travel industry

Because Marriott still needs physical guests, the creators of its “teleporter” are unusually likely to temper their praise for VR. Cleary says that “literally and figuratively, the sky is the limit,” but he’s still quick to point out how far we are from a perfect experience. If the project ever becomes more than a marketing stunt, the team will have to tailor it to whatever devices people actually use for virtual reality — assuming it catches on at all. This hint of pragmatism leaves room for discussing interesting present-day technology. The pod’s 4D elements, for example, were developed by sending people to stand at the site with recorders, describing everything they felt so it could be replicated later. It’s a slightly low-tech supplement to the 360-degree camera’s cold and objective gaze, and the kind of thing that’s overlooked if we focus only on an ultimate, perfect future version of VR.

Instead of the all-encompassing alternate reality of Ready Player One or Snow Crash, present-day VR is increasingly showing up as a supplementary experience, whether it’s in movie special features or product tie-ins (even Mountain Dew has an Oculus Rift promotion). Virtual travel can’t compare to a few days walking through a foreign city, eating at restaurants and meeting new people. But today’s two minutes in Hawaii could be 2020’s whirlwind trip to see the Mona Lisa or the Leaning Tower of Pisa. If you’re going to be a tourist anyways, is it so much worse to do it virtually? It’s strange to see one of the most ambitious test cases of virtual reality — an idea that’s been predicted and speculated upon for decades — filtered through this limited, larval version. In some sense, it feels like we’re seeing the cracks in the dream before it’s even really arrived. At the same time, though, it’s becoming clearer what our foreseeable future could actually hold. Technology might not be advanced enough to sell people on Pisa right now, but by 2016, I’m at least looking forward to a perfectly immersive tour of the Louvre’s broom closets.

Evolution: Why don’t we have hairier faces?

By Jason G Goldman

Men may have beards, but compared to many of our primate cousins our faces are pretty hairless – why is that? Jason G Goldman explores the enigma of our bare-faced cheeks.

Have you ever stopped to consider your face? Compared to most of the rest of the animal kingdom, the human face has at least one really peculiar feature: it’s almost completely devoid of hair. Sure, some people grow beards or mustaches, but even a full pirate’s beard would leave quite a bit of skin showing. They don’t call us “the hairless ape” for nothing. How did we come to be so bare-faced?

The reasons we lost our body hair are still debated. Some researchers think that we lost our fur to free ourselves of parasites, such as lice. This might have made us more attractive to the opposite sex; bare skin would advertise our lack of parasites. Others have suggested that we lost most of our hair to facilitate cooling as we moved from the shady forests to the hot savannah. Still others wonder whether nakedness is one of a number of juvenile traits that humans retain throughout their lifetimes; humans are thought to be, in one sense, juvenilized apes, who mature slower and live longer than our ape cousins.

Yet Mark Changizi, a neurobiologist at 2AI Labs in Boston, has an intriguing alternative explanation for why we don’t have hair on our faces, at least compared with other primates. It’s because we’re walking, talking, breathing ‘mood rings’.

Color reveal

Mood rings were a short-lived fad of the mid-1970s. The idea behind wearing one was that it would act as a sort of emotional barometer, betraying your innermost feelings to anyone who glanced at the jewellery. In truth, mood rings were little more than thermometers, designed to change color according to body temperature. But the idea – that color betrays emotion – isn’t actually all that far-fetched. It is pervasive in human culture. We become “green with envy”. We turn red when angry or embarrassed. Sadness is referred to as “the blues”. The truth may not be so far off.

Men might have beards, but compared to other mammals we're pretty bare-faced (Thinkstock)

Changizi, together with researchers Qiong Zhang and Shinsuke Shimojo, argues that our faces evolved their hairlessness to allow other members of our species to read our emotions. Indeed, primate faces and – in some cases – rumps and genitalia change colors thanks to the underlying physiology of the skin. “For highly social animals like most primates, one of the most important kinds of objects to be competent at perceiving and discriminating is other members of one’s own species,” he writes.

Most mammals, like dogs, horses, or bears, can only see blends of two colors when they look at the world. They’re called dichromats, and they can only see yellows, blues, or the greens that form when blue and yellow mix – in addition to perceiving brightness. That’s because their eyes have, in addition to the brightness-sensing rods, only two types of cones: ones sensitive to short or long wavelengths of light. But humans and some other primates are trichromats. We have a third type of cone, sensitive to medium wavelengths of light. Having that third cone means that we can also detect colors along a red-green continuum. (A few lucky people are tetrachromats, which means they can see even more shades).

Blood flow

But there’s something odd about the three types of cones that trichromats have: they’re not evenly spaced out. It just so happens that this odd assortment of cones allows our eyes to perceive properties of the blood circling through our bodies just beneath the skin: how saturated with oxygen the haemoglobin is, and how concentrated the haemoglobin is. Changes in those two variables result in predictable changes in skin colouration.

It’s actually quite remarkable how many colors human skin can be. We’re used to thinking of skin as being white, black, or brown. But those are just a baseline. Darker faces still blush, after all, something that even Darwin noticed. Skin appears redder as haemoglobin becomes more and more oxygenated. Reductions in oxygen saturation turn the skin green (which is just how veins carrying de-oxygenated blood back to the heart look). On the other hand, a greater accumulation of blood in a certain area turns the skin somewhat blue. Just like a bruise. And a reduction in blood concentration does the opposite, making the skin appear yellow. Which, indeed, is how we describe somebody who isn’t looking very healthy. “Color is sufficiently suggestive of emotion,” writes Changizi, “that cartoons often use color on a face to indicate emotional state.” And it’s something that even children can recognize effortlessly.

Other primates with bare faces tend to have vision like ours

Dichromats – the dogs, bunnies, and bears of the world, along with certain primates – can only perceive changes along the blue-yellow dimension, allowing them to notice changes only in blood concentration. In some ways, that’s a useful skill. It’s probably best to avoid someone who looks yellow, so as not to catch whatever disease or sickness they’re trying to fight off, or to escape being near someone who might imminently vomit. On the other hand, a bruise betrays injury, marking out someone who will be easier to beat in a fight over resources.

Here’s the best part: if it’s true that facial bare skin evolved to allow for color signalling, then other trichromatic primates should have visible skin too. To see if his hypothesis held up, Changizi looked at 97 different primate species. Sure enough, he found that monochromatic and dichromatic primates are furry, while those, including us, who have three types of cones have more visible facial skin.

Even if bare skin was initially selected for something other than color signalling, it is likely the case that both color vision and visible skin became evolutionarily entangled, at least among primates, argues Changizi.

So next time your friends ask you what you’re feeling, just tell them to look at your skin. They should be able to figure it out: after all, they have millions of years of evolution on their side.

Where do witches come from?

by Alastair Sooke

Images of alluring young witches and hideous hags have been around for centuries – but what do they mean? Alastair Sooke investigates.

Ask any Western child to draw a witch, and the chances are that he or she will come up with something familiar: most likely a hook-nosed hag wearing a pointy hat, riding a broomstick or stirring a cauldron. But where did this image come from? The answer is more arresting and complex than you might think, as I discovered last week when I visited Witches and Wicked Bodies, a new exhibition at the British Museum in London that explores the iconography of witchcraft.

Witches have a long and elaborate history. Their forerunners appear in the Bible, in the story of King Saul consulting the so-called Witch of Endor. They also crop up in the classical era in the form of winged harpies and screech-owl-like “strixes” – frightening flying creatures that fed on the flesh of babies.

Circe, the enchantress from Greek mythology, was a sort of witch, able to transform her enemies into swine. So was her niece Medea. The ancient world, then, was responsible for establishing a number of tropes that later centuries would come to associate with witches.

The Three Weird Sisters from Macbeth, 1785 (The Trustees of the British Museum)

Yet it wasn’t until the early Renaissance that our modern perception of the witch was truly formed. And one man of the period arguably did more than any other to define the way that we still imagine witches today: the German painter and printmaker Albrecht Dürer.

Double trouble

In a pair of hugely influential engravings, Dürer determined what would become the dual stereotype of a witch’s appearance. On the one hand, as in The Four Witches (1497), she could be young, nubile and lissom – her physical charms capable of enthralling men. On the other, as in Witch Riding Backwards on a Goat (c 1500), she could be old and hideous.

Dürer’s influential engravings portrayed witches as young and nubile or as old crones (Albrecht Dürer)

The latter print presents a naked crone sitting on top of a horned goat, a symbol of the devil. She has withered, drooping dugs for breasts, her mouth is open as she shrieks spells and imprecations, and her wild, wind-blasted hair streams unnaturally in the direction of her travel (a sign of her magical powers). She is even clutching a broomstick. Here is the matriarch of the witches that we find in popular culture today.

For art historians, though, the interesting question is what provided Renaissance artists with the model for this appalling vision. One theory is that Dürer and his contemporaries were inspired by the personification of Envy as conceived by the Italian artist Andrea Mantegna (c 1431-1506) in his engraving Battle of the Sea Gods.

“Mantegna’s figure of Envy formed a kind of call for the Renaissance of the witch as a hideous old hag,” explains the artist and writer Deanna Petherbridge, who has co-curated the exhibition at the British Museum. “Envy was emaciated, her breasts were no longer good, which is why she was jealous of women, and she attacked babies and ate them. She often had snakes for hair.”

Another of Dürer’s engravings shows a witch riding backwards on a goat, with four putti (The Trustees of the British Museum)

A good example of this Envy-type of witch can be seen in an extraordinarily intense Italian print known as Lo Stregozzo (The Witch’s Procession) (c 1520). Here, a malevolent witch with open mouth, hair in turmoil and desiccated dugs clutches a steaming pot (or cauldron), and rides a fantastical, monstrous skeleton. Her right hand reaches for the head of a baby from the heap of infants at her feet.

This print was produced during the ‘golden age’ of witchcraft imagery: the tumultuous 16th and 17th centuries, when vicious witch trials convulsed Europe (the peak of the witch-hunts lasted from 1550 to 1630). “Across Europe, there was the Reformation and Counter-Reformation, the Thirty Years’ War, fantastic poverty and social change,” says Petherbridge. “Even King James in his text Daemonologie [1597] was asking: why was there such a proliferation of witches? Everybody assumed it was because the world had got so foul that it was coming to an end.”

As a result there was an outpouring of brutally misogynistic witchcraft imagery, with artists taking advantage of the invention of the printing press to disseminate material rapidly and widely. “Witchcraft is closely allied to the print revolution,” Petherbridge explains. Many of these prints, such as the powerful colour woodcut Witches’ Sabbath (1510) by Dürer’s pupil Hans Baldung Grien, can be seen in the British Museum’s exhibition.

By the 18th Century, though, witches were no longer considered a threat. Instead they were understood as the superstitious imaginings of peasants. Still, that didn’t stop great artists such as Goya from depicting them.

Los Caprichos, Goya’s collection of 80 capricious (or whimsical) etchings from 1799, uses witches as well as goblins, demons and monsters as vehicles for satire. “Goya uses witchcraft metaphorically to point out the evils of society,” says Petherbridge. “His prints are actually about social things: greed, war, the corruption of the clergy.”

Broom with a view

Goya did not believe in the literal reality of witches, but his prints are still among the most potent images of witchcraft ever made. Plate 68 of Los Caprichos is especially memorable: a wizened hag teaches an attractive younger witch how to fly a broomstick. Both are naked, and the print was surely meant to be salacious: the Spanish ‘volar’ (to fly) is slang for having an orgasm.

Around the same time, there was a vogue among artists working in England for depicting theatrical scenes of witchcraft. The Swiss-born artist Henry Fuseli, for instance, made several versions of the famous moment when Macbeth meets the three witches for the first time on the heath.

By now, though, the art of witchcraft was in decline. It lacked the strange imaginative force that had animated the genre in earlier centuries. In the 19th Century, the Pre-Raphaelites and the Symbolists were both drawn to the figure of the witch, whom they recast as a femme fatale. But their sinister seductresses arguably belong more to the realm of sexual fantasy than high art.

The one constant throughout the history of the art of witchcraft is misogyny. As a woman, how does this make Petherbridge feel? “At the beginning when I was looking at these images, I was quite distressed because they are so ageist,” she says. “Of course, now I’ve stopped being shocked by them, and I think that they are saved by their excess, satire and invention. Artists were often drawn to these scenes because they offered drama. They were free to spread their wings and come up with all kinds of bizarre imagery. Yes, these scenes represent the demonisation of women. But often they are keenly linked to social critique. Witches are the scapegoats on which the evil of society is projected.”

The Mysterious Case of the 113-Year-Old Light Bulb

by Zachary Crockett

In the United States, the average incandescent light bulb (that is, a bulb heated with a wire filament) has a lifespan of about 1,000 to 2,000 hours. Light emitting diode (LED) bulbs, which are increasingly replacing incandescent bulbs, are said to last between 25,000 and 50,000 hours — an incredible bump.
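For a sense of scale, those rated lifespans translate into years of everyday service. A quick back-of-the-envelope sketch in Python — the three-hours-per-day usage figure is purely an illustrative assumption, not from the article:

```python
# Convert a bulb's rated lifespan (hours) into years of everyday service.
# The 3-hours-per-day figure is an illustrative assumption.
HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365

def years_of_service(rated_hours):
    """Years a bulb lasts at HOURS_PER_DAY of use per day."""
    return rated_hours / (HOURS_PER_DAY * DAYS_PER_YEAR)

print(round(years_of_service(1_500), 1))   # mid-range incandescent rating: ~1.4 years
print(round(years_of_service(25_000), 1))  # low-end LED rating: ~22.8 years
```

By that rough measure, even the low-end LED estimate works out to more than two decades of household use.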

The Centennial Light, a bulb that’s reportedly been burning for 113 years

But dangling from the ceiling of a California firehouse is a bulb that’s burned for 989,000 hours — nearly 113 years. Since its first installation in 1901, it has rarely been turned off, has outlived every firefighter from the era, and has been proclaimed the “Eternal Light” by General Electric experts and physicists around the world.
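The arithmetic behind that headline figure checks out. A one-line sanity check, assuming an average year length of 365.25 days:

```python
# Sanity check: 989,000 hours expressed in years.
HOURS_PER_YEAR = 365.25 * 24  # average year, leap days included

years_burning = 989_000 / HOURS_PER_YEAR
print(round(years_burning, 1))  # ~112.8, i.e. "nearly 113 years"
```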

Tracing the origins of the bulb — known as the Centennial Light — raises questions as to whether it is a miracle of physics, or a sign that new bulbs are weaker. Its longevity still remains a mystery.

A Brief History of the Light Bulb

While it is generally reported that Thomas Edison “invented” the light bulb in 1879, a long lineage of innovators preceded him.

In 1802, British chemist Humphry Davy produced incandescent light by passing current through thin strips of platinum; over the next 75 years, his experiments would be the basis of many efforts to produce long-lasting, bright light through heated filaments. Scottish inventor James Bowman Lindsay boasted, in 1835, of his new light that allowed him to “read a book at a distance of one and half feet,” but soon after abandoned his efforts to focus on wireless telegraphy. Five years later, a team of British scientists toyed with heating platinum filaments inside a vacuum tube. Though the high price of platinum made the device inaccessible and difficult to scale, this design formed the basis for the first incandescent lamp patent in 1841.

With his integration of carbon filaments in 1845, American inventor John W. Starr arguably could have been credited as the light bulb’s inventor, but he died of tuberculosis the following year, and his colleagues were unable to pursue the idea without his knowledge and expertise. A few years later, British physicist Joseph Swan utilized Starr’s advancements to produce a working bulb, and, in 1878, became the first man in the world to brighten his home with bulbs.

Meanwhile, in America, Thomas Edison worked on improving carbon filaments. By 1880, through the utilization of a higher vacuum and the development of an entire integrated system of electric lighting, he improved his bulb’s life to 1,200 hours and began producing the invention at a rate of 130,000 bulbs per year.

In the midst of this innovation, the man who’d build the world’s longest-lasting light bulb was born.

The Shelby Electric Company

Adolphe Chaillet, c. 1890

Adolphe Chaillet was bred to make exceptional light bulbs. Born in 1867, Chaillet was constantly exposed to the burgeoning light industry in Paris, France. By age 11, he began accompanying his father, a Swedish immigrant and owner of a small light bulb company, to work. He learned quickly, garnered an interest in physics, and went on to graduate from both German and French science academies. In 1896, after spending some time designing filaments at a large German energy company, Chaillet moved to the United States.

Chaillet briefly worked for General Electric, then, riding on his prestige as a genius electrician, secured $100,000 (about $2.75 million in 2014 dollars) from investors and opened his own light bulb factory, Shelby Electric Company. While his advancements in filament technology were well-known, Chaillet still had to prove to the American public that his bulbs were the brightest and longest-lasting. In a risky maneuver, he staged a “forced life” test before the public: The leading light bulbs on the market were placed side-by-side with his, and burned at a gradually increased voltage. An 1897 volume of Western Electrician recounts what happened next:

“Lamp after lamp of various makes burned out and exploded until the laboratory was lighted alone by the Shelby lamp — not one of the Shelby lamps having been visibly injured by the extreme severity of this conclusive test.”

Chaillet’s original patent

The bulb’s brilliance was attributed to Chaillet’s patented coiled carbon filament, as recounted in The Electrical Review (1902):

“The inventor’s idea, practically stated, is to flatten the coil, and also flatten the end of the globe or bulb so that the greatest intensity of light is thrown downwardly. The filament is coiled in a form which presents a loop that is elongated transversely of the axis of the lamp, or in other words, the loops are substantially elliptical, the major axis being transverse to the longitudinal axis of the lamp. The globe is likewise flattened at its tip end so that the glass wall is substantially parallel with the lower lines of the filament loops when the lamp is suspended from above.”

Citing these advancements, Shelby claimed that its bulbs lasted 30% longer and burned 20% brighter than any other lamp in the world. The company experienced explosive success: According to Western Electrician, they’d “received so many orders by the first of March [1897], that it was necessary to begin running nights and to increase the size of the factory.” By the end of the year, output doubled from 2,000 to 4,000 lamps per day, and “the difference in favor of Shelby lamps was so apparent that no doubt was left in the minds of even the most skeptical.”

Over the next decade, Shelby continued to roll out new products, but as the light bulb market expanded and new technologies emerged (tungsten filaments), the company found itself unable to make the massive monetary investment required to compete. In 1914, they were bought out by General Electric and Shelby bulbs were discontinued.

The Centennial Light

Seventy-five years later, in 1972, a fire marshal in Livermore, California informed a local paper of an oddity: a naked Shelby light bulb hanging from the ceiling of his station had been burning continuously for decades. The bulb had long been a legend in the firehouse, but nobody knew for certain how long it had been burning, or where it came from. Mike Dunstan, a young reporter with the Tri-Valley Herald, began to investigate — and what he found was truly spectacular.

Tracing the bulb’s origins through dozens of oral narratives and written histories, Dunstan determined it had been purchased by Dennis Bernal of the Livermore Power and Water Co. (the city’s first power company) sometime in the late 1890s, then donated to the city’s fire department in 1901, when Bernal sold the company. As only 3% of American homes were lit by electricity at the time, the Shelby bulb was a hot commodity.

In its early life, the bulb, known as the “Centennial Light,” was moved around several times: It hung in a hose cart for a few months, then, after a brief stint in a garage and City Hall, it was secured at Livermore’s fire station. “It was left on 24 hours-a-day to break up the darkness so the volunteers could find their way,” then-Fire Chief Jack Baird told Dunstan. “It’s part of another era in the city’s past [and] it’s served its purpose well.”

Though Baird acknowledged that it had once been turned off for “about a week when President Roosevelt’s WPA people remodeled the fire house back in the 30s,” Guinness World Records confirmed that the hand-blown 30-watt bulb, at 71 years old, was “the oldest burning bulb in the world.” A slew of press followed, and the bulb was featured in Ripley’s Believe It or Not! and on major news networks.


Aside from the 1930s fire house remodel, the bulb has only lost power a few times — most notably in 1976, when it was moved to Livermore’s new Station #6. Accompanied by a “full police and fire truck escort,” the bulb arrived to a large crowd eager to see it regain power, but, as Deputy Fire Chief Tom Brandall recalled, “there was a little scare”:

“We got to new location and the city electrician installed the light bulb and made connection. It took about 22-23 min, and [the bulb] didn’t come back on. The crowd gasped. The city electrician grabbed the switch and jiggled it; it went on!”

Once settled, the bulb was placed under video surveillance to ensure it was alive at all hours; in subsequent years, a live “BulbCam” was put online. Last year, the bulb’s groupies (of which there are nearly 9,000 on Facebook) received another scare when it lost light:

The Centennial Light’s Facebook page

At first it was suspected that the light had finally met its demise, but after nine and a half hours, it was discovered that the bulb’s uninterrupted power supply had failed; once the power supply was bypassed, the bulb’s light returned. The 113-year-old bulb had outlived its power supply — just as it had outlived three surveillance cameras.

Today, the bulb still shines, though, as one retired fire volunteer once put it, “it don’t give much light” (only about 4 watts). Owning a frail piece of history comes with great responsibility: Livermore firefighters treat the little bulb like a porcelain doll. “Nobody wants that darn bulb to go out on their watch,” former fire chief Stewart Gary once said. “If that thing goes out while I’m still chief it will be a career’s worth of bad luck.”

“They Don’t Make ‘Em Like They Used To”

Everyone from MythBusters to NPR has speculated on the reasons for the Shelby bulb’s longevity. The answer, in short, is that it remains a mystery — Chaillet’s patent left much of his process unexplained.

Some, like UC Berkeley electrical engineering professor David Tse, dismiss the bulb’s legitimacy outright. “It’s not possible,” he told the Chronicle in 2011. “It’s a prank.” Others, like engineering student Henry Slonsky, insist it’s likely due to the fact that things were once made with more care. “Back then,” he says, “they made everything way over the top.”

In 2007, Annapolis physics professor Debora M. Katz purchased an old Shelby bulb of the same vintage and make as the Centennial Light and conducted a series of experiments on it to determine how it differs from modern bulbs. She reported her findings:

“I found the width of the filament. I compared it to the width of a modern bulb’s filament. It turns out that a modern bulb’s filament is a coil, of about 0.08 mm diameter, made up of a coiled wire about 0.01 mm thick. I didn’t know that until I looked under a microscope. The width of the Shelby bulb’s 100-year-old filament is about the same as the width of the coiled modern bulb’s filament, 0.08 mm.”

While Katz’s findings were inconclusive, she speculates that the Shelby bulb’s filament — eight times thicker than that of a modern bulb — may be integral to its longevity. Modern bulbs, she explains, use thinner tungsten filaments that put out more light (40 to 200 watts), burn hotter, and are therefore taxed more heavily than older bulbs like the Shelby. “You can think of it as sort of an animal with a low metabolism,” she reported to the Centennial Light’s committee. “It’s giving us less energy per time, so it can keep on going longer.” Katz also adds that the bulb’s age could be, in part, attributed to the fact that it hasn’t been turned off and on a whole lot — a process that is harder on a bulb than letting it run continuously (the filament needs to reheat itself each time, much like a car’s engine on a cold start).
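Katz’s “low metabolism” point can be sanity-checked with a few lines of arithmetic. This is only an illustrative sketch: the 0.08 mm and 0.01 mm diameters and the 4-watt figure come from the article itself, and treating both filaments as simple round wires (ignoring the coil geometry) is an assumption.

```python
import math

def cross_section_area(diameter_mm: float) -> float:
    """Cross-sectional area of a round wire, in mm^2."""
    return math.pi * (diameter_mm / 2) ** 2

# Figures from the article: the Shelby filament is ~0.08 mm thick,
# while a modern coiled filament is wound from ~0.01 mm wire.
shelby_area = cross_section_area(0.08)
modern_wire_area = cross_section_area(0.01)

print(f"Area ratio (Shelby vs. modern wire): {shelby_area / modern_wire_area:.0f}x")
# → 64x: a much beefier conductor, stressed far less per unit of material.

# Power dissipated: ~4 W for the Centennial Light vs. 40-200 W for modern bulbs.
print(f"Power vs. a 60 W bulb: {4 / 60:.1%}")
```

A filament with roughly 64 times the cross-sectional area, dissipating a few percent of the power, runs far cooler — consistent with Katz’s low-metabolism analogy, though it doesn’t prove that thickness alone explains the bulb’s longevity.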

The Shelby bulb’s properties, from Felgar’s paper

Justin Felgar, one of Katz’s students, explored the bulb further and published his findings in the 2010 paper “The Centennial Light Filament.” Felgar found that the hotter the Shelby got, the more electricity flowed through it — the opposite of what happens with modern tungsten filaments. To determine the Shelby filament’s exact makeup, Felgar says, it would be necessary to “tear one up” and run it through the Naval Academy’s particle accelerator — but the process is costly and has yet to be undertaken.

Ultimately, Katz and her colleagues remain uncertain. “I thought for sure all the physics must’ve been worked out,” she says, “but perhaps there’s just some fluke with that particular [bulb].” Livermore’s ex-deputy fire chief agrees. “The reality is it’s probably just a freak of nature,” he told NPR in 2003, “just one in a million light bulbs that’s just going to keep going and going.”

The Lightbulb Cartel

Today, the average incandescent bulb lasts about 1,500 hours; even top-of-the-line LED bulbs, at $25 each, last only about 30,000 hours. Whatever the Centennial Bulb’s secret formula, it has burned for 113 years — nearly 1 million hours. So where did we go wrong with light bulb technology?
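The “nearly 1 million hours” figure checks out with simple arithmetic (a quick sanity check, not from the article — it assumes continuous burning and ignores the bulb’s brief outages):

```python
# 113 years of continuous burning, in hours (365.25 days/year averages in leap years)
centennial_hours = 113 * 365.25 * 24
print(f"{centennial_hours:,.0f} hours")  # → 990,558 hours, i.e. nearly 1 million

# How many typical bulbs would burn out over the same span?
print(f"Incandescent (1,500 h): {centennial_hours / 1_500:.0f} replacements")
print(f"LED (30,000 h): {centennial_hours / 30_000:.0f} replacements")
```

By this rough count, the Centennial Light has outlasted some 660 ordinary incandescents, or about 33 top-of-the-line LEDs, back to back.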

Light bulb companies like Shelby once prided themselves on longevity — so much so that the durability of their products was the central focus of marketing campaigns. But by the mid-1920s, business attitudes began to shift, and a new rhetoric prevailed: “A product that refuses to wear out is a tragedy of business.” This line of thought, termed “planned obsolescence,” endorsed intentionally shortening a product’s lifespan to entice swifter replacement.

In 1921, multinational lighting manufacturer Osram formed the “Internationale Glühlampen Preisvereinigung” (International Association of Light Bulb Prices) to regulate prices and limit competition. General Electric soon reacted by founding the “International General Electric Company” in Paris. Together, these organizations traded patents and sales information to gain a stranglehold on the light bulb market.

In 1924, Osram, Philips, General Electric, and other major electric companies met and formed the Phoebus Cartel under the public guise of cooperating to standardize light bulbs. Instead, they purportedly began to engage in planned obsolescence. To achieve this, the companies agreed to cap the life expectancy of light bulbs at 1,000 hours — less than the 1,200 hours Edison’s bulbs had achieved decades before; any company that produced a bulb exceeding 1,000 hours would be fined.

Until it disbanded during World War II, the cartel supposedly halted research, stalling the development of longer-lasting light bulbs for nearly twenty years.


Whether planned obsolescence is still on the agenda of light bulb manufacturers today is debatable, and there is no definitive proof. In any case, incandescent bulbs are being phased out worldwide: since Brazil and Venezuela began the trend in 2005, many countries have followed suit (the European Union, Switzerland, and Australia in 2009; Argentina and Russia in 2012; the United States, Canada, Mexico, Malaysia, and South Korea in 2014).

As more efficient technologies have surfaced (halogen, LED, compact fluorescent lights, magnetic induction lights), the old filament-based bulbs have become a relic of the past. But perched up in the white ceiling of Livermore’s Station #6, the granddaddy of old-school bulbs is as relevant as ever — and refuses to bite the dust.

The Lessons of ‘Lost’: Understanding the Most Important Network Show of the Past 10 Years

by Andy Greenwald

Lost premiered 10 years ago this week. It ended four and a half years ago. And I still miss it like crazy.

This doesn’t seem to be a popular opinion these days, as backlash to the underwhelming series finale seems to have overtaken the memory of everything that came before. But backlash has overtaken just about everything in 2014, and the effect on television has been profound. Networks today are more hidebound, creators more skittish, and broadcast TV as a whole more distressingly safe. Yes, the expansion of viewing options and the subsequent diminishing of audiences have played the leading roles in this retrenchment, but it’s impossible to discount the added impact of our rapid-response culture. The big four networks were never particularly good at taking chances, but there was a dedicated, if occasionally naive, desire to give people something they might adore. Now networks twist themselves into knots coming up with milquetoast servings of what they’re pretty sure people won’t hate. It’s how you end up with 66 annual hours about Naval crime investigation and a half-dozen sitcoms about love that, taken together, are barely worthy of being liked. Want to know what white bread dipped in milk tastes like? I’m pretty sure there’s a new network show just for you.

Reconsidering Lost1 after a decade ought to feel like an exercise in gauzy nostalgia. But it doesn’t. In fact, it’s the opposite. Rewatching the premiere the other night, I was floored by how exceptional it is, especially in comparison to the middling dreck I was sifting through just a week ago. The fearlessness of those first two hours, directed by J.J. Abrams at a reported cost of $14 million, is intoxicating. It’s instantly more vibrant, more alive, more modern than anything else currently on broadcast air. From its opening moments, in which Jack opens his eyes to a strange new reality, Lost makes no effort to coddle, to entice. Rather, it intentionally disorients. When Jack runs through the jungle — always toward the danger rather than away from it — he dares the audience to sprint right alongside him. Though the show was years away from frustrating anyone with its mysteries, this headlong dash into the unknown was prophetic. Nothing would stop Lost from being reckless — not with story, not with ideas, not with time, and certainly not with emotion. Taking the risk, and taking the journey, was what made the show worthwhile, regardless of where things ended up. To paraphrase a great man, sometimes you eat the polar bear, and sometimes the polar bear eats you.

What I remember most about the night Lost premiered was the sheer magnitude of the spectacle: a man sucked into a dying 747 turbine, an explosion of smoke and fire, the (unseen!) smoke monster smashing Greg Grunberg like a grape. But what lingers now is the attention and care given to the quieter scenes: Jack gazing into the impossibly blue Pacific just before the screaming of his fellow survivors reaches his ears; Charlie karaokeing his own hit song; a petrified Kate counting to five. The brilliance of the series lay in these moments, the ellipses between the exclamation marks. Even as entire palm trees gave way to what sounded like a Godzilla-size taxi meter, the most intriguing aspects of Lost were right in front of us. Where did all these survivors come from? What was the deal with the angry Korean couple, the gentle giant, the creep with the citrus smile? I didn’t just want to know where they were. I wanted to know who they were. It’s a seemingly simple distinction, but it’s one that TV producers have been getting wrong with staggering consistency ever since.

Don’t misunderstand: This is not meant as an apologia for Lost’s ending. I still seethe over the shoddy disposal of core characters like Sun, Jin, and Sayid, I’ve never stopped shaking my head over the Temple (I bet poor John Hawkes hasn’t, either), and the memory of the final gathering in the church still leaves me stunned. How could we slip so far from the gonzo poetry of frozen donkey wheels to the high school notebook curlicues of soul mates ascending to heaven? It was the sight of a towering soufflé collapsing, at the final moment, into mushy, unbaked batter.2 But once I began rewatching the series from the beginning, the lump of disappointment I felt returning to my chest took a different form. As the French Lady’s voice crackled on the radio, as all the Desmonds, Faradays, and Juliets still to come beckoned, I realized that I would absolutely submit to the full six seasons yet again, even with the knowledge that the final step would be sideways instead of satisfying. The frustration had changed. I wasn’t upset with what Lost became. What really rankles is that nothing ever took its place.

Thanks to a quirk of timing and fate that would give even that old meddler Jacob pause, Lost managed to be both the first series to demonstrate the potential of a broadcast network in the digital age and the last. Though it was stuffed with sci-fi nerdery and smothered in a thick Bolognese of strangeness, the show was a phenomenon from the moment it debuted (to an audience of 18.65 million) all the way up through that last walk into the light (13.57 million). More than gaudy ratings, though, what Lost inspired was a very specific, highly contagious kind of mania. It arrived at a moment when Wikipedia-size wormholes were available to every viewer, when fan engagement migrated from the fringes to the very center of mainstream conversation. Excited by the sight of three-toed statues, inflamed by the mere mention of 19th-century slaving ships, otherwise sane individuals found themselves spelunking merrily into the deepest and darkest of Internet caves. Even today I can speak extemporaneously on the significance of Tawaret, the Egyptian goddess of fertility, or the philosophical musings of John Locke.3 I can prattle on endlessly about constants and variables. I know what an outrigger is. You could say that this intellectual and thematic sprawl didn’t add up to much, but you’d be wrong. What do you get by gobbling up everything under the sun? An entire, crazy world.

Thanks to this voraciousness, Lost bridged the Internet divide between the time Before Twitter (B.T.) and After Twitter (F.M.L.). It helped to normalize the idea that television can be watched intimately with millions of people not currently seated on your couch and that episodes don’t end when the credits roll — they stretch and bleed into the rest of the week through a dizzying scrim of chat windows, status updates, and ill-advised Googling. Over at Entertainment Weekly, critic Jeff Jensen gave in to the vapors so entirely that he single-handedly changed my understanding of what a TV review could be. Sure, Alan Sepinwall and others were already recapping. But Jensen used each episode as a trampoline for his wildest theories and infectious, boundless enthusiasm. In his virtuosic morning-after ramblings, Doc Jensen wasn’t just commenting on what the show was. He was delighting in all the incredible things it could be. The truth is, Lost diehards — and I count myself among them — would never have been satisfied with the show’s ending, no matter what form it took, because it pulled the plug on our endless, joyous speculating. If we’re being honest, none of us ever wanted to be found.

Lost was more than a TV show. It was a sort of shared madness, a delirium that ranged far beyond Wednesday nights at 10. And, as such, it should have heralded a new golden age for the graying networks. During Lost’s reign, cable channels were still focused on the highbrow character dramas that had earned them buckets full of press and prestige — not to mention ratings that threatened to catapult them into the biggest of leagues. (The Walking Dead premiered five months after Lost went dark. Game of Thrones arrived the following April. Together, they would push cable into an entirely different sport.) Then, as now, networks needed to operate on a larger playing field both to differentiate themselves from their more nimble cable competitors and to sustain their far more demanding revenue model.4 A wholly original multimedia supernova like Lost isn’t easily replicated. But what’s most disheartening today is to see how little the big four seem inclined to try. After a few years packed with soulless cover versions like The Event and The Nine (more on those below), network executives threw up their hands and moved on: Lost was sui generis. Like the wreckage of Oceanic 815, its particular blend of wild art and savvy commerce could never be located again. To look at the broadcast grid in the fall of 2014 is to see abject surrender; outside of a few hardy survivors,5 every network’s drama slate is a vast and exhausting sea of tired procedurals, preexisting properties, and unambiguous crap. Unless they’re plotting musicals, no one thinks big anymore. Honestly, there’s little evidence anyone is thinking at all. How could it be that such a hugely important show cast no shadow?6 Lost was meant to be an antidote to network TV’s slow descent into redundancy. Instead, it helped hasten the patient’s demise.

But if Lost taught me anything, it’s that time travel isn’t just possible, it’s downright necessary. Only by truly examining where we’ve been can we make the adjustments to get us where we need to be. Ten years removed from the pilot, it’s clear that the television industry learned all the wrong lessons from Lost’s success. (Among them: Josh Holloway can be a leading man.) With that in mind, here’s my list of six essential things that Lost could still teach the broadcast networks. Don’t fight it. It’s time to go back.


1. Characters First, Concept Second

This would seem like a no-brainer, but then you remember FlashForward. That high-concept, low-IQ failure from 2009 was only one of a whole flock of series green-lit in Lost’s wake, nearly all of which fundamentally misunderstood the show’s appeal. It was never about the island; it was always about the people. Yet again and again the networks tried to reverse-engineer a hit by coming up with some ludicrously unsustainable conceit — in FlashForward it involved a mass hallucination of a day six months in the future; IRL the show was canceled before even getting there — and then attempting to fill it with compelling characters. As if an audience could ever care about the fate of a world populated solely by cardboard cutouts.

Better to take a page from Lost’s actual playbook and remember that the development process for the show was so ridiculously sped up that Abrams and Lindelof started casting before they even had fixed ideas about the characters the actors would be playing. This could have been disastrous but proved to be liberating. When Yunjin Kim came in to read for Kate, the producers were so thrilled that they created a new role just for her. This, in turn, led to the creation of Daniel Dae Kim’s Jin. Dominic Monaghan was so impressive in his audition for Sawyer (then described as an Armani-clad big-city con man) that they rewrote the role of Charlie (intended to be an aging, over-the-hill rocker) to better suit him. Sayid wasn’t in the original outline at all. After meeting Naveen Andrews, Abrams and Lindelof crashed him into the cast.

In the best writing, action emerges from character, and not the other way around. If you can create compelling protagonists, you can do almost anything with them without breaking faith with the audience. They can jump through time, they can jump into bed with each other, they can even be locked in circus cages and fed fish biscuits. A strong character forgives all manner of foolish plot decisions made along the way. Don’t start with the mystery box. Start with the woman trapped inside it and build out from there.

2. Embrace the Division of Labor

Look, I’m all for the auteur theory in practice. Nearly all of the great dramas of the past decade-plus have been the passion projects of lone, particularly impassioned creators: your Matthew Weiners, your Vince Gilligans, your David Simons. But network TV doesn’t operate like this and never has. Highly paid executives are deeply suspicious of overly empowered writers — one ill-advised flight of fancy can crash an entire network’s bottom line — and the demands of the long network season require more than one set of hands on the controls. Rather than buck against this humbling reality, creators ought to embrace it and look to Lost as a reason why.

The premise of the show was conceived in 2003 by ABC’s then president, Lloyd Braun, while he was vacationing in Hawaii. After Jeffrey Lieber, the first writer brought onboard to take Braun’s Mad Lib of an idea — Cast Away meets Survivor? — failed to excite the network, the scraps of his idea were brought to J.J. Abrams, who tore it up and started again. (Don’t cry for Lieber: Though none of his central ideas was used, the Writers Guild decided that he still deserved credit for his work on the project. That “created by” credit had to have been worth millions and millions of dollars.) I don’t know if such strong, distinctive work usually comes out of creativity by committee. But what is clear is that the lack of total ownership over the idea allowed Abrams and Lindelof to work fast and loose in a way they might not otherwise have managed. (For proof of this, try watching Lindelof’s The Leftovers and feel as the words “fast” and “loose” turn to ashes in your cigarette-stained mouth.) Since Lost didn’t begin as their baby, there was no need to coddle it. Spurred by their audacity, Lost grew up hard, fast, and very, very odd. (Maybe cry a little for Braun: He was fired a month before Lost premiered, partly because he had sunk so much money into such a nutty pilot.)

I’m not saying that the Jason Katimses and Shonda Rhimeses and Kyle Killens of the world, all the noble scribblers dedicated to working under the (extremely lucrative!) broadcast yoke, ought to give up their personal projects and start taking their marching orders from the men upstairs. But I am saying it might not be so bad for them to try bouncing their creativity off something other than their own reflections now and again. Does NBC’s Bob Greenblatt have a certain type of show he’s hankering to do? Is the ghost of Kevin Reilly still whispering buzzwords from Fox’s air-conditioning vents? I’m not entirely sure what Paul Lee thinks about, but I can promise you that, like Lloyd Braun, he has vacationed in Hawaii. Sometimes the best way to reach that untapped potential is to give someone else a chance to do the tapping.

3. Don’t Self-Segregate

Here’s the beauty of Lost: There are polar bears, flashbacks, bursts of electromagnetism, and a giant, tree-crashing, human-smashing monster in the pilot. Within a year, there would be a hippie cult, a torture room, and a set of magical numbers that appears to control the universe. By the end of Season 5, a time-traveling fertility doctor used a giant stone to bash a hydrogen bomb until it exploded. The end of the show hinged on a pair of godlike brothers squabbling over an immortal deckhand and which one of them Allison Janney loved more.

And yet during all of this, Lost carried itself like a fully mainstream entertainment. Even midway through the third season, after the show secured its end date and committed more fully to the genre looniness that had been lurking beneath the surface, Lindelof and his fellow showrunner, Carlton Cuse, never stopped projecting to the furthest reaches of the peanut gallery. Lost was a big, bold show that always sought the largest possible audience. Its most extreme Arthur C. Clarke indulgences were always leavened by a generous dash of Danielle Steel. I’m tired of geek-minded shows exiling themselves to the margins the way this year’s Constantine does: it will air Friday nights, where its audience of exactly who you’d expect will be waiting to embrace it with open arms. Lost proved that there were viewers out there willing to accept all kinds of extreme stories as long as they were well told. Unlike the highly specialized series on cable, broadcast shows should always aim for the biggest possible tent. And creators should remember that tents that large require equally enormous stakes.

4. Don’t Rush

Here’s the beauty of the first hour of Lost: We catch a glimpse of every single major cast member. But we really meet only three of them: Jack, Kate, and Charlie. That’s it. Everyone else is forced to wait their turn.

What a luxury it is to bask in the not-knowing! And, also, what an anachronism. Every pilot I saw this fall felt the need to present every single character in the first few minutes, often in the most honking, unsubtle way possible. Forever’s chatty protagonist introduces himself through clunky voice-over. State of Affairs’s heroine does the same by womansplaining herself to a shrink (“Total slob in my personal life, total sniper in my professional one”). On Madam Secretary, the president of the United States drives to a horse farm to tell Téa Leoni that she doesn’t just think outside the box, she doesn’t even know there is a box. (Tell that to John Locke!) Networks are so fearful that a viewer might become confused that they’ve encouraged writers to spend their time connecting dots, not developing story. The truth is, it’s always far better to reveal personality through behavior — in the pilot, Jack doesn’t tell us who he is, he shows us by rushing silently from casualty to casualty while fiery debris rains down all around him — and to unveil people gradually. Episodes can be binged but great characters ought to be savored.


5. Never, Ever Stop

Congratulations! If you’ve made it this far, I’m assuming you, along with the now-fired president of your network, have created a stunning, wide-screen drama that has captivated the nation’s imagination. This is no small achievement and you ought to take a moment to bask in it. I’m sure your children will appreciate their private school educations even though you won’t be seeing them outside of holidays for the next seven years.

Moment’s over. Now it’s time to heed another central lesson of Lost: Never, ever stop. That second when you think you’ve settled into exactly what your show is and how it works? That’s the precise moment when you need to detonate a hydrogen bomb — metaphorically or otherwise. Think back to how we first met our castaways: confused, bleeding people huddled on a beach, clinging to one another for dear life. Now think about all that was yet to come: the hatch, the Others, the Dharma Initiative, the Freighter folk, Ajira Airways, and whoever this dude was supposed to be (honestly, still wondering). It’s both astonishing and unique that the very best episode of the series, Season 4’s “The Constant,” was a time-twisting love story that hinged on the strong emotional connection viewers had formed with three characters, Desmond, Penny, and Faraday, who hadn’t even been dreamed up when the show began. All successful TV shows expand as they age, adding to the cast, setting, and themes. And Lost was certainly no exception. But what made the show special was the way it dug deeper, finding new layers of possibility and demented mythmaking just below the surface of what was already there. The moment you become complacent is the moment you’ve — and I really apologize for saying this — lost. (Helpful hint: If you’re stuck for inspiration, just try rewatching the first season of your show but this time imagine a hirsute, Mama Cass–loving Scotsman going about his daily business underneath all the action. It really makes a difference!)

6. Have Fun

Dramatic TV in 2014 is, nearly without exception, punishingly grim. The goal of most series, from the ones I admire to the ones I dislike, appears to be the observation of humans at their absolute worst; broken people pushed to their breaking point and beyond. Lost, of course, began with a catastrophe, a hideously violent plane crash that killed dozens, wounded more, and wrenched more than 40 people away from their normal lives. But from that fire emerged the unmistakable, and unkillable, spark of life. As I’ve written many times over, suffering is only one part of the human experience. To deny the desire to laugh, even in the face of death, is to misrepresent a fundamental aspect of who we are. And so for every Jack, angrily crying at fate and his father, Lost gave us a Hurley, sweetly building a golf course to lift people’s spirits.7 In Benjamin Linus, it gave us a villain who couldn’t stop smirking — as if the universe were a cruel joke that only he could appreciate. And in Sawyer, it gave us a deeply damaged man who fended off his demons with sarcasm far more often than his fists. Sawyer didn’t just laugh in death’s face. He literally shared a beer with him. Lost was a show devoted to many things, some of which worked, some of which exploded more messily than Leslie Arzt. But above all else it was devoted to a certain kind of pleasure, the kind only found in community. Lost, at its best, celebrated the joys of living together. It’s always more satisfying than dying, or even just watching, alone.

Why Right-Wing Christians Think They’re America’s Most Persecuted

By Valerie Tarico

The roots of this absurd belief can be traced back to ancient times.

A recent Pew study found that white American evangelical Christians think they experience more discrimination than blacks, Hispanics, Muslims, atheists or Jews.


Christianity is the majority religion in the U.S. Many kinds of legally ensconced religious privileges are on the rise, including the right to woo converts in public grade schools, speculate in real estate tax-free, repair religious facilities with public dollars, or opt out of civil rights laws and civic responsibilities that otherwise apply to all. By contrast, atheists are less electable than even philanderers, weed smokers or gays; Hispanics and Muslims are being told to leave; Jews get accused of everything from secret economic cabals to destroying America’s military; and unarmed black youth continue to die at the hands of vigilantes.

Given the reality of other people’s lives, a widespread evangelical perception of their group as mass victims reveals a lack of empathy that should give thoughtful believers reason to cringe. And indeed, Alan Noble, managing editor of Christ and Pop Culture and a professor at Oklahoma Baptist University, wrote a thoughtful, pained analysis this summer of what he called the “evangelical persecution complex.” Noble contrasted the privileged position of American Christians with the real and serious persecution Christian minorities experience under ISIS, for example, and he examined the ways in which victimization can become a part of Christian identity and culture to the detriment of Christians and outsiders alike. What he neglected to spell out clearly was the extent to which the Bible itself sets up this problem.

Christianity, born in the harsh desert cultures of the Middle East, got its start by defining itself in opposition to both Judaism and the surrounding pagan religions of the Roman empire. Consequently, from the get-go teachings emerged that helped believers deal with the inevitable conflict by both predicting and glorifying suffering at the hands of outsiders. Indeed, persecution was framed as making believers more righteous, more like their suffering savior. Long before the Catholic Church made saints out of martyrs, a myriad of texts encouraged believers to embrace suffering or persecution, or even to bring it on.

This sample from a much longer list of New Testament verses about persecution (over 100) gives a sense of how endemic persecution is to the biblical worldview.

  • I am sending you out like sheep among wolves. Therefore be as shrewd as snakes and as innocent as doves. Be on your guard; you will be handed over to the local councils and be flogged in the synagogues.  Matthew 10:16-17
  • Brother will betray brother to death, and a father his child; children will rebel against their parents and have them put to death. You will be hated by everyone because of me, but the one who stands firm to the end will be saved. When you are persecuted in one place, flee to another.  Matthew 10:21-23
  • You must be on your guard. You will be handed over to the local councils and flogged in the synagogues.  Mark 13:9
  • Blessed are you when people hate you, when they exclude you and insult you and reject your name as evil, because of the Son of Man.  Luke 6:22
  • If you belonged to the world, it would love you as its own. As it is, you do not belong to the world, but I have chosen you out of the world. That is why the world hates you. Remember what I told you: “A servant is not greater than his master.” If they persecuted me, they will persecute you also.  John 15:19-20
  • Indeed Herod and Pontius Pilate met together with the Gentiles and the people of Israel in this city to conspire against your holy servant Jesus, whom you anointed.  Acts 4:27
  • Then the high priest and all his associates, who were members of the party of the Sadducees, were filled with jealousy. They arrested the apostles and put them in the public jail….They called the apostles in and had them flogged. Then they ordered them not to speak in the name of Jesus, and let them go.  Acts 5:17-18,40
  • On that day a great persecution broke out against the church in Jerusalem, and all except the apostles were scattered throughout Judea and Samaria. Godly men buried Stephen and mourned deeply for him. But Saul began to destroy the church. Going from house to house, he dragged off both men and women and put them in prison.  Acts 8:1
  • Who shall separate us from the love of Christ? Shall trouble or hardship or persecution or famine or nakedness or danger or sword?  Romans 8:35
  • That is why, for Christ’s sake, I delight in weaknesses, in insults, in hardships, in persecutions, in difficulties. For when I am weak, then I am strong.  2 Corinthians 12:10
  • For it has been granted to you on behalf of Christ not only to believe in him, but also to suffer for him.  Philippians 1:29
  • Now I rejoice in what I am suffering for you, and I fill up in my flesh what is still lacking in regard to Christ’s afflictions, for the sake of his body, which is the church.  Colossians 1:24
  • For you, brothers and sisters, became imitators of God’s churches in Judea, which are in Christ Jesus: You suffered from your own people the same things those churches suffered from the Jews who killed the Lord Jesus and the prophets and also drove us out. They displease God and are hostile to everyone.  1 Thessalonians 2:14-15
  • In fact, everyone who wants to live a godly life in Christ Jesus will be persecuted, while evildoers and impostors will go from bad to worse, deceiving and being deceived.  2 Timothy 3:12
  • Consider him who endured such opposition from sinners, so that you will not grow weary and lose heart.  Hebrews 12:3
  • But even if you should suffer for what is right, you are blessed. “Do not fear their threats; do not be frightened.”  1 Peter 3:14
  • Dear friends, do not be surprised at the fiery ordeal that has come on you to test you, as though something strange were happening to you. But rejoice inasmuch as you participate in the sufferings of Christ, so that you may be overjoyed when his glory is revealed. If you are insulted because of the name of Christ, you are blessed, for the Spirit of glory and of God rests on you.  1 Peter 4:12-14
  • Do not be surprised, my brothers and sisters, if the world hates you.  1 John 3:13
  • Do not be afraid of what you are about to suffer. I tell you, the devil will put some of you in prison to test you, and you will suffer persecution for ten days. Be faithful, even to the point of death, and I will give you life as your victor’s crown. Revelation 2:10
  • I saw thrones on which were seated those who had been given authority to judge. And I saw the souls of those who had been beheaded because of their testimony about Jesus and because of the word of God. They had not worshiped the beast or its image and had not received its mark on their foreheads or their hands. They came to life and reigned with Christ a thousand years.  Revelation 20:4

As any squabbling pair of siblings can tell you, claiming to be a victim is powerful stuff, even if you actually struck first. “He started it!” yells one kid. “No, she started it!” yells the other. Parental resolve wavers in the face of uncertainty, and both kids get an exasperated lecture.


When I was in college, I had a friend who grew up in a rough, low-income neighborhood. One day we were talking about car accidents and he said, “My father told me that if you ever get in an accident, you should immediately get out and start yelling at the other driver. Even if it was your fault, it will put them on the defensive and keep them from making wild claims. And maybe the police will believe you.” Amoral, perhaps, but brilliant.

If claiming to be a victim is powerful, believing you are a victim is far more so, again regardless of the actual facts—which, at any rate, we all are prone to interpret through a self-serving lens. Have you ever noticed that when your friends tell you about conflict with co-workers or lovers, you almost always feel like they were wronged? What are the odds, really? Seeing ourselves and our tribe as innocent victims draws sympathy and support and protects self-esteem.

But at a price.

Because when we cultivate the sense that we have been wronged, we can’t see the wrong that we ourselves are doing. We also give up our power to make things better. If people keep being mean to us through no fault of our own, we’re helpless as well as victims, at least in our own minds. You can’t fix what you can’t see.

In the case of Christianity, the theology of persecution serves to give the faithful hope. It inspires persistence in the face of hardship, including the many hardships that life brings all of us through no fault of our own. But it has also blinded generations of believers to the possibility that sometimes the hardships they face are due not to their faith or to outsiders hating Jesus, but to the fact that they hit first. And sometimes the bewildering hostility they perceive may simply be something the theology of persecution has set them up to expect, whether it is there or not.


How Dinosaurs Set Up an Avian Explosion

by Brian Switek

If you were to take a stroll through a Late Jurassic forest, around 150 million years ago, you might spot little feathery dinosaurs hopping through the undergrowth. They’d look like miniature Velociraptor, complete with tiny sickle claws held off the ground. One might even briefly flutter into the air to nab an insect, or take a short glide down off a toppled tree trunk. And that would offer a critical clue to their real identity. They wouldn’t look very different from the famous “raptors”, but these meek dinosaurs mark the point where birds begin.

That’s what University of Edinburgh paleontologist Stephen Brusatte and his coauthors found when they looked at the big picture of bird evolution, throwing in a little sci-fi speculation along the way. In a new Current Biology paper on the grand evolutionary transition from dinosaurs to birds, the researchers write “we surmise that a Mesozoic naturalist would make no immediate distinction between a Velociraptor-type animal and an Archaeopteryx-type animal.”

The origin of the first birds is one of the most celebrated evolutionary transitions. Scores of fossils – with more found all the time – have confirmed that birds are dinosaurs. That’s why it might seem a little counterintuitive that Archaeopteryx and other early birds were not very different from their dinosaur ancestors.

After cataloging 853 skeletal characteristics in 150 dinosaurs and analyzing the rate at which these characters change, Brusatte and coauthors found that “there was no grand jump between nonbirds and birds in morphospace.” To put it another way, there was less difference between Velociraptor and Archaeopteryx than between other closely related groups of dinosaurs, such as the parrot-like oviraptorosaurs and bizarre therizinosaurs.

The relationships of coelurosaurian dinosaurs, including birds. Courtesy Stephen Brusatte.

The only sticking point is that there are now so many bird-like nonbird dinosaurs that determining which group the first birds evolved from is a tricky task. Dromaeosaurids and troodontids – roughly, “raptors” that were quite similar to each other – are the strongest contenders. More fossils and new analyses will be needed to parse the split, but this is a happy problem to have. The exact jumping-off point for birds is so difficult to pin down simply because there are so many feathery dinosaurs perched right near the split. In fact, the problem mostly exists because the earliest members of that bird stem are the spitting image of their forebears.

Modern birds are quite different from any other vertebrates alive today. They seem that way because all of their close relatives are extinct, masking the fact that many of their unusual traits actually have a very deep history. A wishbone was a widespread trait among theropod dinosaurs. Allosaurus, a dinosaur not particularly close to birds, had one. The same goes for the air sacs that extend from the avian respiratory system. They were present in theropods as well as the long-necked sauropods, pointing to a common origin more than 70 million years before the first birds. Feathers are quite ancient, too, with an accumulating number of finds hinting that some kind of feather-like body covering might have been present in the very earliest dinosaurs, or at least evolved several times early in their history.


These traits – as well as a reduction in size and some incipient aerodynamic abilities – culminated in dinosaurs like Archaeopteryx, reaffirmed by the new study as the most archaic known member of the bird lineage. (In technical terms, this makes Archaeopteryx an avialan, or on the “stem” leading up to Aves.) From there, though, birds evolved faster than any of their close dinosaurian relatives. They proliferated into new forms and molded new niches, including toothy little flappers and loon-like diving birds by the Late Cretaceous, and continuing after the great K/Pg extinction until today.

The spark for this evolutionary explosion isn’t yet known. Flight could be a major factor, allowing birds to adapt in starkly different ways from their earthbound ancestors. (Although, of course, nonavian dinosaurs like Microraptor found their own way into the air.) Paleontology thrives on such mysteries. For now, though, the new study underscores the fact that the change from dinosaur to bird is one of the most surprising and best-documented evolutionary transformations of all time. There is no sharp dividing line between dinosaur and bird. “Birds”, Brusatte and coauthors conclude, “are a continuum of millions of years of theropod evolution.”

Homunculi, Golems, and Artificial Life

By Gary Lachman

The notion of “man-made humans,” or other living creatures fashioned by human hands, has a long history in mythology and folklore. In recent years, with the development of genetic engineering, virtual reality, and artificial life of various sorts, it has gained a new significance. But our current fascination with—not to mention dread over—the increasing likelihood of genetically modified and artificial humans is not, in essence, a particularly new development. It touches on some of the central themes of religion and the occult and magical practices that emerged from a once-powerful but now submerged spiritual belief system.

The Kabbalah, for example, includes legends and stories about the alchemical homunculus, or “little man,” and the golem, a kind of proto–Frankenstein’s monster. In both cases the idea is that through certain secret magical practices, human beings can share in the creative power of God. To the orthodox believers of both Judaism and Christianity such a notion is considered blasphemous and betrays either the hubris of humanity or the work of the devil. How much the orthodox misunderstanding and rejection of these ideas helped to distort them is unclear, and space and time prevent me from exploring this question. Although ostensibly concerned with very similar objectives—the creation of an “artificial man”—the alchemical homunculus and the kabbalistic golem are quite different. The popular understanding of these esoteric themes has for the most part focused on a literal interpretation, and their resurgence in our contemporary consciousness threatens to take that literalism seriously.

Prior to the rise of science and the mechanical vision of human life and the universe, the idea of creating human simulacra had a strong organic foundation. The homunculus was something one grew; the popular belief was that homunculi could be grown from the mandrake root, whose shape lent itself to anthropomorphic speculation. The golem, too, although not quite as organic as the homunculus, was nevertheless not pieced together bit by bit, as Mary Shelley’s monster would be; it was fashioned, molded from clay or soil and then miraculously brought to life.

To be sure, the prescientific age had mechanical marvels as well. Hero of Alexandria in the first century wrote manuals on how to construct moving god images and other automated devices. Using steam and sand, Hero was able to animate singing mechanical birds, to rotate statues, and to power a miniature puppet theater. There is evidence that such mechanical wonders were used as much for entertainment as for religious purposes. And we also know that animated statues played an important part in the religious rites of the Neoplatonic schools of late antiquity, a practice that resurfaced in the folk traditions of the Middle Ages. Pope Sylvester II was said to have consulted a mechanical “talking head,” and the same was said of the monk Roger Bacon and the Dominican friar and natural philosopher Albertus Magnus.

As Victoria Nelson shows in her fascinating book The Secret Life of Puppets, this tradition of animated god images carried on in the popular fascination with puppets. The ancients, however, didn’t view their animated images as human simulacra but more as a kind of magical magnet used to attract divine energies. To animate a god image was to perform theurgy, to create the god, to bring the god to physical manifestation. For ancients like the philosophers Plotinus, Proclus, Porphyry, and Iamblichus, this meant drawing down the god-force that resided in the stars and embodying it in the image of the god. Although this was a form of “giving life” to inanimate objects, it was concerned not with creating humans but with making the divine present.

The question arises then: What is the homunculus and what is the golem? Franz Hartmann’s 1896 Life of Paracelsus defines homunculi as “artificially made human beings, generated from the sperm without the assistance of the female organism (black magic).” The Swiss alchemist Theophrastus Bombast von Hohenheim, otherwise known as Paracelsus (1493–1541), is recognized by many as an early master of holistic medicine and natural healing. It was from Paracelsus that Goethe, a great reader of alchemical and occult literature, got the idea of the homunculus, which he used in the second part of Faust. Paracelsus offered a complete recipe for creating a homunculus:

If the sperma, enclosed in a hermetically sealed glass, is buried in horse manure for forty days, and properly magnetized, it begins to live and move. After such a time it bears the form and resemblance of a human being, but it will be transparent and without a body. If it is now artificially fed with the Arcanum sanguinis hominis until it is about forty weeks old, and if allowed to remain during that time in horse manure in a continually equal temperature, it will grow into a human child, with all its members developed like any other child, such as could be born by a woman; only it will be much smaller. We call such a being a homunculus, and it may be raised and educated like any other child, until it grows older and obtains reason and intellect, and is able to take care of itself.

Hartmann notes that Paracelsus has been taken to task for believing in the literal creation of such a being, but in Paracelsus’s defense, he offers a story purporting to give evidence for the reality of such things. It’s easy to assume that Paracelsus was taken in by the common, literal understanding of what the homunculus is. But there’s also the possibility that Paracelsus was aware of this understanding and used the superstition to communicate secret teachings. References to the need to bury the sperm in horse manure, to keep it there for forty days, and to feed it with the Arcanum sanguinis hominis, the “secret blood of man,” suggest that Paracelsus may have been making reference to mythic rather than literal ideas.

Ronald D. Gray, in his book Goethe the Alchemist, argues that there’s a great deal of evidence showing that the homunculus was one of many names used by the alchemists to designate the secret aim of the alchemical Great Work. To most of us, alchemy is a primitive forerunner of chemistry, and if we know anything about alchemy it’s that it was concerned with turning lead into gold. Many calling themselves alchemists convinced themselves and many others that this was indeed the aim of the Royal Art and that it was possible. Many sought the secrets of alchemy out of sheer greed, and many would-be alchemists found a comfortable niche or, perhaps more often, an undesirable end, in the employ of a king or queen.

But there’s another way to read the alchemical project, and that is that the transformation had more to do with the alchemists themselves than with a lump of metal. Turning lead into gold was a symbolic way of describing the true aim of alchemy: the spiritual transformation of the alchemist. If one takes the time to read the alchemical literature, it’s easy to come away feeling absolutely muddled. Strange creatures, impossible landscapes, paradoxes, and downright illogic seem to dominate; the closest modern parallel is the writings and art of the surrealists, who, ironically, looked to the alchemists for inspiration and for the interpretation of dreams.

It is in the psychological literature of the last half century, especially in the Jungian school, that we find great correspondence with alchemical thought. The true goal of the alchemists, the real aim of all the preparation and cumbersome apparatus, was to unite their earthly, mortal soul with that of the Creator, to participate in the divine, to reawaken their spiritual consciousness, and to grasp the secret forces at work behind the natural world. In this the alchemists carried on the same work as their Neoplatonic forebears.

Success in this work depended on following the proper procedures, which included astrological concerns, exemplifying the alchemist’s belief that the cosmos was a unified whole and that each part of it embodied the divine force animating everything. For the alchemist, matter was not the dead, inert stuff it is for us: it was a living body, one that could respond to a person’s attention. As the alchemists transformed the matter in their alembic through the alchemical process, their own inner world experienced similar changes. The entire process centered on the idea of rebirth. The alchemists were to “die” in a sense—to lose their earthly, mortal being—and, if the procedure was successful, would be reborn.

Death was an essential aspect of the alchemical process; it was out of death that new life could emerge, as it did with Frankenstein’s monster. In Paracelsus’s recipe for the homunculus, the horse manure represents the putrefaction needed to begin the process of rebirth. This is the first step in the alchemical work. The old self, the old Adam, must be broken down until we arrive at the prima materia, the primordial stuff, the unformed matter out of which any future creation can take place. The forty days in which the sperma is buried in the horse manure parallel Christ’s forty days in the desert, when he is tempted by Satan. This means that the alchemist must undergo trials, must endure some suffering, and that the alchemical process is not something going on outside of oneself but is something that must be lived through. This is also suggested in the idea that the homunculus, the little man who is the alchemist reborn, must be fed by the alchemist’s own secret blood. The alchemist’s attention, concentration, mind, or soul must be completely focused on the task variously known as the creation or discovery of the philosopher’s stone, the elixir of life, potable gold, the universal solvent, and, very often, the creation of the homunculus depicted in numerous alchemical illustrations, often as the god Mercury encased in the alchemical vessel.

That the alchemists would speak of this in parable, allegory, and obscure language shouldn’t be surprising. It’s difficult enough for us, who have the advantage of familiarity with self-help and psychotherapeutic literature, to grasp the meaning of rebirth. For the literal-minded of the Middle Ages, who were taught that all magic and occult knowledge was the work of the devil, this would be a subtle notion indeed. The idea that by going through the alchemical rebirth, one would become as Christ—regenerated—would strike them as blasphemous. What was left was the literal idea of making an actual man or woman, just like the idea of making actual gold from lead or finding an actual stone. Yet a famous alchemical maxim reads: “Our gold is not the vulgar gold.” Clearly, making material gold was not what they were after. Creating an actual tiny human being was always recognized as a display of power that went beyond nature. This is a dim and distorted echo of the alchemists’ belief that their art was against nature in the sense that it both sped up a natural process and redeemed its practitioners from a life lived solely at the natural, Adamic, unregenerate level.

The legend of the golem has also suffered from a too-literal interpretation. Probably the best-known version of the golem story is Gustav Meyrink’s classic expressionistic novel The Golem, published in 1915. Several film versions of the golem story have been made; the most famous is probably Paul Wegener’s 1920 version. In the first film to deal with the theme, Otto Rippert’s 1916 Homunculus, a scientist creates an artificial man and endows him with more than human powers. When this superman discovers his true origin—that he is not human at all and can never feel love—he reacts violently and inaugurates a reign of terror that leads to his destruction. This notion of a lack, of something missing, also haunts homunculi in later storytelling.

The popular idea of the golem had its start in the 1890s, when the creature became associated with the legends surrounding the famous Rabbi Loew of Prague, an almost mythic figure of the sixteenth century. In one version, Rabbi Loew creates the golem to protect the Jewish population of Prague from one of Emperor Rudolph II’s pogroms. Prague is perhaps the most occult and alchemical city in Europe; aside from the golem legends, it has a long tradition of puppets, dolls, and magic shows of various kinds.

Although the popular idea of the golem is associated with the magical powers of Rabbi Loew—and there is no evidence that the rabbi himself ever attempted to make a golem—the term has a long, if obscure, history in Talmudic literature. The word golem is mentioned once in the Bible, in Psalm 139; today it’s often translated as “embryo.” Golem itself means “unformed”; it’s the hyle of the ancients, the chaotic, inchoate state of matter before it is given form by the Creator. The similarity between this and the alchemical prima materia seems clear. In the Talmudic Aggadah, Adam is referred to as “golem.” In a midrash from the second and third centuries, Adam is described as a kind of cosmic golem, an immense being whose body is as large as the universe and who can see the entire history of the world, its past and future—an echo of Madame Blavatsky’s akashic record.

This description relates to the kabbalistic idea, also shared by hermetic, alchemical, and Gnostic beliefs, that the universe itself is a kind of man, Adam Kadmon, and that each of us is a microcosm, a universe in miniature: the universe is a Great Man, and we are all little universes. There is a story that when God was creating the world, he made Adam first but left him unfinished, in a golem state, fearing that if he completed him and then went on to create the universe, Adam himself might get the credit for the work (which implies something about the character of the Creator). So God left Adam unfinished, and only after creating the world did he breathe life into him. One symbolic interpretation of this story, which relates to the alchemical “little man,” is that we all are golems until the breath of the divine enters us. We are all unfinished, incomplete, until regenerated.

The kabbalist scholar Gershom Scholem tells us that “the golem is a creature, particularly a human being, made in an artificial way by virtue of a magic act, through the use of holy names.” In kabbalistic tradition, the golem, like Adam, is made of clay or soil. He is molded into human form, and then the mystical name of God, the Tetragrammaton, JHVH, is written on a piece of paper and placed on his mouth. The motif of a magical word or name shows the importance of writing and language in the Jewish mystical tradition. Kabbalah itself is a mystical interpretation of the Bible, and the interplay of words, their rearrangement into other words, and their numerical values all play an important role in understanding the secret laws behind creation. Whereas in the alchemical idea of the homunculus the alchemist himself is re-created, here the kabbalist echoes God’s creative power and creates a kind of life himself.

There is some practical value in this, in that the golem is often used as a kind of slave or worker who takes care of many otherwise onerous tasks, similar to the modern robot or android. The golem, however, is a kind of sorcerer’s apprentice, and, as in the Frankenstein tale, the monster gets out of hand. In many versions, the golem continues to grow and grow and soon becomes too big for the magician to handle.

There are different versions about how the golem is stopped. In the most popular one, the word emeth, “truth,” is written on the golem’s forehead, and this gives it life. In order to stop it from destroying the ghetto, the magician rubs out the first letter of the word, leaving meth, which means “death.” The man of clay then tumbles to the ground and shatters. In Gustav Meyrink’s novel the golem, a metaphor of the novelist’s true self, is brought to light through the act of writing. In one of the many film versions, the golem falls in love with the magician’s daughter and, like the homunculus, turns violent and has to be destroyed. Gershom Scholem points out that, in keeping with kabbalistic tradition, the golem always lacks some essential quality. In some versions it lacks the power to speak, emphasizing that the magical power of words is reserved for God and his devout believers. In others it lacks intelligence or some other positive human quality. All golem stories, however, portray the golem, no matter how strong, as less than fully human. The imperfection of their creature shows that the magicians, no matter how knowledgeable, are still far short of God, a point that contemporary advocates of “man-made humans” may wish to ponder.


Jewish Mythological Creatures


Ziz

Famous For: Being the biggest and baddest bird, who once purposefully dropped an egg that destroyed a forest and flooded 60 villages with eggy grossness.

What He Is: Ziz is the least famous of the biggest and most powerful creatures in the world. Where the Behemoth is of the land and the Leviathan takes up residence in the sea, the Ziz’s home is in the skies. He is so big that his feet touch the ground while his head touches the sky, so how he flies is a bit of a mystery. He is sometimes grouped with the Bar Juchne, a race of giant mythical birds. Ziz protects the world from southern storms and has a wingspan so large it blocks out the sun. He shows up in the Talmud, where the excuse for the dropped egg was that it was rotten. The Children of Israel will have their revenge, though: according to popular legend, the Ziz, Leviathan, and Behemoth will all be killed for a feast at the coming of the Messiah.

Where Are They Now?: Ziz, like many of the other creatures on this list, can be grouped with very similar creatures. Giant birds show up everywhere: Rocs and Ankas, for example. The giant eagles in The Hobbit are a friendly version that saves many heroes from danger at the last possible second. As for actual fatal egg droppings, those are thankfully a rare occurrence.



Shamir

Famous For: Tunnelling underground to build the First Temple in Jerusalem.

What They Are: Essentially, a shamir is a small insect that can break through pretty much any substance. When constructing the First Temple, tools that could also be used for war were forbidden, so shamir were apparently used instead.

Where Are They Now? The seventh Prime Minister of Israel changed his name to Yitzhak Shamir, inspired by the creature. Usually, insects that are good at cutting things take the form of enemies and show up as monsters of the week pretty frequently. Sandworms are a dominant force in the Dune series of novels. The Conqueror Worm by Edgar Allan Poe relies heavily on Jewish symbolism while never referring to the shamir outright. Regulan bloodworms from Star Trek can cleanse and harm people, being both useful and dangerous.


Dybbuk

Famous For: Possessing people. Being whiny jerks.

What They Are: Dybbuks are your basic possessing spirits: ghosts that occupy people’s bodies against their will. They are often souls that need help moving on and, as such, figuring out exactly what the dybbuk wants is the best way to exorcise it. This means listening to its problems and trying to appease it, like a slightly more articulate toddler, but in the body of someone you know.

Where Are They Now?: Figuring out the requests of possessing spirits is a mainstay story in supernatural-themed television. Buffy the Vampire Slayer uses the device plenty of times but never refers to the spirits as dybbuks, even though the character Willow Rosenberg is herself Jewish. The 2009 Coen Brothers film A Serious Man opens with a woman stabbing a suspected dybbuk with an icepick.


Broxa

Famous For: Siphoning goat’s milk by day and human blood by night. Shapeshifting.

What They Are: The broxa was originally a bird that liked to drink blood and steal milk from goats, which is delightful. They are basically Jewish vampire bats, but during the Middle Ages the broxa shapeshifted into a witch or demon, primarily taking a female appearance. Why the sudden shift occurred is a mystery to scholars; most attribute it to Jewish cultures mixing with those around them.

Where Are They Now?: Bloodsuckers are everywhere, and none is more famous than Dracula, who could shapeshift and liked to drink blood. Vampires, especially shapeshifting ones, have dominated culture, and we are all familiar with Twilight’s sparkly versions. Vampire bats are actually real and a nuisance, though not really for humans; they prefer livestock. The broxa also bears a striking similarity to the Chupacabra, a blood-sucking creature of popular legend in Central and South America.



Re’em

Famous For: Being cows or possibly unicorns.

What They Are: The fact that when we hear the word unicorn we all think of a white horse with a horn is a relatively new phenomenon. Unicorns are actually widespread mythical creatures found all over the world, and with each iteration the description changes quite significantly. The Jewish version is the Re’em, and it is more or less a cow. It’s mentioned nine times in the Bible, sometimes translated as an oryx or unicorn. Generally, it is considered to be an ancestor of cattle, usually identified with the aurochs, but too wild to be tamed. Some depictions give it a single horn as well. Some Creationists believe a Re’em is actually a triceratops, which would have been awesome if dinosaurs and humans had lived at the same time. I bet they would have had a very gamy taste.

Where Are They Now?: Unicorns are everywhere, so are cows. In popular culture, bulky single horned animals are also everywhere and generally look like rhinos, but many big military vehicles, especially bulky naval ships, are named unicorns in reference to the traditionally, non-horse type of unicorn. Haruki Murakami’s brilliant novel Hard-Boiled Wonderland and the End of the World has a character who reads dreams from unicorn skulls, ones that are not very horselike. In Harry Potter, Re’em blood is an extremely rare substance that gives immense strength to anyone who drinks it.


Nephilim

Famous For: Being big and mysterious, but only to us. Apparently everyone knew about them way back when.

What They Are: Like most creatures on this list, there is a lot of speculation and not a ton of information: the Nephilim are mentioned only a couple of times and in a way that assumes you know what they are. Genesis 6:4 says “The Nephilim were on the earth in those days—and also afterwards,” which isn’t very informative.

Mostly, people believe they were giants, with two possible origins. One, they are the result of fallen angels interbreeding with wicked people in the pre-Flood world; two, they are descended from Adam’s son Seth and condemned by God for rebelling. They don’t sound too friendly, being essentially the descendants of wicked watchers or failed revolters.

Where Are They Now?: Dante, the protagonist of the Devil May Cry video games, is supposedly one, although not exactly gigantic in his regular form. They also show up in the young adult book series The Mortal Instruments, where they are supernatural beings who became powerful by drinking the blood of an angel named Raziel.


Watchers

Famous For: Being God’s own Secret Service. Attending almost every major Biblical event.

What They Are: Watchers are winged creatures that are basically God’s personal assistants. They fight for him, as in the War in Heaven that led to the Fall; they send messages from him to humans, intervene in events on His behalf, and hang out around God all the time. They act as God’s army and are ranked as such. Cherubim, for example, hold special duties, like the ones who guarded the way to the Garden of Eden after Adam and Eve were banished. The Seraphim have six wings and constantly fly around singing praises to God. There are others as well, especially outside the canonical texts, but they could make a Top 10 on their own!

Where Are They Now?: Watchers show up everywhere in various forms; it’s impossible to list all of them. Gandalf from The Hobbit and The Lord of the Rings is essentially a guardian angel, especially for Bilbo Baggins. Marvel Comics has all sorts of Jewish influence, and the Watcher in that universe observes events on behalf of what is basically an intergalactic archive.


Famous For: Being really gigantic and having the world’s most powerful belly button.

What It Is: The Behemoth and Leviathan are generally grouped together as the unconquerable creatures of their respective habitats. For the Behemoth, that’s the land. He gets an extended description in Job; highlights include a tail like a cedar, limbs as strong as copper, and a great deal of power held in his loins and in the navel of his belly. He can apparently drink the entire Jordan as well, so I guess he gets pretty thirsty being the most powerful creature to roam the land. But as powerful as he is, the Job passage describes him as essentially a house pet compared to God’s power, reminding Job who is really in charge.

Where Are They Now?: The Behemoth is essentially an unconquerable monster, which isn’t a very popular prospect in today’s popular fiction, which prefers heroes who triumph over creatures that only God can defeat. Thomas Hobbes used it, quite appropriately, as the title of his account of the English Civil War. The word now describes any singular object of unique power and size, usually a monster, but since that monster is usually defeated by a hero, it rarely makes a very good example of a true Behemoth.


Famous For: Being a really big fish that likes to swallow people and ships whole.

What It Is: The Leviathan is the Behemoth’s more famous watery equivalent, a gigantic whale or shark feared by sailors around the world. He also shows up in Job with a very long description claiming that the mere sight of him is overpowering: he can shoot fire from his mouth that sets coals ablaze, and dismay goes before him. Weapons are useless against him; in one poetic moment, iron and bronze are like straw and rotten wood to him. Basically, he’s the biggest and baddest fish in the Seven Seas.

Where Are They Now?: The word has become synonymous with giant sea creatures, and popular folklore loves them some big fishes. Probably the most famous Leviathan is Moby Dick, the great white whale that torments Captain Ahab. One also shows up in the Illuminatus! Trilogy by Robert Anton Wilson as a tentacled pyramid, and it’s the name of many a spaceship in film, television, and video games, including Star Wars, Mass Effect, and Farscape.


Famous For: Being rocks. Rocks that move.

What They Are: Golems are usually animate beings made of inanimate objects, most commonly rocks. Adam, in a way, is a golem, being made from the earth by God. They are usually created as a sign of devotion and to help people with various tasks. Because of their status as distinctly Jewish creatures, there are many folklore stories about rabbis who made golems throughout history. In 16th-century Prague, for example, a rabbi reportedly made one to protect the Jewish ghetto from antisemitic attacks.

Where Are They Now?: Golems show up all over the place, both traditionally as helpers and also as enemies. Living statues appear in almost every movie that is remotely fantasy-based. A golem is a central part of Michael Chabon’s Pulitzer Prize-winning novel The Amazing Adventures of Kavalier & Clay, which is highly recommended reading. Nobel Peace Prize winner Elie Wiesel wrote a children’s book based on the Prague golem as well. And the video game Shadow of the Colossus, one of the most well-received games of all time, is about a young man who must slay gigantic golem-like colossi to revive a girl.

In Collier’s Magazine, World War III Already Happened…In 1952

by Ron Miller

In 1951 Collier’s magazine devoted an entire issue to reporting an imagined version of World War III. The magazine followed every detail of the conflict from the first attack to the eventual occupation of Russia by the United Nations.

Collier’s recruited some of the nation’s best-known writers and journalists to provide articles. These included Edward R. Murrow, Robert Sherwood, Lowell Thomas, J.B. Priestley, Margaret Chase Smith, Philip Wylie and Walter Winchell, among numerous other celebrity authors. To increase the sense of reality, even the magazine’s cartoons—some of which were provided by famed World War II cartoonist Bill Mauldin—and many of its advertisements were geared to reflect the “reality” of the imaginary war.

All of this was assembled under the direction of legendary Collier’s editor, Cornelius Ryan—the genius behind the seminal Collier’s space symposium and later the author of The Longest Day (1959).


Even the magazine’s obligatory short stories were romances set in the world of World War III. Collier’s also enlisted some of the country’s top illustrators to provide the visual documentation—some of which was uncannily realistic, and some outright disturbing. Not the least of these were Chesley Bonestell’s renderings of Moscow, Washington and New York during and after a nuclear bombing. The latter illustrations are disturbingly reminiscent of news photos from 9/11.


According to Collier’s, this is how World War III played out…

An attempted assassination of Yugoslavian Marshal Tito on May 10, 1952 triggers a Moscow-planned uprising in that country. Red Army troops accompanied by troops from the Soviet satellite states of Bulgaria, Romania and Hungary invade Yugoslavia. While President Truman condemns this “Kremlin inspired” action, Stalin says it is “an internal matter” and that the invasion is in fact “the will of the Yugoslav people.” Defying Truman, Moscow refuses to withdraw its troops.


The United Nations joins the U.S. in declaring war on the Soviet Union. A preemptive saturation bombing of Russia with nuclear weapons begins immediately. Needless to say, the coalition forces avoid bombing major cities and instead focus on military targets, such as factories, oil and steel refineries, and nuclear installations.

The Soviets retaliate with a nuclear-armed air force that outnumbers UN planes five to three. They attack Germany, the Baltic states, and the Middle East. UN troops are forced to retreat on all fronts, including a catastrophic evacuation from Korea and Japan.

While Communist cells throughout the Western world begin a campaign of sabotage and open attacks, such as the detonation of a bomb in New York’s Grand Central Station, Stalin’s son, aviator General Vassily Stalin, is captured and made a prisoner of war.


The Red Army invades North America when it lands in Alaska and immediately occupies territory. Meanwhile, the Soviets drop atomic bombs on London and other coalition capitals. Atomic bombs also fall on Detroit, New York and Washington, DC.

The U.S. is bombed again the following year when Chicago, New York, Washington, Philadelphia, Boston, Los Angeles, San Francisco, Norfolk and other US cities are nuked by missiles fired from Soviet submarines. The UN finally begins to make some progress, however, and after a successful Christmas Day nuclear attack on the Red Army, UN air forces finally dominate the air over the European battle fronts.

After warning the Russian people of an imminent nuclear attack, US planes drop atomic bombs on the Kremlin.


Shortly afterward, a suicide attack by 10,000 troops in the Ural Mountains destroys Russia’s remaining stockpile of nuclear weapons.


The UN follows up by providing arms to resistance fighters in Russia and its satellite nations. Red troops begin to lose ground to the guerrillas. Civil uprisings occur throughout the Soviet Union. A UN offensive begins on all fronts as Soviet resistance finally begins to collapse. Eventually, the Red Army collapses as UN troops begin to occupy major cities and regions in not only the satellite states but Russia as well.


Stalin “disappears” at the beginning of the following year. Lavrentiy Pavlovich Beria, the ruthless overseer of Russia’s nuclear arms program, is proclaimed the new Premier.

The war effectively ends in 1955 as the Soviet Union disintegrates and Moscow is occupied by UN forces.

The magazine follows up this victory with articles such as “We Worship GOD Again,” by Oksana Kasenkina (a Russian schoolteacher who had leaped from the third floor of the Soviet Consulate in NY in a bid for freedom) and “The Women of Russia.” The latter is accompanied by an illustration of Moscow’s vast Dynamo Stadium, filled with “fashion-starved Moscow women for their first style show.”

The final articles focus on the rebuilding of Russia in “A New Russia,” by economist Stuart Chase, and “Free Men at Work,” by labor union leader Walter Reuther, and the renaissance of culture in a report on a Moscow production of “Guys and Dolls” and the opening day of the 1960 Moscow Olympics, signaling “world brotherhood and good will.”

COLLIER’S: Preview of the War We Do Not Want

Collier’s, October 27, 1951: “Preview of the War We Do Not Want – Russia’s Defeat and Occupation: 1952-1960”

Collier’s magazine devoted its entire 130-page October 27, 1951 issue to an imagined World War III and subsequent United Nations occupation of Russia. It is fascinating to read how the editors and guest writers (Edward R. Murrow, Philip Wylie and Walter Winchell among many) thought WW III might unfold. It is also more than a little unsettling to flip through the peppy ads for Nash automobiles and Frigidaire refrigerators and countless other products to land on full-color renderings of Moscow and Washington in atomic flames. One illustration caption reads “Note Pentagon blazing (at upper left).” Another reads “In an effort to terrorize people, Soviet agents planted bombs in New York’s Grand Central Terminal, killed 22. Americans were outraged.”

The magazine’s literary war-game presentation is so comprehensive that it even includes illustrations by the famous World War II editorial cartoonist Bill Mauldin, who created the characters Willie and Joe.

It is interesting to note the treatment of civil defense in this speculative story and how it mirrors the conservative attitude of 1951. At first, civil defense is deemed to be inadequate, but later, after the first round of Soviet atomic attacks has taught the complacent U.S. a lesson, civil defense procedures are described as having been improved. The proof? Far fewer casualties in the second round of Soviet A-bomb raids than the first.

For the benefit of CONELRAD readers who will never be able to track down a copy of this masterpiece of Cold War publishing, the following is a summary of the events of the fictional war from the issue. Spoiler alert: America wins!



Assassination attempt on Marshal Tito’s life, May 10th, precipitates Cominform-planned uprising in Yugoslavia. Troops from satellite nations of Bulgaria, Romania and Hungary, backed by the Red Army, cross borders. Truman terms aggression “Kremlin inspired”; Reds call it “an internal matter.”

Third World War begins when Moscow, still insisting that uprising is “the will of the Yugoslav people,” refuses to withdraw Red Army units. Stalin miscalculates risk: had believed U.S. would neither back Tito nor fight alone. U.S. is joined by principal UN nations in declaration of war.

Neutrals include Sweden, Ireland, Switzerland, Egypt, India and Pakistan.

Saturation A-bombing of U.S.S.R. begins. Completely avoiding population centers, West concentrates on legitimate targets only. Principal objectives: industrial installations; oil, steel and A-bomb plants.

Communists throughout West begin sabotage campaign. Trained saboteurs open attacks in U.S.

General Vassily Stalin, aviator son of Red dictator, becomes a UN prisoner of war.

Red Army, under vast air umbrella which outnumbers UN planes five to three, attacks across north German plain, in Baltic countries and through Middle East.

UN Troops, fighting for time, retreat on all fronts, suffering many losses.

North American continent invaded when Red Army, in combined air-sea operation, lands in Alaska, occupying Nome and Little Diomede Island.

Reds A-bomb London and UN bases overseas.

Far East “Dunkerque” takes place when, under unremitting air and submarine attacks, U.S. occupation forces evacuate from Korea and Japan.

U.S. A-bombed for first time when Red air force hits Detroit, New York and A-bomb plant at Hanford (Washington). Civil defense proves inadequate.

Turning point in war’s first phase reached when atomic artillery smashes enemy on Christmas Day in Europe.


U.S. A-bombed for second time. Bombers hit Chicago, New York, Washington and Philadelphia. Red submarines fire atomic-headed missiles into Boston, Los Angeles, San Francisco, Norfolk (Virginia) and Bremerton (Washington). Casualties greatly lessened by improved civil defense procedures.

UN air forces finally achieve air superiority over battle fronts.

Psychological warfare begins to play an important role; propaganda emphasizes that UN is fighting war of liberation for Russian people; leaflet raids and broadcasts warn Russian people to evacuate areas scheduled for attack.

Moscow A-bombed midnight, July 22nd, by flying B-36s in retaliation for Red A-bomb terror raid on Washington. Planes flying from U.S. bases destroy center of Moscow. Area of damage: 20 square miles.

Suicide task force lands behind U.S.S.R. borders, destroys Soviets’ last remaining A-bomb stockpile in underground chambers of Ural Mountains. Of 10,000 paratroopers and airborne units, 10 percent survive.

UN General Assembly issues momentous war-aims statement known as “Denver Declaration.”

Underground forces in satellite countries receive arms and materials in UN plane-drops; highly trained guerrilla fighters parachute into U.S.S.R. to aid resistance movements and destroy specific targets.

Severest rationing since beginning of war introduced in U.S.

Yugoslav guerrilla fighters begin to tie down large numbers of Red troops.


A captured Soviet general reports disappearance of Stalin, reveals that MVD (secret police) Chief Beria is new Red dictator.

Uprisings take place in U.S.S.R. and satellite nations. UN parachutes Russian emigres into Soviet Union to aid dissident groups.

UN offensive begins on all fronts as West at last gains initiative.

Red Army gradually retreats, then disintegrates under onslaught of UN air and ground forces.

Three Red generals desert to UN forces.

UN armored spearhead captures Warsaw, reaches Pripet Marshes in Poland. Another armored column crosses U.S.S.R. border into Ukraine.

UN forces clear Asiatic Turkey and cross border into Crimea.

Marines, in combined air-sea operation, capture and occupy Vladivostok.


Hostilities cease as U.S.S.R. degenerates into a state of chaos and internal revolt.

UN forces begin occupation duties in satellite nations and Ukraine.

UNITOC – United Nations Temporary Occupation Command – set up in Moscow.





Close-Up Aerial Photos of Africa’s Last Elephants

By Nick Stockton

Zakouma National Park in southern Chad is famous for its large, free-roaming herds of elephants. This has made it a honeypot for poachers. From 2005 to 2010, demand for ivory reduced the park’s elephant population from over 4,000 to about 450 individuals.

In a visit earlier this year, Kate Brooks took these beautiful aerial pictures of the park and its remaining elephants. Brooks is a war photographer who has spent most of her 17-year career documenting conflict in the Muslim world. She says it’s no stretch to compare the slaughter of African animals to the worst human conflicts. Her forthcoming documentary, The Last Animals, will describe the increasingly sophisticated war between conservationists and poachers over elephants and many other African animals.

Brooks first became concerned for Africa’s wildlife in 2010 when she visited Maasai Mara, a wild animal park in Kenya. Having just finished a taxing embedded assignment in Afghanistan, “I was hoping that with those majestic creatures I could heal some,” she told WIRED. Instead, she was struck with how desperate the situation was for the animals there. In 2012, she was awarded a Knight-Wallace journalism fellowship at the University of Michigan, where she began researching her feature-length documentary project. “My research question was, ‘Can there be ecological preservation in an overpopulated world?’” she said.

Nobody knows exactly how many elephants currently live in Africa, but the number could soon be zero. According to a recent report, roughly 100,000 elephants have been poached for their ivory since 2011. Experts estimate that about 100 elephants are killed every day, a rate that outpaces their ability to reproduce (elephant gestation lasts about 22 months).

As demand for ivory has increased, the battle for the elephants’ lives has become increasingly militarized. Although many poachers still use primitive methods, such as poisoned arrows, more and more are former military or park rangers using sophisticated tactics and technology. (In 2012, members of the Ugandan military massacred a herd of elephants from a helicopter.) In response, rangers at many parks are now outfitted with assault rifles, grenade launchers, and machine guns to fend off poachers. Some have even begun using drones to protect their animals.

Brooks’ photos of war and conflict have appeared in many publications, from TIME to The New Yorker. She also authored a book of her experiences, called In The Light Of Darkness, which chronicles 10 years she spent in the Arab world and the effects of American foreign policy.

Although she’s done previous work as a cinematographer, The Last Animals is her first time directing a documentary. The project is partially funded through Kickstarter (it met its $50,000 Kickstarter goal in January), and is in the middle stages of production. Brooks says she hopes to be done filming by late spring of 2015.