Ko and Mo
Amateur Make-Up Artist Turns Herself Into Your Favorite Pop Culture Characters
Read More at http://boredombash.com/elsa-rhae-face-paintings/ © BoredomBash
what the fucking fuck i love this
For more posts like these, go visit psych2go
Psych2go features various psychological findings and myths. In the future, psych2go will attempt to include sources in its posts for the purpose of generating discussion and commentary. This will give readers a chance to critically examine psychology.
Okay this is the funniest thing I’ll ever see.
But What Would the End of Humanity Mean for Me?
Preeminent scientists are warning about serious threats to human life in the not-distant future, including climate change and superintelligent computers. Most people don’t care.
Sometimes Stephen Hawking writes an article that both mentions Johnny Depp and strongly warns that computers are an imminent threat to humanity, and not many people really care. That is the day there is too much on the Internet. (Did the computers not want us to see it?)

Hawking, along with MIT physics professor Max Tegmark, Nobel laureate Frank Wilczek, and Berkeley computer science professor Stuart Russell, ran a terrifying op-ed a couple weeks ago in The Huffington Post under the staid headline “Transcending Complacency on Superintelligent Machines.” It was loosely tied to the Depp sci-fi thriller Transcendence, so that’s what’s happening there.

“It’s tempting to dismiss the notion of highly intelligent machines as mere science fiction,” they write. “But this would be a mistake, and potentially our worst mistake in history.” And then, probably because it somehow didn’t get much attention, the exact piece ran again last week in The Independent, which went a little further with the headline: “Transcendence Looks at the Implications of Artificial Intelligence—but Are We Taking A.I. Seriously Enough?” Ah, splendid. Provocative, engaging, not sensational.

But really what these preeminent scientists go on to say is not not sensational. “An explosive transition is possible,” they continue, warning of a time when particles can be arranged in ways that perform more advanced computations than the human brain. “As Irving Good realized in 1965, machines with superhuman intelligence could repeatedly improve their design even further, triggering what Vernor Vinge called a ‘singularity.’”

Get out of here. I have literally a hundred thousand things I am concerned about at this exact moment. Do I seriously need to add to that a singularity?

(via But What Would the End of Humanity Mean for Me? - James Hamblin - The Atlantic)