Category Archives: Creativity

Paleolithic parenting and animated GIFs

The creation of the moving image represents a technical advance in the arts comparable with the invention of the steam engine during the industrial revolution.

The transition from static to moving imagery was a watershed event in human history, through which people discovered a new way of capturing the visible world — or so it seemed.

It turns out, however, that long before the advent of civilization, our Paleolithic forebears figured out that the movement they saw in living creatures around them could, by cunning means, be captured in crafted illusions.


Let’s run with the hypothesis that this 14,000-year-old artifact is indeed a toy. What does this tell us about its creator and the children for whom it was made? [Continue reading at my new site: Attention to the Unseen]


‘Let the soul dangle’: How mind-wandering spurs creativity

By Julia Christensen, Guido Giglioni, and Manos Tsakiris

The Renaissance painter Albrecht Dürer was regarded by his friends as a master in the art of mind-wandering. He could become ‘enwrapped’ in his own pleasant reflections, wrote the German humanist Willibald Pirckheimer, at which times Dürer ‘would seem the happiest person on Earth’.

Many of us are familiar with mind-wandering in a number of guises: procrastination, reflection, meditation, self-flagellation, daydreaming. But while some mental meandering seems fruitful, on other occasions it has the unmistakeable bite of a bad habit, something that holds us back from reaching our full potential. Reverie can be a reprieve from reality and a font of inspiration, yes. But equally familiar is the mind’s tendency to devolve into sour and fruitless rumination when left to its own devices, especially when we’re in the grip of depression, anxiety or obsession.

Can art itself be a useful catalyst for nudging us towards more helpful emotions and mental states? Whether in the form of literature, rap or abstract oil painting, many of us know we can improve the tenor of our thoughts by contemplating art. The Germans have a lovely saying for the benefits of keeping an idle (or idling) mind: ‘die Seele baumeln lassen’, meaning ‘let the soul dangle’. Now, the emerging science of neuroaesthetics is beginning to reveal the biological processes that sit behind such ‘dangling’.

[Continue reading…]


Our imaginative life today has access to the pre-linguistic, ancestral mind

Stephen T Asma writes: Richard Klein, Maurice Bloch and other prominent paleoanthropologists place the imagination quite late in the history of our species, thousands of years after the emergence of anatomically modern humans. In part, this theory reflects a bias that artistic faculties are a kind of evolutionary cheesecake – sweet desserts that emerge as byproducts of more serious cognitive adaptations such as language and logic. More importantly, it is premised on the relatively late appearance of cave art in the Upper Paleolithic period (c. 38,000 years ago). It is common for archaeologists to assume that imagination evolves late, after language, and that the cave paintings are a sign of modern minds at work, thinking and creating just as we do today.

Contrary to this interpretation, I want to suggest that imagination, properly understood, is one of the earliest human abilities, not a recent arrival. Thinking and communicating are vastly improved by language, it is true. But ‘thinking with imagery’ and even ‘thinking with the body’ must have preceded language by hundreds of thousands of years. It is part of our mammalian inheritance to read, store and retrieve emotionally coded representations of the world, and we do this via conditioned associations, not propositional coding.

Lions on the savanna, for example, learn and make predictions because experience forges strong associations between perception and feeling. Animals appear to use images (visual, auditory, olfactory memories) to navigate novel territories and problems. For early humans, a kind of cognitive gap opened up between stimulus and response – a gap that created the possibility of having multiple responses to a perception, rather than one immediate response. This gap was crucial for the imagination: it created an inner space in our minds. The next step was that early human brains began to generate information, rather than merely record and process it – we began to create representations of things that never were but might be. On this view, imagination extends back into the Pleistocene, at least, and likely emerged slowly in our Homo erectus cousins. [Continue reading…]


A Trump attack on the arts would be more than just symbolic

Philip Kennicott writes: For months now, the debate in the arts world has been: Will he really do it? Will Donald Trump be the president who finally gives the right wing what it has so vehemently craved for decades, the elimination of the National Endowment for the Arts? A report in The Hill suggests that pessimists, who assumed the worst once it became clear that Trump’s election would likely empower organizations like the conservative Heritage Foundation, were right. He may indeed try to kill it.

And the National Endowment for the Humanities, as well as cutting the federal appropriation for the Corporation for Public Broadcasting. The animus against these organizations has been so powerful for so long that defending them feels almost pro forma, a reflexive rhetorical blast into the headwinds of an anti-arts bias so deep that there’s little hope of changing anyone’s mind (“The NEA is welfare for cultural elitists,” declares the Heritage Foundation, sententiously).

Never mind the old arguments, still valid and cogent, but somehow threadbare from long use in what many people have long and fatalistically assumed is a losing battle. Despite the culture war clashes about art that some considered obscene more than a generation ago, the NEA has evolved into an organization that operates and has impact in every state, that has served returning veterans, bolstered state arts agencies and worked with all manner of groups and state and federal partners to build stronger and more resilient communities across the country. Never mind the role the NEH has played in the creation of documentaries and the education and enrichment of teachers who might not otherwise have a chance to escape the grinding cycle of teaching to the tests, which never stop coming. Never mind that the Corporation for Public Broadcasting creates the best and most enriching programming for children that is widely available without cost to the poor and the isolated.

No one knows what Trump will do until he actually does it, so perhaps someone will get his ear and deflect support for these cuts – so minimal in real dollar terms, so significant in symbolic impact. But let’s assume they’re coming. What do they tell us? [Continue reading…]


Spontaneity is at the heart of science

Henry Cowles writes: There is a theory in psychology called the theory theory. It’s a theory about theories. While this might sound obvious, the theory theory leads to counterintuitive conclusions. A quarter-century ago, psychologists began to point out important links between the development of scientific theories and how everyday thinking, including children’s thinking, works. According to theory theorists, a child learns by constructing a theory of the world and testing it against experience. In this sense, children are little scientists – they hypothesise on the basis of observations, test their hypotheses experimentally, and then revise their views in light of the evidence they gather.

According to Alison Gopnik, a theory theorist at the University of California, Berkeley, the analogy works both ways. It’s not just that ‘children are little scientists’, she wrote in her paper ‘The Scientist as Child’ (1996), ‘but that scientists are big children.’ Depending on where you look, you can see the scientific method in a child, or spot the inner child in a scientist. Either way, the theory theory makes it easy to see connections between elementary learning and scientific theorising.

This should be pretty surprising. After all, scientists go through a lot of training in order to think the way they do. Their results are exact; their methods exacting. Most of us share the sense that scientific thinking is difficult, even for scientists. This perceived difficulty has bolstered (at least until recently) the collective respect for scientific expertise on which the support of cutting-edge research depends. It’s also what gives the theory theory its powerful punch. If science is so hard, how can children – and, some theory theorists argue, even infants – think like scientists in any meaningful sense? Indeed, in the age of what Erik M. Conway and Naomi Oreskes call “the merchants of doubt” (not to say in the age of Trump), isn’t it dangerous to suggest that science is a matter of child’s play?

To gain purchase on this question, let’s take a step back. Claims that children are scientists rest on a certain idea about what science is. For theory theorists – and for many of the rest of us – science is about producing theories. How we do that is often represented as a short list of steps, such as ‘observe’, ‘hypothesise’, and ‘test’, steps that have been emblazoned on posters and recited in debates for the past century. But where did this idea that science is a set of steps – a method – come from? As it turns out, we don’t need to go back to Isaac Newton or the Scientific Revolution to find the history of ‘the scientific method’ in this sense. The image of science that most of us hold, even most scientists, comes from a surprising place: modern child psychology. The scientific method as we know it today comes from psychological studies of children only a century ago. [Continue reading…]


Walking improves creativity

Olivia Goldhill writes: For centuries, great thinkers have instinctively stepped out the door and begun walking, or at the very least pacing, when they needed to boost creativity. Charles Dickens routinely walked for 30 miles a day, while the philosopher Friedrich Nietzsche declared, “All truly great thoughts are conceived while walking.”

But in recent years, as lives have become increasingly sedentary, the idea has been put to the test. The precise physiology is unknown, but professors and therapists are turning what was once an unquestioned instinct into a certainty: Walking influences our thinking, and somehow improves creativity.

Last year, researchers at Stanford found that people perform better on creative divergent-thinking tests during and immediately after walking. The effect was similar regardless of whether participants took a stroll outside or stayed inside, walking on a treadmill and staring at a wall. The act of walking itself, rather than the sights encountered on a saunter, was key to improving creativity, they found. [Continue reading…]


The range of the mind’s eye is restricted by the skill of the hand


Jonathan Waldman writes: Sometime in 1882, a skinny, dark-haired, 11-year-old boy named Harry Brearley entered a steelworks for the first time. A shy kid — he was scared of the dark, and a picky eater — he was also curious, and the industrial revolution in Sheffield, England, offered much in the way of amusements. He enjoyed wandering around town — he later called himself a Sheffield Street Arab — watching road builders, bricklayers, painters, coal deliverers, butchers, and grinders. He was drawn especially to workshops; if he couldn’t see in a shop window, he would knock on the door and offer to run an errand for the privilege of watching whatever work was going on inside. Factories were even more appealing, and he had learned to gain access by delivering, or pretending to deliver, lunch or dinner to an employee. Once inside, he must have reveled, for not until the day’s end did he emerge, all grimy and gray but for his blue eyes. Inside the steelworks, the action compelled him so much that he spent hours sitting inconspicuously on great piles of coal, breathing through his mouth, watching brawny men shoveling fuel into furnaces, hammering white-hot ingots of iron.

There was one operation in particular that young Harry liked: a toughness test performed by the blacksmith. After melting and pouring a molten mixture from a crucible, the blacksmith would cast a bar or two of that alloy, and after it cooled, he would cut notches in the ends of those bars. Then he’d put the bars in a vise, and hammer away at them.

The effort required to break the metal bars, as interpreted through the blacksmith’s muscles, could vary by an order of magnitude, but the result of the test was expressed qualitatively. The metal was pronounced on the spot either rotten or darned good stuff. The latter was simply called D.G.S. The aim of the men at that steelworks, and every other, was to produce D.G.S., and Harry took that to heart.

In this way, young Harry became familiar with steelmaking long before he formally taught himself as much as there was to know about the practice. It was the beginning of a life devoted to steel, without the distractions of hobbies, vacations, or church. It was the origin of a career in which Brearley wrote eight books on metals, five of which contain the word steel in the title; in which he could argue about steelmaking — but not politics — all night; and in which the love and devotion he bestowed upon inanimate metals exceeded that which he bestowed upon his parents or wife or son. Steel was Harry’s true love. It would lead, eventually, to the discovery of stainless steel.

Harry Brearley was born on Feb. 18, 1871, and grew up poor, in a small, cramped house on Marcus Street, in Ramsden’s Yard, on a hill in Sheffield. The city was the world capital of steelmaking; by 1850 Sheffield steelmakers produced half of all the steel in Europe, and 90 percent of the steel in England. By 1860, no fewer than 178 edge tool and saw makers were registered in Sheffield. In the first half of the 19th century, as Sheffield rose to prominence, the population of the city grew fivefold, and its filth grew proportionally. A saying at the time, that “where there’s muck there’s money,” legitimized the grime, reek, and dust of industrial Sheffield, but Harry recognized later that it was a misfortune to be from there, for nobody had much ambition. [Continue reading…]


Exploding the myth of the scientific vs artistic mind

By David Pearson, Anglia Ruskin University

It’s a stereotype, but many of us have made the assumption that scientists are a bit rigid and less artistic than others. Artists, on the other hand, are often seen as being less rational than the rest of us. Sometimes described as the left side of the brain versus the right side – or simply logical thinking versus artistic creativity – the two are often seen as polar opposites.

Neuroscience has already shown that everyone uses both sides of the brain when performing any task. And while certain patterns of brain activity have sometimes been linked to artistic or logical thinking, it doesn’t really explain who is good at what – and why. That’s because the exact interplay of nature and nurture is notoriously difficult to tease out. But if we put the brain aside for a while and just focus on documented ability, is there any evidence to support the logic versus art stereotype?

Psychological research has approached this question by distinguishing between two styles of thinking: convergent and divergent. The emphasis in convergent thinking is on analytical and deductive reasoning, such as that measured in IQ tests. Divergent thinking, however, is more spontaneous and free-flowing. It focuses on novelty and is measured by tasks requiring us to generate multiple solutions for a problem. An example may be thinking of new, innovative uses for familiar objects.

Studies conducted during the 1960s suggested that convergent thinkers were more likely to be good at science subjects at school. Divergent thinking was shown to be more common in the arts and humanities.

However, we are increasingly learning that convergent and divergent thinking styles need not be mutually exclusive. In 2011, researchers assessed 116 final-year UK arts and science undergraduates on measures of convergent and divergent thinking and creative problem solving. The study found no difference in ability between the arts and science groups on any of these measures. Another study reported no significant difference in measures of divergent thinking between arts, natural science and social science undergraduates. Both arts and natural sciences students, however, rated themselves as being more creative than social sciences students did.

[Continue reading…]


Brain scans reveal how LSD affects consciousness

Researchers from Imperial College London, working with the Beckley Foundation, have for the first time visualised the effects of LSD on the brain: In a series of experiments, scientists have gained a glimpse into how the psychedelic compound affects brain activity. The team administered LSD (Lysergic acid diethylamide) to 20 healthy volunteers in a specialist research centre and used various leading-edge and complementary brain scanning techniques to visualise how LSD alters the way the brain works.

The findings, published in Proceedings of the National Academy of Sciences (PNAS), reveal what happens in the brain when people experience the complex visual hallucinations that are often associated with the LSD state. They also shed light on the brain changes that underlie the profound altered state of consciousness the drug can produce.

A major finding of the research is the discovery of what happens in the brain when people experience complex dreamlike hallucinations under LSD. Under normal conditions, information from our eyes is processed in a part of the brain at the back of the head called the visual cortex. However, when the volunteers took LSD, many additional brain areas – not just the visual cortex – contributed to visual processing.

Dr Robin Carhart-Harris, from the Department of Medicine at Imperial, who led the research, explained: “We observed brain changes under LSD that suggested our volunteers were ‘seeing with their eyes shut’ – albeit they were seeing things from their imagination rather than from the outside world. We saw that many more areas of the brain than normal were contributing to visual processing under LSD – even though the volunteers’ eyes were closed. Furthermore, the size of this effect correlated with volunteers’ ratings of complex, dreamlike visions.”

The study also revealed what happens in the brain when people report a fundamental change in the quality of their consciousness under LSD.

Dr Carhart-Harris explained: “Normally our brain consists of independent networks that perform separate specialised functions, such as vision, movement and hearing – as well as more complex things like attention. However, under LSD the separateness of these networks breaks down and instead you see a more integrated or unified brain.

“Our results suggest that this effect underlies the profound altered state of consciousness that people often describe during an LSD experience. It is also related to what people sometimes call ‘ego-dissolution’, which means the normal sense of self is broken down and replaced by a sense of reconnection with themselves, others and the natural world. This experience is sometimes framed in a religious or spiritual way – and seems to be associated with improvements in well-being after the drug’s effects have subsided.” [Continue reading…]

Amanda Feilding, executive director of the Beckley Foundation, in an address she will deliver to the Royal Society tomorrow, says: I think Albert Hofmann would have been delighted to have his “Problem child” celebrated at the Royal Society, as in his long lifetime the academic establishment never recognised his great contribution. But for the taboo surrounding this field, he would surely have won the Nobel Prize. That was the beginning of the modern psychedelic age, which has fundamentally changed society.

After the discovery of the effects of LSD, there was a burst of excitement in the medical and therapeutic worlds – over 1000 experimental and clinical studies were undertaken. Then, in the early 60s, LSD escaped from the labs and began to spread into the world at large. Fuelled by its transformational insights, a cultural evolution took place, whose effects are still felt today. It sparked a wave of interest in Eastern mysticism, healthy living, nurturing the environment, individual freedoms and new music and art among many other changes. Then the establishment panicked and turned to prohibition, partly motivated by American youth becoming disenchanted with fighting a war in far-off Vietnam.

Aghast at the global devastation caused by the war on drugs, I set up the Beckley Foundation in 1998. With the advent of brain imaging technology, I realised that one could correlate the subjective experience of altered states of consciousness, brought about by psychedelic substances, with empirical findings. I realised that only through the very best science investigating how psychedelics work in the brain could one overcome the misplaced taboo which had transformed them from the food of the gods to the work of the devil. [Continue reading…]

Just to be clear, as valuable as this research is, it is an exercise in map-making. The map should never be confused with the territory.


Crawick Multiverse: A former opencast coal mine transformed into a cosmic landscape


Philip Ball writes: When work began in 2012, the excavations unearthed thousands of boulders half-buried in the ground. [Charles] Jencks used them to create a panorama of standing stones and sculpted tumuli, organised to frame the horizon and the Sun’s movements.

“One theory of pre-history is that stone circles frame the far hills and key points, and while I wanted to capture today’s cosmology, not yesterday’s, I was aware of this long landscape tradition,” Jencks says.

The landscape also explores the idea that our Universe is just one of many.

Over the last decade or so, the argument for a plurality of universes has moved from fringe speculation to seriously entertained possibility. One leading multiverse theory supposes that other universes are continually being spawned in an ongoing process of “eternal inflation” – the same process that caused our own Universe’s Big Bang 13.7 billion years ago.

These are the theories explored on this Scottish hillside. [Continue reading…]


The human mind as the preeminent scientific instrument

Walter Isaacson writes: This month marks the 100th anniversary of the General Theory of Relativity, the most beautiful theory in the history of science, and in its honor we should take a moment to celebrate the visualized “thought experiments” that were the navigation lights guiding Albert Einstein to his brilliant creation. Einstein relished what he called Gedankenexperimente, ideas that he twirled around in his head rather than in a lab. That’s what teachers call daydreaming, but if you’re Einstein you get to call them Gedankenexperimente.

As these thought experiments remind us, creativity is based on imagination. If we hope to inspire kids to love science, we need to do more than drill them in math and memorized formulas. We should stimulate their minds’ eyes as well. Even let them daydream.

Einstein’s first great thought experiment came when he was about 16. He had run away from his school in Germany, which he hated because it emphasized rote learning rather than visual imagination, and enrolled in a Swiss village school based on the educational philosophy of Johann Heinrich Pestalozzi, who believed in encouraging students to visualize concepts. While there, Einstein tried to picture what it would be like to travel so fast that you caught up with a light beam. If he rode alongside it, he later wrote, “I should observe such a beam of light as an electromagnetic field at rest.” In other words, the wave would seem stationary. But this was not possible according to Maxwell’s equations, which describe the motion and oscillation of electromagnetic fields.
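For readers who want the detail behind that impossibility, here is a compact sketch of the standard textbook argument (not Einstein’s own notation): in vacuum, Maxwell’s equations force every electromagnetic wave to travel at a fixed speed set by two constants of nature, so a light wave “at rest” is simply not a solution.

```latex
% Maxwell's equations in vacuum (no charges or currents):
\nabla \cdot \mathbf{E} = 0, \qquad \nabla \cdot \mathbf{B} = 0,
\qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t},
\qquad
\nabla \times \mathbf{B} = \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
% Taking the curl of the third equation, substituting the fourth,
% and using \nabla \cdot \mathbf{E} = 0 yields a wave equation:
\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \frac{\partial^2 \mathbf{E}}{\partial t^2}
% Its propagation speed is fixed by the constants themselves,
% with no reference to any observer's motion:
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^8~\mathrm{m/s}
```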

The conflict between his thought experiment and Maxwell’s equations caused Einstein “psychic tension,” he later recalled, and he wandered around nervously, his palms sweating. Some of us can recall what made our palms sweaty as teenagers, and those thoughts didn’t involve Maxwell’s equations. But that’s because we were probably performing less elevated thought experiments. [Continue reading…]


Humans are natural polymaths, at our best when we turn our minds to many things

Robert Twigger writes: I travelled with Bedouin in the Western Desert of Egypt. When we got a puncture, they used tape and an old inner tube to suck air from three tyres to inflate a fourth. It was the cook who suggested the idea; maybe he was used to making food designed for a few go further. Far from expressing shame at having no pump, they told me that carrying too many tools is the sign of a weak man; it makes him lazy. The real master has no tools at all, only a limitless capacity to improvise with what is to hand. The more fields of knowledge you cover, the greater your resources for improvisation.

We hear the descriptive words psychopath and sociopath all the time, but here’s a new one: monopath. It means a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests — in other words, the role-model of choice in the Western world. You think I jest? In June, I was invited on the Today programme on BBC Radio 4 to say a few words on the river Nile, because I had a new book about it. The producer called me ‘Dr Twigger’ several times. I was flattered, but I also felt a sense of panic. I have never sought or held a PhD. After the third ‘Dr’, I gently put the producer right. And of course, it was fine — he didn’t especially want me to be a doctor. The culture did. My Nile book was necessarily the work of a generalist. But the radio needs credible guests. It needs an expert — otherwise why would anyone listen?

The monopathic model derives some of its credibility from its success in business. In the late 18th century, Adam Smith (himself an early polymath who wrote not only on economics but also philosophy, astronomy, literature and law) noted that the division of labour was the engine of capitalism. His famous example was the way in which pin-making could be broken down into its component parts, greatly increasing the overall efficiency of the production process. But Smith also observed that ‘mental mutilation’ followed the too-strict division of labour. Or as Alexis de Tocqueville wrote: ‘Nothing tends to materialise man, and to deprive his work of the faintest trace of mind, more than extreme division of labour.’ [Continue reading…]


The art of attention: John Berger at 88

Philip Maughan writes: In 1967, while working with the Swiss photographer Jean Mohr on A Fortunate Man, a book about a country GP serving a deprived community in the Forest of Dean, Gloucestershire, John Berger began to reconsider what the role of a writer should be. “He does more than treat [his patients] when they are ill,” Berger wrote of John Sassall, a man whose proximity to suffering and poverty deeply affected him (Sassall later committed suicide). The rural doctor assumes a democratic function, in Berger’s eyes, one he describes in consciously literary terms. “He is the objective witness of their lives,” he says. “The clerk of their records.”

The next five years marked a transition in Berger’s life. By 1972, when the groundbreaking art series Ways of Seeing aired on BBC television, Berger had been living on the Continent for over a decade. He won the Booker Prize for his novel G. the same year, announcing to an astonished audience at the black-tie ceremony in London that he would divide his prize money between the Black Panther Party (he denounced Booker McConnell’s historic links with plantations and indentured labour in the Caribbean) and the funding of his next project with Mohr, A Seventh Man, recording the experiences of migrant workers across Europe.

This is the point at which, for some in England, Berger became a more distant figure. He moved from Switzerland to a remote village in the French Alps two years later. “He thinks and feels what the community incoherently knows,” Berger wrote of Sassall, the “fortunate man”. After time spent working on A Seventh Man, those words were just as applicable to the writer himself. It was Berger who had become a “clerk”, collecting stories from the voiceless and dispossessed – peasants, migrants, even animals – a self-effacing role he would continue to occupy for the next 43 years.

The life and work of John Berger represents a challenge. How best to describe the output of a writer whose bibliography, according to Wikipedia, contains ten “novels”, four “plays”, three collections of “poetry” and 33 books labelled “other”?

“A kind of vicarious autobiography and a history of our time as refracted through the prism of art,” is how the writer Geoff Dyer introduced a selection of Berger’s non-fiction in 2001, though the category doesn’t quite fit. “To separate fact and ­imagination, event and feeling, protagonist and narrator, is to stay on dry land and never put to sea,” Berger wrote in 1991 in a manifesto (of sorts) inspired by James Joyce’s Ulysses, a book he first read, in French, at the age of 14. [Continue reading…]


How a corporate cult captures and destroys our best graduates

George Monbiot writes: To seek enlightenment, intellectual or spiritual; to do good; to love and be loved; to create and to teach: these are the highest purposes of humankind. If there is meaning in life, it lies here.

Those who graduate from the leading universities have more opportunity than most to find such purpose. So why do so many end up in pointless and destructive jobs? Finance, management consultancy, advertising, public relations, lobbying: these and other useless occupations consume thousands of the brightest students. To take such jobs at graduation, as many will in the next few weeks, is to amputate life close to its base.

I watched it happen to my peers. People who had spent the preceding years laying out exultant visions of a better world, of the grand creative projects they planned, of adventure and discovery, were suddenly sucked into the mouths of corporations dangling money like angler fish.

At first they said they would do it for a year or two, “until I pay off my debts”. Soon afterwards they added: “and my mortgage”. Then it became, “I just want to make enough not to worry any more”. A few years later, “I’m doing it for my family”. Now, in middle age, they reply, “What, that? That was just a student fantasy.” [Continue reading…]


Why the singularity is greatly exaggerated

Ken Goldberg, Professor of Industrial Engineering and Operations Research at the University of California, Berkeley, interviewed by Jeanne Carstensen.

In 1968, Marvin Minsky said, “Within a generation we will have intelligent computers like HAL in the film 2001.” What made him and other early AI proponents think machines would think like humans?

Even before Moore’s law there was the idea that computers are going to get faster and their clumsy behavior is going to get a thousand times better. It’s what Ray Kurzweil now claims. He says, “OK, we’re moving up this curve in terms of the number of neurons, number of processing units, so by this projection we’re going to be at super-human levels of intelligence.” But that’s deceptive. It’s a fallacy. Just adding more speed or neurons or processing units doesn’t mean you end up with a smarter or more capable system. What you need are new algorithms, new ways of understanding a problem. In the area of creativity, it’s not at all clear that a faster computer is going to get you there. You’re just going to come up with more bad, bland, boring things. That ability to distinguish, to filter out what’s interesting, that’s still elusive.

Today’s computers, though, can generate an awful lot of connections in split seconds.

But generating is fairly easy and testing pretty hard. In Robert Altman’s movie, The Player, they try to combine two movies to make a better one. You can imagine a computer that just takes all movie titles and tries every combination of pairs, like Reservoir Dogs meets Casablanca. I could write that program right now on my laptop and just let it run. It would instantly generate all possible combinations of movies and there will be some good ones. But recognizing them, that’s the hard part.
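The brute-force generator Goldberg describes really would fit on a laptop. A minimal sketch in Python, with a stand-in list of titles (a real run would load thousands):

```python
from itertools import combinations

# Stand-in catalogue; illustrative only, not Goldberg's program.
titles = ["Reservoir Dogs", "Casablanca", "Alien", "Groundhog Day"]

# Generate every pairing -- the "easy" half of the problem.
pitches = [f"{a} meets {b}" for a, b in combinations(titles, 2)]

for pitch in pitches:
    print(pitch)
# Recognizing which pitches are any good is the part no loop solves.
```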

That’s the part you need humans for.

Right, the Tim Robbins movie exec character says, “I listen to stories and decide if they’ll make good movies or not.” The great majority of combinations won’t work, but every once in a while there’s one that is both new and interesting. In early AI it seemed like the testing was going to be easy. But we haven’t been able to figure out the filtering.

Can’t you write a creativity algorithm?

If you want to do variations on a theme, like Thomas Kinkade, sure. Take our movie machine. Let’s say there have been 10,000 movies — that’s 10,000 squared, or 100 million combinations of pairs of movies. We can build a classifier that would look at lots of pairs of successful movies and do some kind of inference on it so that it could learn what would be successful again. But it would be looking for patterns that are already existent. It wouldn’t be able to find that new thing that was totally out of left field. That’s what I think of as creativity — somebody comes up with something really new and clever. [Continue reading…]
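To make the classifier idea concrete, here is a hedged sketch using scikit-learn; the features (genre overlap, normalised box office), the numbers, and the labels are all invented for illustration, not anything from the interview:

```python
from sklearn.linear_model import LogisticRegression

# Toy feature vectors for movie pairs: [genre_overlap, avg_box_office_norm]
# Labels: 1 = the combination "worked", 0 = it didn't. All values invented.
X = [[0.9, 0.8], [0.1, 0.3], [0.7, 0.6], [0.2, 0.9], [0.8, 0.2], [0.0, 0.1]]
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# The model can only interpolate patterns already present in its training
# data -- exactly Goldberg's point: it will never flag the left-field outlier.
print(model.predict_proba([[0.85, 0.7]]))
```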


How Yitang Zhang rose from obscurity and a disadvantaged youth to mathematical celebrity

Thomas Lin writes: As a boy in Shanghai, China, Yitang Zhang believed he would someday solve a great problem in mathematics. In 1964, at around the age of nine, he found a proof of the Pythagorean theorem, which describes the relationship between the lengths of the sides of any right triangle. He was 10 when he first learned about two famous number theory problems, Fermat’s last theorem and the Goldbach conjecture. While he was not yet aware of the centuries-old twin primes conjecture, he was already taken with prime numbers, often described as indivisible “atoms” that make up all other natural numbers.

But soon after, the anti-intellectual Cultural Revolution shuttered schools and sent him and his mother to the countryside to work in the fields. Because of his father’s troubles with the Communist Party, Zhang was also unable to attend high school. For 10 years, he worked as a laborer, reading books on math, history and other subjects when he could.

Not long after the revolution ended, Zhang, then 23, enrolled at Peking University and became one of China’s top math students. After completing his master’s at the age of 29, he was recruited by T. T. Moh to pursue a doctorate at Purdue University in West Lafayette, Ind. But, promising though he was, after defending his dissertation in 1991 he could not find academic work as a mathematician.

In George Csicsery’s new documentary film Counting From Infinity, Zhang discusses his difficulties at Purdue and in the years that followed. He says his doctoral adviser never wrote recommendation letters for him. (Moh has written that Zhang did not ask for any.) Zhang admits that his shy, quiet demeanor didn’t help in building relationships or making himself known to the wider math community. During this initial job-hunting period, Zhang sometimes lived in his car, according to his friend Jacob Chi, music director of the Pueblo Symphony in Colorado. In 1992, Zhang began working at another friend’s Subway sandwich restaurant. For about seven years he worked odd jobs for various friends.

In 1999, at 44, Zhang caught a break. [Continue reading…]
