By Rob Brooks
Ultimately, our ability to convincingly lie to each other may have evolved as a direct result of our cooperative nature.
Thus concludes the abstract of a new paper in the journal Proceedings of the Royal Society B that considers the evolution of “tactical deception” using a theoretical model and a comparative study of primates.
I’m interested to see how the news media handle this paper. Because the main conclusion – that lying is a way of exploiting others’ cooperative behaviour – seems awfully obvious. But I suspect the true value of today’s paper is a bit more nuanced.
Many species – most notably our own – have evolved quite extraordinary capacities to cooperate. We might take cooperation as an obvious facet of life, but long-term cooperative gain requires a willingness to put aside narrow self-interest in the short term. And that doesn’t evolve easily.
Cooperation makes it possible for some individuals to cheat, prospering off the cooperative efforts of others. Cooperate too readily and you might get taken for a ride. Cooperate only grudgingly and you don’t reap the benefits of working together.
Evolutionary biologists and economists find that even the simplest models of cooperation – such as the prisoner’s dilemma game, explained in the video below – can lead to complex rules about when an individual should cooperate and when it should try to cheat.
The prisoner’s dilemma.
Peer into the natural world, and the range of possible behavioural patterns that have evolved to fetter cheating and allow cooperation to flourish becomes even more complex.
In some species individuals reciprocate directly. Well-fed vampire bats regurgitate blood meals for starving bats that have helped them avoid starvation (also by regurgitating) in the past. Others reciprocate less directly.
A rat that has been helped by another rat, for example, is more likely to help a third individual to obtain food than is a rat that has not been helped before.
Captive vampire bats sharing food by regurgitation.
And in many animal societies, including bees, ants and naked mole rats, transgressions get punished and cooperative behaviour rewarded.
Humans do all these things too. They also share information – like gossip – and prefer to cooperate with those who have good reputations. This makes human cooperation almost infinitely complex.
Anthropologists go to great lengths to understand how reputations are earned and regulated. In one recent study a team embedded itself in a Dominican village for nearly two years and counted the number of prosocial acts each person engaged in, and how many people they helped.
The new model in the Royal Society B paper – based on the prisoner’s dilemma – suggests the evolution of cooperation led also to the evolution of lying.
Rather than simply cheating – trying to gain from another’s cooperative behaviour without behaving cooperatively yourself – this model adds another way of operating. By misleading the other individual, one can trick that individual into cooperating.
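McNally and Jackson’s actual model is more elaborate than this, but the core intuition – a liar can exploit credulous cooperators in a way an honest cheat cannot – can be sketched in a few lines of Python. The payoff values and strategy names below are illustrative assumptions, not the paper’s:

```python
# One-shot prisoner's dilemma payoffs, with the usual ordering T > R > P > S.
T, R, P, S = 5, 3, 1, 0  # temptation, reward, punishment, sucker

def payoff(my_move, partner_move):
    """My payoff given both moves (True = cooperate, False = defect)."""
    if my_move:
        return R if partner_move else S
    return T if partner_move else P

# Each strategy returns (signal, move): the signal is what the
# player *claims* it will do before the game is played.
def honest_cooperator():
    return True, True    # says "cooperate", cooperates

def honest_cheat():
    return False, False  # says "defect", defects

def liar():
    return True, False   # says "cooperate", then defects

def credulous_partner(signal):
    """A conditional cooperator that trusts signals: it cooperates
    only with partners who signal cooperation."""
    return signal

def score(strategy):
    signal, move = strategy()
    return payoff(move, credulous_partner(signal))

for name, strategy in [("cooperator", honest_cooperator),
                       ("cheat", honest_cheat),
                       ("liar", liar)]:
    print(name, score(strategy))
```

Against a partner who believes signals, the liar earns the temptation payoff, the honest cooperator the mutual-cooperation reward, and the honest cheat only the mutual-defection payoff – which is exactly why deception only pays in a population that cooperates.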
In looking for an example, I keep returning to perhaps the only memorable line from Ricky Gervais’ otherwise forgettable film The Invention of Lying: “The world’s gonna end unless we have sex right now!”.
The Invention of Lying – watch for the least forgettable line at 1:40, and the inevitable response.
In The Invention of Lying, Mark Bellison (Gervais’ character) does so well out of his novel ability to lie precisely because of the cooperative nature of the lie-free society he inhabits. Lying about the world in order to cheat works well if liars don’t get too common or too brazen. If they do, the whole cooperative edifice collapses.
McNally and Jackson back up their modelling work with an analysis of primate species in which they show the more cooperative species also have higher rates of deception. It is the cooperation itself that permits the evolution of the liar.
This paper might seem a little obvious and more than a little simplistic. It certainly does to me. But models of this nature do a great service by putting our intuitions to the test. And they can later be developed and elaborated to illuminate more difficult questions.
I would like to see if it can help us understand the fine-scale tensions between cooperation and dishonesty in human affairs. There is a lot more to lying than simply misrepresenting the world.
The liar often deceives him or herself as well – possibly in order to put a more convincing gloss on the lie.
Neuroscientist Sam Harris recently published Lying, a short e-book arguing we can both simplify our own lives and build better societies by telling the truth in situations when we might be tempted to lie.
Harris doesn’t just mean the whoppers typical of fraudsters, philanderers and politicians. He is especially concerned with the “white” lies that many of us tell in order to spare others discomfort and the corrosive effects they have on societies.
He seems to be advocating we try to build a lie-free world, such as the one in The Invention of Lying. But his suggestions go beyond a hopeful “We’d all be better off if we just told the truth. Mmmkay?”.
Harris gets bottom-up processes and the conflict between individual benefits and group functioning. His book is worth a read for his impassioned argument that each of us, as individuals, would benefit from resisting the urge to lie.
I’m not convinced. What would help right now is some theoretical and empirical evidence showing the conditions under which Harris’ prescriptions might work. And that’s the beauty of papers like today’s one from McNally and Jackson.
Irrespective, a better understanding of how lying evolves, no matter how simple, might do enormous social good.
For one thing it might help constrain the worst dishonesties in politics, public relations and propaganda.
Rob Brooks receives funding from the Australian Research Council.
By Kenneth McNamara, University of Cambridge
Australia is famous for its natural beauty: the Great Barrier Reef, Uluru, Kakadu, the Kimberley. But what about the places almost no one goes? We asked ecologists, biologists and wildlife researchers to nominate five of Australia’s unknown wonders.
It is a testament to the size and isolation of many parts of Australia that it wasn’t until 1947 that the second largest meteorite crater in the world was discovered. Known as Wolfe Creek Crater, this imposing feature is located about 145km from Halls Creek in the Kimberley region of Western Australia. It can be reached after a two to three-hour drive down the Tanami Road, only accessible to conventional vehicles during the dry season.
Its discovery came during an aerial survey of this part of the Kimberley region, when geologists Frank Reeves and NB Sauve, along with pilot Dudley Hart, spotted an unusual circular structure almost a kilometre in diameter. Naturally intrigued by what they saw, they were keen to inspect it a little closer.
Two months later, Reeves and Hart reached the site on foot and made the first detailed investigation. Their suspicion that it was a deep crater was confirmed after they climbed up the outer sloping flanks of the structure and looked down to the floor, some 30 metres below. As they made their way up the slope of the crater rim they would have seen rusty balls of rock scattered on the ground or fused to the laterite.
Known as “shale balls”, these rusty rocks are the deeply weathered remains of an iron meteorite that exploded when it collided with Earth about 300,000 years ago – clear and stark evidence of what made the crater.
The hole that it gouged out of the Devonian age quartzite rocks varies in diameter from 870 to 950 metres. The only bigger crater undoubtedly made by a meteorite impact is Meteor Crater in Arizona.
Travelling at cosmic velocity, about 15km per second (that’s 40 times faster than a bullet from a high-powered rifle, or like crossing Australia in less than five minutes), the massive chunk of iron would have exploded on impact with the earth. Most of the meteorite, which was probably getting on for 100,000 tonnes in weight, would have been vaporised, along with huge quantities of the quartzite rock into which it ploughed.
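Those speed comparisons are easy to check with back-of-envelope arithmetic; the bullet speed and continent width below are rough assumed figures, not values from the article:

```python
impact_speed = 15_000        # m/s - the "cosmic velocity" quoted above
bullet_speed = 375           # m/s - a high-powered rifle round (assumed)
australia_width = 4_000_000  # m - rough east-west extent (assumed)

print(impact_speed / bullet_speed)          # 40.0 - about 40 times a bullet
print(australia_width / impact_speed / 60)  # minutes to cross Australia (under 5)
```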
Very little of the meteorite remains today, but sufficient has been collected for us to be sure that this was what made the crater. Like all iron meteorites, this one contained small quantities of nickel. In the weathered shale balls this nickel has been incorporated into what turned out to be a nickel-iron carbonate mineral not found anywhere else and which was new to science. This was named “reevesite” after Frank Reeves.
Although only “discovered” in 1947, the structure had long been known to the local Indigenous people, probably for thousands of years. The local Djaru people call the crater Kandimalal. In their dreamtime stories two rainbow snakes crossed the desert and in doing so formed the nearby Sturt Creek and Wolfe Creek. The crater is the place where one of the snakes emerged from the ground.
Huge quantities of sand have blown into the crater since it was formed, so it would have originally been far deeper. Its base is essentially flat, except for a slight rise in the centre. This is a feature of many meteorite craters and represents where the earth rebounded following the explosion.
There are a number of sink holes in the centre and unusually large trees grow here. These are mainly species of Acacia and Eucalyptus, some growing up to 8m high. It is likely that the trees draw on the summer water that is trapped in these sink holes. The relatively large number of dead trees interspersed with healthy ones attests to periods of lower rainfall. The sink holes are arranged on two intersecting lines and probably reflect the location of stress fractures formed by the explosion at the base of the crater.
As well as higher moisture levels, this central vegetation patch has higher soil salinity and nitrate content. One of the consequences of the higher soil moisture content has been the production of a circular, darker patch within which more vegetation grows. So when viewed from above, the crater looks remarkably like a huge eye peering up at the sky.
Wolfe Creek Crater’s future looks pretty secure. It has legislative protection in the form of Class A Reserve status in a National Park. Its isolation also affords it an added protection.
Fortunately the chances of the crater sitting on a huge resource of iron ore are remote. Virtually all of it would have been pulverised when this traveller from the asteroid belt made its violent contact with the Earth, at a time when the only terrestrial inhabitants to have viewed the spectacle would have been a few bemused giant kangaroos and diprotodontids.
Next: Christmas Island.
I have no conflicts of interest.
By Ken Cheng
Have you ever noticed that even detailed, sophisticated virtual reality experiences don’t feel completely “real”?
It all comes down to your inner ear – and a study published earlier this month using rats may help explain why this is the case.
Researchers from the University of California, Los Angeles, let rats run along a virtual narrow hall and measured their brain activity, then compared these virtual-world rats with rats running along a real hall (real-world rats).
Even when the rats could move in a virtual world, their sense of space was less than fully normal, at least as far as their brain activities – namely the firing of their “place cells,” explained below – showed.
The researchers measured activity in a much-studied part of the brain known to play a crucial role in spatial cognition and memory in general: a seahorse-shaped structure called the hippocampus (named after the genus name of seahorses).
Place cells have place-specific firing properties: they fire a lot only when the animal is at a particular place in space.
The study’s authors wanted to find out if hippocampal cells with place cell properties were as abundant in virtual-world rats as in real-world rats.
The researchers’ virtual rat world was far more realistic than the button-and-console setups familiar to video-game aficionados.
Their rats got to move, at least on the spot, on a big ball (see panel A above).
The ball rotated under them as they walked, so that they never actually got anywhere in real space.
But in the virtual world, the visual input moved as it should when a rat actually moves through the space: the projected visual world was cleverly linked to the ball movement via a computer program (panels B and C).
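The closed loop can be sketched in a few lines: each measured rotation of the ball becomes a displacement of the virtual viewpoint, so the scene scrolls as if the rat had really walked. The names and the one-to-one gain here are assumptions for illustration, not the study’s actual software:

```python
def update_viewpoint(position, ball_dx, ball_dy, gain=1.0):
    """Map treadmill-ball rotation (metres of surface travel) to
    movement of the virtual viewpoint."""
    x, y = position
    return (x + gain * ball_dx, y + gain * ball_dy)

# The rat walks 5 cm "forward" on the ball; the projected hall
# scrolls past by the same amount while the rat stays in place.
viewpoint = update_viewpoint((0.0, 0.0), 0.0, 0.05)
```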
The virtual-world rats thus had visual cues as well as a bunch of bodily cues – those stemming from their limbs – as they moved in the virtual world.
Basically, all that was missing were the cues from the vestibular apparatus in the inner ear, which tell an animal whether it is actually accelerating (or not).
We have vestibular apparatuses in our inner ears as well, and they contain sensory hairs in fluid-filled chambers.
When a rat (or a human) moves its head, the fluid (endolymph) sloshes and displaces the sensory hairs, causing them to fire signals to the brain.
The pattern of firing tells us how the head is moving.
The vestibular apparatuses of virtual-world rats sensed little displacement as the rats ran on the track ball, because the rats were strapped in place.
Even though only the vestibular input was abnormal in the virtual world, the authors discovered that hippocampal place cells were much harder to find in virtual-world rats than in real-world rats.
The place cells in virtual-world rats also had wider fields, meaning they were less precise in defining a place.
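A place field’s “width” can be pictured with a standard textbook model – a Gaussian tuning curve – in which the firing rate peaks at the field’s centre and falls off with distance. This is an illustration of the concept, not the analysis used in the study:

```python
import math

def place_cell_rate(x, centre, width, peak=20.0):
    """Gaussian place field: firing rate (spikes/s) is highest when
    the animal is at the field centre and decays with distance."""
    return peak * math.exp(-((x - centre) ** 2) / (2 * width ** 2))

# A precise (narrow) field versus a broad one, sampled 10 cm from
# the field centre on a 1 m track.
narrow = place_cell_rate(0.6, centre=0.5, width=0.05)
broad = place_cell_rate(0.6, centre=0.5, width=0.3)
```

The broader the field, the less the firing rate changes as the animal moves, so anything reading the cell downstream gets a fuzzier estimate of position – which is what “less precise in defining a place” means.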
The sense of space, as philosopher Rene Descartes proposed in the 17th century, seems to be a multimodal sense, and it is only complete when all the senses — visual, bodily, vestibular, and probably olfactory and auditory as well — deliver their spatial information.
For rat neuroscience, it means that this beautiful and elegant virtual world has its limits in probing spatial cognition.
It would be wonderful to probe a rat’s brain as it virtually travelled its natural scale of hundreds or thousands of metres, rather than in the one- to two-metre experimental arenas typically foisted on lab rats.
But this research shows that virtual-world rats’ place cells would not “behave” as they do in the real world.
These results may explain why virtual reality scenarios, such as stationary flight simulators, don’t completely fool our brains into thinking we’re in a different world.
But what about games in head-mounted virtual reality?
The head-mounted system projects a virtual scene to our visual system via goggles, and as we actually move — hopefully in a huge space without obstructions — the visual world changes accordingly.
With appropriate smells and sounds added, this would get all our senses orchestrated.
It’s quite the challenge for developers, but ah – such is the stuff as dreams are made on.
Ken Cheng receives funding from the Australian Research Council.
By Adam Best, CSIRO
Imagine having a wafer-thin touchscreen on your sleeve which, like a scene out of a Philip K. Dick novel, gives you all the functionality of a smartphone without the awkwardness of a cumbersome battery.
The best part about this scenario is it may not be as far from reality as you think.
The bulky packaging of batteries limits the design of some of today’s amazing new, ultra-slim electronics.
If you open up an iPhone 5, you’ll see that a large proportion of the phone’s volume is taken up by the battery.
Piezo is a term derived from a Greek word meaning to squeeze or press, and piezoelectric materials generate electricity when they are pressured or twisted – enabling development of shoes, clothing or recreational gear that generate electricity from movement.
You could have a jacket that’s a wearable mobile phone, with a flexible electronic screen printed on the cuffs, flexible phone circuit boards woven into the fabric, and a microphone in the collar. Just input the number on your cuff, and start talking.
A flexible Nokia prototype.
There would be no risk of running out of power mid-conversation – you’d just move your arms about to charge the batteries. The jacket would be washable – but you’d need another “phone jacket” on the day it was in the wash.
In a similar advance, a cocktail dress wired for Bluetooth that lights up when the wearer gets an incoming call won a design competition for a London College of Fashion student. It’s hard to ignore that kind of phone call.
The same technology can power smart sports clothing, so that your jogging suit or cycling shorts could carry a global positioning system (GPS) unit and a fitness monitor powered by your movements as you pedal or run.
You could monitor and analyse your workout before you hit the shower. Australian Football League players already optimise their playing performance using GPS trackers carried between their shoulder blades.
In leisure applications, flexible batteries have multiple applications. Incorporating a flexible battery into a backpack means that you could charge your phone, MP3 player, or GPS unit from your backpack after a day’s hike.
Tent panels that incorporate flexible batteries could capture wind energy and use it to charge a personal computer, GPS or personal locator beacon, such as an Emergency Position Indicating Radiobeacon (EPIRB) – making searches for skiers and hikers lost in the bush easier.
The same technology is being used to capture and store energy to power military electronic equipment, so that soldiers in the field don’t have to carry extra weight of batteries.
On a larger scale, buildings made with fabrics which capture wind energy are proposed as a renewable power source.
Footpaths which capture energy from footsteps are on the market – there is even a dance floor which powers the air conditioning for an eco-nightclub using kinetic energy (energy of motion) from dancing patrons.
In medical care, wearable electronics provide an opportunity for revolutionary advances. Wearable electronic shirts capable of powering automated injection devices could make life easier for patients who need regular injections of insulin or other drugs.
A shirt incorporating a heart rate monitor could sense changes in heart rhythm in elderly or cardiac-impaired patients – and summon emergency services if patient health is severely compromised by a change in heart rate.
Best of all, wearable electronics provide non-intrusive monitoring – enabling patients to pursue normal daily life activities under medical care.
But for those of us for whom batteries are a constant item on the shopping list, perhaps the biggest change that wearable electronics offers is freedom from needing to buy spare batteries.
Now that will make a difference to my personal retail experience.
Adam Best has received funds from the Australian Defence Department’s Capability and Technology Demonstrator (CTD) program and from CSIRO’s National Research Flagships Energy Transformed and Future Manufacturing.
By Andrew Whitehouse, University of Western Australia
One of the most enjoyable aspects of my job is the talks that I am asked to give to graduating university classes and at awards ceremonies.
Below is a talk in which I ask whether a science career is worth it. The transcript is provided below the video.
Is a scientific career worth it?
Thank you for inviting me today and congratulations to all the award recipients.
The invite for today asked me to reflect on the highs of my career.
The problem, I’m afraid, is that I don’t have too much on which to reflect. It would be like asking a 400 metre runner on the first bend to reflect on the decent burst he made at the 50 metre mark. Or perhaps asking a trapeze artist this same question at the point where they’ve climbed up the long ladder and just gripped the swing.
Plus, is there anything worse than a bespectacled scientist standing up and flexing his nerd-tacular muscles – surely this would make even my mother’s eyelids droop, and send the rest of you into an unrousable coma.
Instead, I thought that today I would talk about you. I thought I’d talk about your future and what lies ahead of you. I wanted to look at your next 30, 40, even 50 years, and ponder your path. Is a scientific career something of value? Is a scientific career good for you? Is a scientific career something that I would recommend to you? In essence, is a scientific career worth it?
I’ll cut out the jibber-jabber and jump straight to the answer.
The answer is yes. Simply, yes.
Most people would agree that science is a noble pursuit. But, then again, most people don’t really have full knowledge about what a scientific career entails.
I say ‘yes, it is worth it’ not just because I was brought here and was served a free brekkie – though I am very grateful for it. I say ‘yes’ because after close to a decade of being in full-time research, I hand-on-heart believe this.
So, where’s the problem? Science is a worthy career, and I encourage you to pursue it. That’s my message, thanks for the free brekkie. Bada-bing Bada-boom.
But, of course, things aren’t that simple.
You see, the problem – and you will discover this sooner rather than later – is that a scientific career is far from easy, and the obstacles are almost certainly greater than you expect.
Let’s just take my path as a full-time scientific researcher as an example.
In this profession, you’re unlikely to ever have complete job security.
You will be forced to engage in a fierce competition with exceptionally talented people to obtain the few jobs that are available.
You will be set extraordinarily high benchmarks, and then appraised against these to within an inch of your life by people who often know nothing more about you than what is written on a piece of paper.
At times, you will pour your time, energy, heart and soul into an idea, an experiment, a paper – and make painful choices about the time you spend with your family and friends in order to see these through.
At times, these ideas, experiments and papers will tumble down, either in a ball of flames, or even worse, without so much as a yelp.
At times, you’ll see people smiling and waving at you on your way down.
An image I often picture is of a tree full of scientists, each hanging on to a branch with all of their might. There is a force down below that has gripped the trunk and is shaking it hard and often. In a scientific career, it is all too easy to lose your grip and be shaken off.
Sure, to a certain degree, these are the same challenges that you’ll encounter in many careers. But the obstacles are so frequent and incessant, that I think science has a specially-reserved place in the pantheon of unwise career choices.
I don’t say this for sympathy, nor for you to start slow clapping as you raise me on your shoulders and carry me triumphantly out of the room.
I say it simply to remind you just how much you have to want this.
I started my working life as a Speech Pathologist, and had fully intended to pursue a career as a clinician. The only thing standing in my way was a crippling lack of talent in this area.
My first clinical job was helping children with autism improve their communication and social skills. It was enormously fulfilling work. The problem, though, was that I’m sure I got more out of the work than the families did – which is certainly not the right way around. My ‘aha’ moment came when a young boy, all of 7 or 8 years of age, didn’t take too kindly to my clumsy attempts at therapy, and grabbed my glasses and snapped them clean in half. The rest of the session was done entirely by feel.
But I had enormous empathy for these families and I was still desperate to help. Life after a diagnosis of autism can be extremely tough. The emotional, physical and financial toll on families is difficult to fathom unless you’ve been through it. I was working with extraordinary children, from extraordinary families, in extraordinary circumstances. I wanted to help. And that’s where scientific research came in.
I had always been attracted to science. To me, science not only enables us to observe the myriad wonders of our world, it also allows us to make sense of it. We can take those questions that we may have thrown around at the pub, make predictions based on our thoughts, and test these for their accuracy. Science doesn’t take away from the beauty and mystery of the world, it enhances the wonder by providing answers and offering even more questions.
For my situation, science provided a way that I might be able to help families touched by autism. By not only getting to the bottom of what may lead to autism, but by also devising new therapies, we may be able to help people with autism live the most fulfilling life possible. This, I felt, was a worthy goal to which I could contribute.
So, with those thoughts firmly in my mind, I embarked on a fairly typical scientific path that took in the sights of a PhD, a Postdoc, a Research Fellowship, and now, a Professorship.
I encountered all of the difficulties experienced by many in this line of work: the ever-present threat of not having a salary; the soul-sapping competition for finite research funding; the cutting daily appraisals by people unknown to me; the pressure that comes with feeling that standing still is actually moving backwards; and, of course, knockback after knockback.
I succeeded in some of these challenges, and failed without question in others. I’ve had times of unrestrained joy, and times of aching sorrow. Blissful contentment and fist-clenching madness. Exhilaration. Frustration.
But – and let me be clear about this again – a scientific career is absolutely worth it.
The trick is to find your way through the mire, while not just holding on to that branch of the scientific tree, but also by resolutely holding onto who you are as a person.
I want to offer to you today that achieving the first of these without the second is actually not success at all. Conquering your scientific goals and becoming famous amongst your peers (hell, even remaining in a scientific job) is worthless if you lose your own essence in the process.
I am not here presenting myself as the wise, old elephant that dispenses advice. I certainly don’t meet these criteria – though, at 31, some would argue that I’m encroaching on one of these.
The point that I wish to make is that the road ahead for you is tough, and the obstacles you encounter make it all too easy to focus all of your energies on surviving, and not nearly enough on thriving. Life without thriving is barely life at all. This is not what I want for you; for me; for any scientist.
What I first encourage you to do is to have a clear understanding of why you want a career in science. Sit down, think hard, scribble notes. Keep throwing around ideas until you land on the exact reason why this career is for you.
For me, it was to find better ways to help people. For you, it may be curiosity – to understand the geology of Mars, or to fathom the complex dances of the African Honey Bee. It may be adventure – to travel to wonderful places, meet fascinating people, and work out how to answer the unsolved mysteries of our world. It may simply be because you like the idea of being able to wear shorts and thongs to work.
It can be whatever reason works for you. But whatever motivation you find, it has to have a strong enough grip to hold onto that shaking tree with one hand, and to hold onto you with the other.
The second piece of encouragement is for you to find a way to remind yourself that your scientific career is only one part of your life.
Many strange words and phrases will be thrown at you over your career: H index, impact factors, citation numbers. All of these once had a noble purpose, but now do nothing more than distract scientists and judge them unfairly.
They also lead you to think that life is about how well you stack up against other people.
But – and I hope that this isn’t too big a piece of news for you – this isn’t even close to what life is about. The love of your partner, your children, your parents, your friends is not dependent on how high your H-index climbs. The telling truth is that your own love for yourself should not hinge on this either.
By all means, aspire for the extraordinary. Nobody wants to stand in the way of that. But, while you’re doing this, always remember to cherish the ordinary. Smile at a sunset, listen for the birds, savour your first coffee of the day. Those say more to me, and to the vast majority of people in this world, than an H-index of 60.
Is a career in science worth it? Absolutely. Is it worth making the centre of your life? Absolutely not. No career is.
Every day, every moment, you have a choice. Do I choose to bog myself in the quagmire of pessimism, or do I choose to focus on the astonishing beauty and wonder of our lives?
In this career, I’ve met some of the most wonderful people you could ever hope to meet. I’ve also encountered some of the most challenging. Who do I choose to remember?
I have received jaw-droppingly nice words from scientific peers, and I have also received comments that I can scarcely believe another human could write to another. What do I choose to remember?
I have run into brick wall after brick wall at work, and then got home to receive a kiss from my partner and a lick from my dog. What do I choose to remember?
This is not trite. This is life. And in your scientific careers, you will have this same choice over and over again. The thing to remember – and remember this you must – is that nobody will be watching your decisions except you.
I wish you every success as you embark on your exciting journey. Science is a wonderful career that can fulfil you like I believe few professions can.
Seek out knowledge; embrace experiences; and enjoy the company you make. And whatever you do, hold on and hold on tight. Because success in a science career – like success in life – comes from those who can hold on the longest.
Andrew Whitehouse does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.
By Sunanda Creagh, The Conversation
Commander Hadfield, an avid Twitter-user who recently released a video of himself singing the David Bowie classic Space Oddity, joined Soyuz Commander Roman Romanenko of the Russian Federal Space Agency (Roscosmos) and NASA Flight Engineer Tom Marshburn on the long trip home from the station.
The astronauts prepare for their trip home.
“Russian recovery teams were on hand to help the crew exit the Soyuz vehicle and adjust to gravity after 146 days in space,” NASA said in a statement on its website.
“The undocking marked the end of Expedition 35 and the start of Expedition 36 under the command of Russian cosmonaut Pavel Vinogradov, who is scheduled to remain on the station with Flight Engineers Chris Cassidy and Alexander Misurkin until September.”
NASA said Hadfield, Marshburn and Romanenko spent their last morning on the station packing, including hardware from an experiment examining how gases and liquids come together and separate in space.
“Results from this experiment may lead to improvements in the shelf-life of household products, food and medicine,” NASA said.
Kevin Orrman-Rossiter, Senior Research Services Officer at the University of Melbourne’s Faculty of Science, said the International Space Station provides humans with a place to live in space.
“Until we get past a certain point that commercial interests take over and it becomes the ‘next’ exotic holiday, space exploration still requires substantial effort. In this case a multinational effort, demonstrating that people can work across political and national boundaries,” he said.
“For me, an interesting outcome from this last expedition was the difference one person made. Commander Chris Hadfield connected widely with people across the world using Twitter and fantastic pictures of our world from a view we cannot normally see.”
Alice Gorman, Lecturer in Archaeology at Flinders University and an expert on space junk, said the International Space Station has many critics who see it as a white elephant.
“But I think it’s quite important for a couple of reasons – firstly because it’s an international cooperative venture, and this is an important antidote to the strong current of national security and military interests in space,” she said.
“As for the other reason, think about what it would be like if there were no humans in space at all. I think we’d feel a little closed in, like we had poked our head out of the atmosphere for a look and then scurried back into the atmosphere before anything scary happened.”
While robotic missions are far more efficient than crewed missions, the space station was about more than scientific efficiency, she said.
“It’s about making space a human place, and I think that’s important.”
Jonti Horner, Postdoctoral Research Fellow at the University of New South Wales and an expert on exoplanets, said that the most important thing about the International Space Station is simply that it’s there.
“When we went to the Moon, in the late 60s and early 70s, it was viewed as being the first step towards a future where humanity was spreading out to the stars (manned bases on the Moon, and Mars).
“Since the last man walked on the Moon, though, 40 years have passed – and since then, men and women have never left relatively low Earth orbit,” he said, adding that many important experiments had been done on the station, taking advantage of its microgravity conditions.
“It’s kind of important to have some manned presence up there, I feel – if nothing else, just to show we haven’t totally abandoned the idea of getting off our planet.”
By Amy Reichelt
Neurological disorders can have a devastating impact on the lives of sufferers and their families.
Drug treatments are often ineffective in these disorders. But what if there was a way to simply switch off a devastating tremor, or boost a fading memory?
Recent advances using Deep Brain Stimulation (DBS) in selected brain regions have provided therapeutic benefits, allowing those affected by these neurological disorders freedom from their symptoms in the absence of a cure.
Artificial cardiac pacemakers are typically associated with controlling and resynchronising heartbeats by electrical stimulation of the heart muscle.
In a similar manner, DBS sends electrical impulses to specific parts of the brain that control discrete functions. This stimulation evokes control over the neural activity within these regions.
Prior to switching on the electrical stimulation, electrodes are surgically implanted within precisely targeted brain regions that control specific functions.
The neurosurgery is conducted under local anaesthetic so the patient remains conscious, which helps ensure that the electrode does not damage critical brain regions.
The brain itself has no pain receptors, so it does not require anaesthetic.
Following recovery from surgery, the electrodes are activated and the current is calibrated by a neurologist to determine the optimal stimulation parameters.
The patient can then switch the electrodes on or off using a remote battery-powered device.
Deep Brain Stimulation surgery.
Perhaps the most documented success of DBS is in the control of tremors and motor coordination in Parkinson’s disease, which involves the progressive deterioration of dopamine-producing neurons in the substantia nigra.
Deterioration of these neurons reduces the amount of dopamine available to be released in a brain area involved in movement, the basal ganglia.
The administration of L-DOPA temporarily reduces the motor symptoms by increasing dopamine concentrations in the brain. However, side effects of this treatment include nausea and disordered movement.
DBS has been shown to provide relief from the motor symptoms of Parkinson’s disease and essential tremor.
For the treatment of Parkinson’s disease, electrodes are implanted into regions of the basal ganglia (the subthalamic nucleus or globus pallidus) to restore control of movement.
These are regions innervated by the deteriorating substantia nigra, therefore the DBS boosts stimulation to these areas.
Patients can then switch on the electrodes, stimulating these brain regions to enhance control of movement and diminish tremors.
Recently, DBS has been used to diminish memory deficits associated with Alzheimer’s disease, a progressive and terminal form of dementia.
The pathologies associated with Alzheimer’s disease involve the formation of amyloid plaques and neurofibrillary tangles within the brain leading to dysfunction and death of neurons.
Recent clinical trials with DBS involve the implantation of electrodes within the fornix – a structure connecting the left and right hippocampi.
By stimulating neural activity within the hippocampi via the fornix, memory deficits associated with Alzheimer’s disease can be improved, enhancing the daily functioning of patients and slowing the progression of cognitive decline.
Another use of DBS is in the treatment of substance abuse and drug addiction. Substance-related addictions are among the most frequently occurring psychiatric disorders, and patients are prone to relapse following rehabilitative treatment.
Persistent drug use leads to long term changes in the brain’s reward system.
Understanding of the reward systems affected in addiction has created a range of treatment options that directly target dysregulated brain circuits in order to normalise functionality.
One of the key reward regions in the brain is the nucleus accumbens and this has been used as a DBS target to control addiction.
Translational animal research has indicated that stimulation of the nucleus accumbens decreases drug seeking in models of addiction. Clinical studies have shown improved abstinence in both heroin addicts and alcoholics.
Studies have extended the use of DBS to potentially restore control of maladaptive eating behaviours such as compulsive binge eating.
In a recent study, binge eating of a high fat food in mice was decreased by DBS of the nucleus accumbens. This is the first study demonstrating that DBS can control maladaptive eating behaviours and may be a potential therapeutic tool in obesity.
Despite more than a decade of therapeutic use, the neural mechanism of DBS is still not fully understood.
The remedial effect is proposed to involve modulation of the dopamine system – and this seems particularly relevant in the context of Parkinson’s disease and addiction.
DBS potentially has effects on the functional activity of other interconnected brain systems. While it can provide therapeutic relief from symptoms of neurological diseases, it does not treat the underlying pathology.
But it provides effective and rapid relief from the effects of debilitating illnesses, restores activity in deteriorating brain regions, and aids understanding of the brain circuits involved in these disorders.
Amy Reichelt does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.
By Stephen Wroe
It was a strange and often hostile place – at times much drier and as much as nine degrees cooler than now – with a sometimes vast arid core that expanded to encompass 70% or more of the continent. And it was dominated by giants.
This “megafauna” included the largest marsupial that ever lived, Diprotodon, the size of a large rhinoceros; huge, short-faced kangaroos that exceeded 200kg in body mass; and massively-built terrestrial birds, around the height of an emu – but twice as heavy. They were preyed upon by a venomous goanna that may have been as big as a large saltwater crocodile, and a bizarre-but-deadly marsupial “lion” with incredibly powerful jaws and bolt-cutting teeth.
Not all were gigantic in any strict sense – some were simply much larger relatives of existing species; for example, there was an echidna the size of a large dog. Others were much larger “versions” of species still alive today, such as the giant grey kangaroo. All up, around 90 of these large to gigantic species and subspecies existed.
Now they are gone; only a few big kangaroos still survive.
Explaining these extinctions has locked scientists in heated debate since the 19th Century. While arguments have changed, the identity of the proposed “culprits” has not. Was it climate, or was it humans?
Historically, there have been times when some researchers have claimed victory and the ascendancy of one interpretation or another, but such claims have typically been short-lived. The data have been scarce: there are too few reliable dates on either humans and their artefacts, or extinct megafauna, and a very limited understanding of environmental change over the vast tracts of time in question.
In recent years this has begun to change. Humans arrived sometime around 50-45 thousand years ago, but it is increasingly clear that many or most of the megafauna had disappeared well before then. Times of peak cold and aridity are known as glacial maxima; of the 90 or so extinct species of megafauna, around 50 are not known from fossil deposits younger than the Penultimate Glacial Maximum (approximately 130 thousand years ago). Other species disappeared about 50 thousand years later, but still long before the arrival of the first Aborigines.
At most 14 and as few as eight species of now extinct megafauna clearly overlapped in time with humans. At localised levels, too, there is mounting evidence from specific sites that a staggered, stepwise extinction was well established long before humans made an appearance. There has never been any direct evidence of humans preying on any now extinct megafauna anywhere in Sahul – or even evidence of a tool-kit typical of hunter-gatherers who pursued big game.
Across geological time the vast majority of species that have ever lived have gone extinct, and the vast majority of these in the complete absence of humans. Climate or climate-related influences are undoubtedly to blame in almost every instance.
So how did human-driven explanations gain support in Sahul?
Underpinning all arguments for a human-driven process were two key assumptions. The first is that the megafauna were present when Aboriginals arrived; the second is that all previous glacial maxima – the last peaked between 28 and 19 thousand years ago – were much of a muchness, or at least that there was nothing remarkable or extreme about the last two or three. The reasoning was that because we “knew” that the megafauna were here and that there was nothing particularly unusual about the last few glacial cycles, the only feasible cause was the arrival and subsequent activity of people.
As we have seen, it is now clear that the first of these assumptions was poorly-founded at best. The evidence suggests that few of the megafauna were here when humans arrived.
Just as importantly, it is now clear that the second assumption was likewise incorrect. In fact, many palaeoclimatologists have long been of the opinion that Sahul was subject to a protracted, stepwise deterioration in climate over the last 300-400 thousand years. The long-term trend is towards an increasingly arid and erratic climate.
In recent years the evidence for the protracted, stepwise aridification of Sahul has firmed, backed by new and mounting data from Antarctic ice cores and analyses of ancient central Australian lake levels. The 800-thousand-year Antarctic ice core record in particular has provided unprecedented resolution on the Southern Hemisphere story – and it has revealed a distinct change from 450 thousand years ago if not earlier.
From this time on, things started to become more extreme. Moreover, the ice core record shows marked drying beginning at around 50-45 thousand years ago – the time humans arrived. This is consistent with evidence for the decline of once vast inland mega-lakes. Other recent studies have suggested that climatic deterioration may have taken place to varying degrees across the planet – beginning as early as 700 thousand years ago.
Still further cracks have emerged in arguments for a human role. Spikes in fire activity deduced from charcoal analyses had been assumed by some to indicate increased burning by humans, laying a basis for the argument that human-driven environmental change drove the megafaunas’ demise. But more recent work shows that increased burning characterised Sahul long before people arrived.
The loss of a giant flightless bird from south-central Australia at around 50 thousand years ago had been attributed by some to human activity, but it is now clear that its disappearance coincided with escalating climatic variability.
Many questions remain. Humanity’s role in the demise of now-extinct species that were still present when people arrived cannot be entirely discounted, but this remains to be demonstrated. However, it is increasingly clear that the disappearance of megafauna from Sahul took place over tens if not hundreds of millennia under the influence of an inexorable, albeit erratic, climatic ratchet, and that the first Aboriginals made footfall at a time when conditions were already rapidly deteriorating.
Stephen Wroe receives funding from the Australian Research Council.