22nd May 2017

Moon found orbiting third largest dwarf planet

Astronomers have identified a moon orbiting 2007 OR10 – the third largest dwarf planet in our Solar System, and the largest without a name.

 

[Image: 2007 OR10. Credits: NASA, ESA, C. Kiss (Konkoly Observatory), and J. Stansberry (STScI)]

The combined power of three space observatories, including NASA's Hubble Space Telescope, has helped astronomers uncover a moon orbiting the third largest dwarf planet, catalogued as 2007 OR10. The pair resides in the frigid outskirts of our Solar System called the Kuiper Belt, a realm of icy debris left over from our Solar System's formation 4.6 billion years ago.

With this discovery, most of the known dwarf planets in the Kuiper Belt larger than 600 miles across are now known to have companions. These bodies provide insight into how moons formed in the young Solar System.

"The discovery of satellites around all of the known large dwarf planets – except for Sedna – means that, at the time these bodies formed, billions of years ago, collisions must have been more frequent, and that's a constraint on the formation models," explains Csaba Kiss of the Konkoly Observatory in Budapest, Hungary. He is the lead author of a science paper confirming the moon's discovery. "If there were frequent collisions, then it was quite easy to form these satellites."

 

[Image: Dwarf planets. By Lexicon [CC-BY-SA-3.0], via Wikimedia Commons]

The objects most likely slammed into each other more often because they inhabited a crowded region. "There must have been a fairly high density of objects, and some of them were massive bodies that were perturbing the orbits of smaller bodies," said team member John Stansberry of the Space Telescope Science Institute in Baltimore, Maryland. "This gravitational stirring may have nudged the bodies out of their orbits and increased their relative velocities, which may have resulted in collisions."

But the speed of the colliding objects could not have been too fast or too slow, according to the astronomers. If the impact velocity was too fast, the smash-up would have created lots of debris that could have escaped from the system; too slow and the collision would have produced only an impact crater.

Collisions in the main asteroid belt, for example, are destructive because objects are travelling fast when they smash together. The asteroid belt is a region of rocky debris between the orbits of Mars and the gas giant Jupiter. Jupiter's powerful gravity speeds up the orbits of asteroids, generating violent impacts.

 

[Image: Asteroid collisions. Credit: NASA/JPL-Caltech]

The team uncovered the moon in archival images of 2007 OR10 taken by Hubble's Wide Field Camera 3. Observations of the dwarf planet by NASA's Kepler Space Telescope first tipped the astronomers off to the possibility of a moon circling it. Kepler revealed that 2007 OR10 has a slow rotation period of 45 hours. "Typical rotation periods for Kuiper Belt Objects are under 24 hours," Kiss said. "We looked in the Hubble archive because the slower rotation period could have been caused by the gravitational tug of a moon. The initial investigator missed the moon in the Hubble images because it is very faint."

The astronomers spotted the moon in two separate Hubble observations spaced a year apart. The images show that the moon is gravitationally bound to 2007 OR10 because it moves with the dwarf planet, as seen against a background of stars. However, the two observations did not provide enough information for the astronomers to determine an orbit.

"Ironically, because we don't know the orbit, the link between the satellite and the slow rotation rate is unclear," Stansberry said.

The astronomers calculated the diameters of both objects based on observations in far-infrared light by the Herschel Space Observatory, which measured thermal emissions of the distant worlds. The dwarf planet is 950 miles across, while its moon is estimated to be about 200 miles in diameter. 2007 OR10, like Pluto, follows an eccentric orbit, but is currently three times farther than Pluto is from the Sun.

2007 OR10 is a member of an exclusive club of nine dwarf planets. Of those, only Pluto and Eris are larger than 2007 OR10. It was discovered in 2007 by astronomers Meg Schwamb, Mike Brown, and David Rabinowitz as part of a survey to search for distant Solar System bodies using the Samuel Oschin Telescope at the Palomar Observatory in California. It is currently the largest known object in our Solar System without an official name. The team has yet to propose a name, but from November 2019 anyone will be able to make a proposal. 2007 OR10 will be farther than both Sedna and Eris by 2045, and will reach aphelion in 2130.

 

[Image: 2007 OR10 Solar System map. Credit: Outer Solar System Origins Survey team (OSSOS)]

22nd May 2017

Farewell to Jacque Fresco (1916-2017)

The creator of the Venus Project has died aged 101.

 

[Image: Jacque Fresco. By Maj Borg, Minttu Mäntynen, Andrea Miconi [CC BY 1.0], via Wikimedia Commons]

Jacque Fresco – American futurist, and founder of the Venus Project – has died aged 101. Fresco was known for his futuristic urban designs and sustainable architecture concepts, as well as his advocacy for a resource-based economy.

Fresco's domestic partner and administrative colleague, Roxanne Meadows, who assisted him from 1976 onwards, has released the following statement:

I have received so many letters over the years saying how much people have been inspired by Jacque Fresco. He helped them better understand what was going on in the world, but most of all they expressed that he has given them hope by presenting an alternative society we can work towards to overcome the disastrous conditions we face as a species. Therefore, it is very difficult to let you know that Jacque died peacefully on the morning of May 18, 2017 at 101 years of age. There were many close friends with him the last few days of his life. There will not be a funeral or ceremony held. His body was donated to science as he requested.

Jacque was diagnosed with Parkinson's Syndrome in recent years and this made his directorship and participation with The Venus Project more difficult. During this time I became a full-time caregiver while furthering the work we have done together for the last forty-one years. As the tour seminars became harder for Jacque, I predominately carried them and will continue to do so. As co-founder of The Venus Project, I will now devote more of my time and energy to carrying it forward as Jacque and I have always planned. Many others want to help bring this work into fruition and there is a very dedicated group of people who are doing just that.

The Venus Project will go on towards our aims and proposals and as Jacque and I always say, "If you want a better world you have to work towards it. If you do nothing, nothing will happen." As I see it, we are fortunate to have the lifetime of Jacque Fresco's work to provide a comprehensive direction to move towards; something our world is lacking and desperately needs. So, as always, we need your participation to make it happen. There is lots of work to do! Contact tvp@thevenusproject.com.

In Extensionality,
Roxanne Meadows

 

 

20th May 2017

Solar power is now cheaper than coal in India

The price of clean energy continues to plummet. In India, electricity from solar is now cheaper than electricity from coal.

 

[Image: 4MW solar power plant in Tamil Nadu, India. By Vinaykumar8687 (Own work) [CC BY-SA 4.0], via Wikimedia Commons]

At a recent auction held in Rajasthan, northern India, clean energy suppliers Phelan Energy and Avaada Power offered to charge just 2.62 rupees per kilowatt-hour (kWh) of solar-generated electricity. That is well below the previous record low bid of 3.15 rupees per kWh, set last month, and far below the 4.34 rupees per kWh offered last year. For comparison, the average charged by India's largest coal firm is currently around 3.20 rupees per kWh.

In recent years, the writing has been on the wall for the coal industry. In May 2014, the Institute for Energy Economics and Financial Analysis (IEEFA) warned that international coal projects relying on imports to India – such as Australia's Galilee Basin – faced major financial risks. Meanwhile, a draft report was issued by India's electricity agency in December last year, predicting that the country was unlikely to need any new coal plants for at least the next decade besides those already in the pipeline. The 50 GW of coal projects being planned would be "largely stranded" under the forecast.

In related news, a 4,000 MW coal "ultra-mega power project" (UMPP) planned for Gujarat, India's westernmost state, has been scrapped this month: "Our focus is now on renewable energy," said energy minister Chimanbhai Sapariya. "The government will encourage solar power."

As in many countries around the world, India's cumulative solar capacity is following an exponential curve, as costs continue to plummet and technology improves – from 2.65 GW in May 2014, to 6.7 GW in March 2016 and 12.28 GW in April 2017. The Indian government believes that 57% of its electricity capacity will come from non-fossil fuels by 2027. The Paris climate accord target is 40% by 2030. In the same timeframe, it is also hoped that every new car will be an electric car.
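
As a rough sanity check, the figures quoted above can be plugged into a few lines of Python. This is an illustrative sketch using only this article's numbers; the decimal-year dates are our approximations:

```python
# Back-of-the-envelope checks on the solar figures quoted above (illustrative only).

# Cumulative solar capacity in gigawatts, keyed by approximate decimal year.
capacity = {
    2014.4: 2.65,    # May 2014
    2016.2: 6.7,     # March 2016
    2017.3: 12.28,   # April 2017
}

t0, t1 = min(capacity), max(capacity)
cagr = (capacity[t1] / capacity[t0]) ** (1 / (t1 - t0)) - 1
print(f"Implied growth rate: {cagr:.0%} per year")   # ~70% per year

# Winning solar bids vs. coal, in rupees per kWh.
solar_2016, solar_2017, coal = 4.34, 2.62, 3.20
print(f"Price fall in one year: {1 - solar_2017 / solar_2016:.0%}")   # ~40%
print(f"Solar vs coal: {1 - solar_2017 / coal:.0%} cheaper")          # ~18%
```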

In addition to its large-scale, grid-connected solar PV initiatives, India is developing off-grid solar power for local energy needs. The country has a poor rural electrification rate – in 2015, only 55% of all rural households had access to electricity. Solar products are increasingly helping to meet rural needs and reducing the demand for kerosene.

 

[Image: © Samrat35 | Dreamstime.com]

Last year, Prime Minister Modi laid the foundation stone for the headquarters of the International Solar Alliance (ISA) in Gurgaon, just outside New Delhi. This will focus on promoting and developing solar for countries between the Tropic of Cancer and Tropic of Capricorn, reducing the costs and increasing the deployment of these technologies to poor and remote regions. India is particularly well-placed for solar energy with its high solar irradiance.

Alan Fotheringham, a director at energy services company Wood Group, commenting on the auction bid in Rajasthan, said that India was "demonstrating that when the conditions allow, it is possible to move very quickly to transition the energy mix."

"We have seen strong and steady growth of the solar energy market in India," he said, explaining that it has become "one of the largest and strongest markets. With a huge population and a huge demand for energy, growing investments and political willingness, all the ingredients are there for the transition to happen."

The picture is similar in China, where coal production has been in a downward spiral for several years now, with major efforts being undertaken to accelerate the switch to clean tech. Some Chinese provinces and cities are on a path to reach peak emissions as early as 2020.

Meanwhile, in the USA, President Trump is taking a wrecking ball to most of the progress achieved by his predecessor – gutting the EPA and its power to regulate the fossil fuel industry, scrapping fuel efficiency standards, and withdrawing protections for many areas of land, in order to ramp up the use of coal, oil and gas. It is not yet known whether Trump will officially withdraw the USA from the Paris treaty, but he has repeatedly stated his objection to it, his intention to cancel all funding for climate-related efforts, and his denial of the basic science.

18th May 2017

Blood stem cells grown in lab for the first time

Human blood stem cells have been grown in the laboratory for the first time by researchers at Boston Children's Hospital.

 


Researchers at Boston Children's Hospital have, for the first time, generated blood-forming stem cells in the lab using pluripotent stem cells, which can make virtually every cell type in the body. The advance, published in the journal Nature, opens new avenues for research into the root causes of blood diseases, and for creating immune-matched blood cells, derived from patients' own cells, for treatment purposes.

"We're tantalisingly close to generating bona fide human blood stem cells in a dish," says senior investigator George Daley, PhD, who heads a research lab in Boston Children's Hospital's Stem Cell Program and is dean of Harvard Medical School. "This work is the culmination of over 20 years of striving."

"This is a very big deal," said Carolina Guibentif at the University of Cambridge, who was not involved in the research. "If you can develop [these cells] in the lab in a safe way and in high enough numbers, you wouldn't be dependent on donors."

Although the cells made from the pluripotent stem cells are a mix of true blood stem cells and other cells known as blood progenitor cells, they proved capable of generating multiple types of human blood cells when put into mice.

"This step opens up an opportunity to take cells from patients with genetic blood disorders, use gene editing to correct their genetic defect and make functional blood cells," comments Ryohichi (Rio) Sugimura, PhD, the paper's first author. "This also gives us the potential to have a limitless supply of blood stem cells and blood by taking cells from universal donors. This could potentially augment the blood supply for patients who need transfusions."

 


Since human embryonic stem (ES) cells were first isolated in 1998, scientists have been trying, with little success, to use them to make blood-forming stem cells. In 2007, three groups (including the Daley lab) generated the first induced pluripotent stem (iPS) cells from human skin cells through genetic reprogramming. iPS cells were later used to generate multiple human cell types, such as neurons and heart cells – yet blood-forming stem cells remained elusive.

Sugimura, Daley and colleagues combined two previous approaches. First, they exposed human pluripotent stem cells (both ES and iPS cells) to chemical signals that direct stem cells to differentiate into specialised cells and tissues during normal embryonic development. This generated hemogenic endothelium, an early embryonic tissue that eventually gives rise to blood stem cells, although the transition to blood stem cells had never been achieved in a dish.

In the second step, the team added genetic regulatory factors (called transcription factors) to push the hemogenic endothelium toward a blood-forming state. Starting with 26 transcription factors identified as likely candidates, they eventually narrowed the list down to just five (RUNX1, ERG, LCOR, HOXA5 and HOXA9) that were both necessary and sufficient for creating blood stem cells. They delivered the factors into the cells with a lentivirus, as used in some forms of gene therapy.

Finally, they transplanted the genetically engineered hemogenic endothelial cells into mice. Weeks later, a small number of the animals carried multiple types of human blood cells in their bone marrow and blood circulation. These included red blood cell precursors, myeloid cells (precursors of monocytes, macrophages, neutrophils, platelets and other cells), and T and B lymphocytes. Some mice were able to mount a human immune response after vaccination.

ES cells and iPS cells were similarly good at creating blood stem and progenitor cells when the technique was applied. But the researchers are most interested in iPS cells, which offer the added ability to derive cells directly from patients and model disease.

"We're now able to model human blood function in so-called 'humanised mice,'" says Daley. "This is a major step forward for our ability to investigate genetic blood disease."

One challenge in making bona fide human blood stem cells is that no one has been able to fully characterise them: "It's proved challenging to 'see' these cells," says Sugimura. "You can roughly characterise blood stem cells based on surface markers, but even with this, it may not be a true blood stem cell. And once it starts to differentiate and make blood cells, you can't go back and study it – it's already gone. A better characterisation of human blood stem cells and a better understanding of how they develop would give us clues to making bona fide human blood stem cells."

17th May 2017

World's largest single-memory computer is unveiled

Hewlett Packard Enterprise (HPE) has revealed "The Machine" – a new computing architecture with 160 terabytes of memory.

 

[Image: The Machine. Credit: HPE]

Hewlett Packard Enterprise (HPE) has introduced the world's largest single-memory computer. Known simply as "The Machine", it is the largest R&D program in the history of the company, and is aimed at delivering a new paradigm called Memory-Driven Computing – an architecture custom-built for the big data era.

"The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day," explained Meg Whitman, CEO of HPE. "To realise this promise, we can't rely on the technologies of the past. We need a computer built for the big data era."

The prototype unveiled this week features a staggering 160 terabytes (TB) of memory, enough to simultaneously work with the data held in every book in the Library of Congress five times over – or approximately 160 million books. It has never been possible to hold and manipulate whole data sets of this size within a single-memory system, and this is just a glimpse of the immense potential of Memory-Driven Computing.

Based on the current prototype, HPE expects the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory – 4,096 yottabytes. For context, that is 250,000 times the entire digital universe today.
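
These scales are easy to sanity-check. Assuming roughly 1 MB per digitised book (our assumption, not an HPE figure), 160 TB does correspond to about 160 million books, and the quoted pool and factor imply a "digital universe" of roughly 16 zettabytes, in line with contemporary estimates:

```python
# Sanity checks on the memory scales quoted above (illustrative only).
TB = 10**12   # terabyte (decimal)
ZB = 10**21   # zettabyte
YB = 10**24   # yottabyte

prototype = 160 * TB
bytes_per_book = 10**6             # assumption: ~1 MB per digitised book
print(f"Books in 160 TB: {prototype / bytes_per_book:,.0f}")   # ~160,000,000

pool = 4096 * YB                   # the "nearly limitless" pool quoted above
digital_universe = pool / 250_000  # implied size of today's digital universe
print(f"Implied digital universe: {digital_universe / ZB:.0f} ZB")   # ~16 ZB
```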

With such a vast amount of memory, it will be possible to simultaneously work with every digital health record of every person on earth; every piece of data from Facebook; every trip of Google's autonomous vehicles and every data set from space exploration, all at the same time – getting to answers and uncovering new opportunities at unprecedented speeds.

 


"We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society," said Mark Potter, CTO at HPE and director, Hewlett Packard Labs. "The architecture we have unveiled can be applied to every computing category from intelligent edge devices to supercomputers."

Memory-Driven Computing puts memory, not the processor, at the centre of the computing architecture. By eliminating the inefficiencies of how memory, storage and processors interact in traditional systems today, Memory-Driven Computing reduces the time needed to process complex problems from days to hours, hours to minutes, minutes to seconds, to deliver real-time intelligence.

The current prototype Machine has its memory spread across 40 physical nodes, each interconnected using a high-performance fabric protocol, with an optimised Linux-based operating system (OS) running on ThunderX2. Photonics/optical communication links, including the new X1 photonics module, are online and operational. Software programming tools are designed to take full advantage of the abundant persistent memory.

"We think that this is a game-changer," said Kirk Bresniker, Chief Architect at HPE. "This will be the overarching arc for the next 10, 20, 30 years."

 

 


16th May 2017

A timeline of the future to 2250

A guest piece by Jakob Coles.

 


One of our top forumers by post count, Jakob Coles, has provided us with his timeline for 1991 to 2250.

 


 

1991-2060

The Information Age

The Information Age is marked by the rise of the Web and the incursion of the Internet into commerce, entertainment, and communication, eventually eclipsing their offline counterparts. Computers miniaturise and grow ever more powerful – going from clunky boxes, to smartphones, to contact lenses. The Internet of Things arises and every object becomes a smart object. Swarms of smart dust collect mind-boggling amounts of data and the number of Internet-connected devices surges by six orders of magnitude.

American hegemony and globalism mark the beginning of the era – but wane increasingly as time goes on, due to outsourcing, technological unemployment, economic recession, and climate change. Protectionism, isolationism, and nationalism rise in their place. Mainstream media eventually ends up dead in the water; the news is supplied by countless "mad bloggers". The job market is thrown into chaos as robots take billions of jobs, necessitating radical reform of the education system. Militarily, terrorism and guerrilla fighters become more important adversaries than conventional armies, and nations change their doctrines to reflect that. Ground, sea, and air forces alike are mostly replaced with robots; most humans in advanced armies either remotely command robots and drones, or are part of elite cyborg special forces.

The looming spectres of technological unemployment, climate change, resource depletion, and nuclear and biological terrorism hang over the heads of humanity. But colossal strides are being made to counteract these threats. Space-based solar, advanced fission reactors, nuclear fusion, wireless electricity, and smart grids lead to an energy revolution. Vertical farming, desalination, and asteroid mining are demonstrated on an industrial scale. Genetic engineering is extending lives, treating formerly untreatable diseases, and even allowing parents to create designer babies. Cybernetics and neural laces allow humans to gain back the ground lost to robots.

True, conscious AI proves harder than expected, though scientists gradually understand the requirements and begin to close in on an implementation.

Reusable rockets and 3D printing allow humanity to establish the first economically viable colonies on the Moon and Mars, as well as orbital stations. Technology is boosting companies' productivity to astonishing levels. Beyond this point, mankind will either leap to the stars or crash back to Earth.

 

 

2060-2170

The Interplanetary Age

Powerful new technologies such as StarTrams, space elevators, nuclear pulse propulsion, and nuclear Verne guns provide the capability to launch many thousands of tons to orbit in one shot. This allows space colonisation to begin in earnest and opens up Earth orbit to the middle class. People flock into space by the millions, settling in destinations as diverse as the Moon, Mars, asteroid belt, and Lagrangian Points.

The moons of Jupiter and Saturn are colonised using powerful Orion drives, followed a few decades later by the moons of Uranus and Neptune. The gases mined here are just part of a small but steady stream of interplanetary trade.

During this period, many colonies begin to develop a sense of national identity and later rebel against their former home countries. The forces on Earth attempt to fight back, but this is largely a failure, as only the strongest nations have the ability to project military power into cislunar and interplanetary space. Yet on Earth, humanity survives its troubles and later reaches Type 1 status on the Kardashev scale.

Asteroid mining and vertical farming provide resources in greater abundance than their earlier counterparts. Advanced geoengineering has halted but not reversed climate change; Earth is a Hothouse with viable habitation in many parts of Antarctica, Greenland and northern Canada. Genetic engineering is powerful enough to not only revive extinct animals, but engineer entire ecosystems from scratch. Millions of humans have lived past 120; a number of the oldest are past 140.

Contrary to the predictions of 21st century science fiction, Earth is not evolving towards a world government. Some nations merge together, but others break apart, while liberal democracy falls – though illiberal democracy is more common than totalitarian dictatorship. After 2100, the concept of world government gradually disappears from mainstream science fiction. Indeed, decentralisation was a key part of surviving the late 21st century. A raft of localised technologies and new urban planning paradigms allowed cities to attain near-self-sufficiency. Cheap, high-performance 3D printers decentralise manufacturing. Quantum blockchains ease the strain on failing national currencies. Meanwhile, some places take the concept further, establishing colossal arcologies and floating city-states.

India and China replace the US and Europe as the "centre of the world". Genetic engineering creates several new subspecies that are employed for various purposes, as well as reviving old ones like Neanderthals. A number of great apes are uplifted to sapience. And most importantly, AI wakes up. The first truly conscious machines require building-sized computers and fusion plants to support, but within a few decades, they are efficient enough to walk among humans.

By the 22nd century, millions of sapient droids of all shapes and sizes occupy countless roles. Superintelligent, meta-aware "high AIs" arrive soon after, and also miniaturise to human size. The first posthumans are not far behind, aided by primitive cut-and-paste mind uploading – still in the experimental research stage, and with dismal success rates – but colossal benefits for those who survive the process.

 

 

2170-2250

Decolonisation

This is a turbulent era for off-world civilisation, much like the decolonisation of Africa a quarter of a millennium prior. The series of skirmishes known as the Decolonisation Wars on Earth and the Freedom Wars off-world show how even the mightiest Terran powers are unable to defeat even weak off-world enemies. About 90 percent of colonies are forfeited within 50 years; only a few countries retain any at all. Dozens of leaders, many of them transhuman or high AI dictators, arise to fill the vacuum of power.

Interstellar Orion drives establish a settlement at Proxima b, while others are now on their way to Alpha Centauri and Barnard's Star. In the tumult, people spread out across the Solar System, claiming the last uncharted areas. Human structures proliferate everywhere, from colossal solar arrays and entire "cities on wheels" on Mercury, to propellant caches on Pluto. Now the only frontier is interstellar space.

Meanwhile, on Earth, opportunistic superintelligences begin to consolidate power in several backwater nations where there are few or no transhumans to keep them in check. New human subspecies and uplifted animals continue to gain ground, while the nascent high AIs and posthumans learn their abilities and limits.

Arcologies take off in myriad forms including skyscrapers, earthscrapers, artificial islands and many underwater cities. Advanced nano-manufacturing becomes practical and widespread. Mind uploading via in-situ replacement is perfected, greatly aided by high AI/posthuman technology – such as Fast Upload Platforms. Gradually, the Solar System begins to settle down and stabilise, politically and economically.


11th May 2017

Global warming: 1.5°C target may be smashed by 2026

A change to a positive phase of the Interdecadal Pacific Oscillation could see global warming accelerate rapidly in the next decade.

 


Global temperatures could break through the 1.5°C barrier negotiated at the Paris conference as early as 2026 if a slow-moving, natural climate driver known as the Interdecadal Pacific Oscillation (IPO) has, as suspected, moved into a positive phase.

New research published in Geophysical Research Letters by University of Melbourne scientists at the ARC Centre of Excellence for Climate System Science shows that a positive IPO would likely produce a sharp acceleration in global warming over the next decade.

Since 1999, the IPO has been in a negative phase, but consecutive record-breaking warm years in 2014, 2015 and 2016 have led climate researchers to suggest this may have changed. In the past, these positive phases have coincided with accelerated global warming.

"Even if the IPO remains in a negative phase, our research shows we will still likely see global temperatures break through the 1.5°C guardrail by 2031," said lead author Dr Ben Henley.

"If the world is to have any hope of meeting the Paris target, governments will need to pursue policies that not only reduce emissions, but remove carbon from the atmosphere."

 

[Image: Carbon capture. Credit: P Huey/Science/AAAS]

"Should we overshoot the 1.5°C limit, we must still aim to bring global temperatures back down and stabilise them at that level or lower," adds Henley.

The IPO has a profound impact on our climate, because it is a powerful natural climate lever with a lot of momentum that changes very slowly over periods of 10 to 30 years. During its positive phase, the ocean temperatures in the tropical Pacific are unusually warm and those outside this region to the north and south are often unusually cool. When the IPO enters a negative phase, this situation is reversed.

In the past, we have seen positive IPO phases from 1925 to 1946 and again from 1977 to 1998 – both periods that saw rapid increases in global average temperatures. The world experienced the reverse – a prolonged negative phase – from 1947 to 1976, when global temperatures stalled.

A striking characteristic of the most recent 21st century negative phase of the IPO is that on this occasion, global average surface temperatures continued to rise, just at a slower rate.

"Although the Earth has continued to warm during the temporary slowdown since around 2000, the reduced rate of warming in that period may have lulled us into a false sense of security. The positive phase of the IPO will likely correct this slowdown. If so, we can expect an acceleration in global warming in the coming decades," Dr Henley said.

"Policy makers should be aware of just how quickly we are approaching 1.5°C. The task of reducing emissions is very urgent indeed."


9th May 2017

Growing food in space: no longer a pipe dream

A review of the progress made in space farming shows that food production on the Moon and Mars is likely to become a reality in the not-too-distant future.

 

[Image: Growing food in space. Credit: NASA]

Following a recent NASA bill passed by the US Congress, which authorises $19.5 billion spending for space exploration in 2017, manned missions to Mars are closer to reality than ever before.

As both public and private enterprises gear up for a return to the Moon and the first human footsteps on the Red Planet, there is a renewed focus on keeping people alive and productive in these extreme environments. Plants, and specifically crop plants, will be a major component of proposed regenerative life-support systems, as they provide food and oxygen, scrub carbon dioxide, and aid in water recycling – all in a self-regenerating or 'bioregenerative' fashion. Without a doubt, plants are a requirement for any sufficiently long duration (in both time and distance) human space exploration. There has been a great deal of research in this area – research that has not only made progress towards agriculture in space, but has resulted in many Earth-based advances as well (e.g. LED lighting for greenhouse and vertical farm applications, and new seed potato propagation techniques).

A recent article by Dr. Raymond Wheeler from NASA's Kennedy Space Center provides an informative and comprehensive account of the various international contributions, historical and current, to bioregenerative life support and the use of controlled environment agriculture for human space exploration. Covering all of the major developments of international teams, it relates some of this work to technology transfer that has proved valuable here on Earth.

Research in the area started during the 1950s and 60s, through the work of Jack Myers and others, who studied algae for oxygen production and carbon dioxide removal for the US Air Force and NASA. Studies on algal production and controlled environment agriculture were also carried out by Russian researchers in Siberia, beginning in the 1960s, including tests with human crews whose air, water, and much of their food were provided by wheat and other crops.

In the early 1980s, NASA initiated its Controlled Ecological Life Support Systems (CELSS) program with testing focused on controlled environment production of wheat, soybean, potato, lettuce and sweet potato. Findings from these studies paved the way to conduct tests in a 20 square metre, atmospherically closed chamber at Kennedy Space Center.

At about the same time, Japanese researchers developed a Closed Ecological Experiment Facility (CEEF) in Aomori Prefecture to conduct closed system studies with plants, humans, animals and waste recycling systems. CEEF was much bigger than the NASA program, with 150 m2 of plant growth area, which provided a near-complete diet along with air and water regeneration for two humans and two goats.

In the late 1980s, the European Space Agency MELiSSA Project began and pursued ecological approaches for providing gas, water and materials recycling for space life support, later expanding to include plant testing.

 

[Image: NASA's Biomass Production Chamber. Credit: NASA]

NASA's Biomass Production Chamber (pictured above) operated for 12 years, from 1988 to 2000, at Kennedy Space Center, Florida. The crops tested included wheat, potato, lettuce, soybean, tomato, rice and radish. All crops were grown using hydroponics (the nutrient film technique), under high-pressure sodium and/or metal halide lamps. NASA's BPC was one of the first working examples of a vertical agriculture system.

A Canadian research team at the University of Guelph started a research facility for space crop research in 1994. Only a few years later, they went on to develop sophisticated canopy-scale hypobaric plant production chambers for testing crops for space, and have since expanded their testing for a wide range of controlled environment agriculture topics.

More recently, a group at Beihang University in Beijing designed, built and tested a closed life support facility called Lunar Palace 1 (pictured below), which included a 69 m2 agricultural module for air, water and food production for three humans.

Then, in 2015, NASA astronauts harvested a crop of "Outredgeous" red romaine lettuce from the Veggie plant growth system, developed by Orbital Technologies Corporation (ORBITEC) for use aboard the International Space Station (see video at the end of this blog). Once again, this featured LEDs for plant growth.

 

[Image: Lunar Palace 1, China. LEDs grew crops that supported three crew members for 105 days. Credit: Prof. Hong Liu, Beihang University]

As a result of these international studies in space agriculture, novel technologies and findings have been produced, including:

• the first use of light emitting diodes for growing crops;
• the first demonstrations of vertical agriculture;
• hydroponic approaches for subterranean crops like potato and sweet potato;
• crop yields that surpassed record field yields;
• the ability to quantify volatile organic compound production (e.g. ethylene) from whole crop stands;
• innovative approaches for controlling water delivery;
• approaches for processing and recycling wastes back to crop production systems, and more.

The theme of agriculture for space has contributed to, and benefited from, terrestrial controlled environment agriculture and will continue to do so into the future. There are still numerous technical challenges – but plants and associated biological systems can and will be a major component of the systems that keep humans alive when we establish ourselves on the Moon, Mars and beyond, says Dr. Wheeler.

The idea of using plants to keep people alive and productive in space is not new, both in concept and in scientific inquiry. Wheeler's article covers a large portion of the historical international research effort that will be the foundation for many of the trade studies and mission design plans for use of bioregenerative life support systems in space.

According to Dr. Gary Sutter, NASA's principal investigator for several spaceflight experiments designed to grow plants in microgravity: "Dr. Ray Wheeler has written a compelling and complete history of the people that have committed their careers to enabling the colonisation of space. Drawing upon his deep understanding of the programs developed, people involved, and progress achieved to highlight the accomplishments and contributions of scientists and engineers around the world to bring the vision of space exploration to fruition, he details the problems, challenges, results and contributions from the programs, and reveals how they benefited Earth, as well as space. The review underscores that the answers will be achieved not through proclamation, but through collaboration between nations, cooperation between people, and sustained commitment by institutions. His article should be required reading for anyone with even a passing interest in space agriculture."

Agriculture for Space: People and Places Paving the Way is available as an open access paper in the journal Open Agriculture.

 

 

 


8th May 2017

Nearly half of jobs in Scotland at risk of automation by 2030

Urgent reform is needed to deal with the rapid rise of automation, a leading Scottish think-tank has said.

 


Urgent reform is needed to deal with the rapid rise of automation, which threatens nearly half of Scottish jobs by 2030. The stark warning comes in a new report published by the Institute for Public Policy Research (IPPR) Scotland, a progressive think-tank, and supported by the JPMorgan Chase Foundation.

The report, Scotland's Skills 2030, outlines the need to reskill Scotland's workforce for the world of work in the coming decades. With greater numbers of workers working for longer, due to demographic change, and in multiple jobs, multiple careers and for multiple employers, due to technological change, Scotland will need to retrofit the workforce with the skills required to compete in the future.

There are 2.5 million working-age adults today (78%) who have left compulsory education and will still be of working age by 2030, the study notes – adding that they are likely to experience significant changes to the economy over this time, and will need support to learn new skills, retrain and upskill.

Meanwhile, just under half (46.1%) of jobs in Scotland – about 1.2 million – are at "high risk" of automation over the next couple of decades. The sectors most likely to be affected are transport, manufacturing and retail, the report states. This creates the need for a skills system that can work with people in jobs, throughout their careers, rather than solely at the start or before their careers have begun, the researchers warn.
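
The report's headline numbers are internally consistent, as a trivial calculation shows:

```python
# Quick check on the report's headline figures (illustrative only).
at_risk_jobs = 1_200_000   # jobs at "high risk" of automation
at_risk_share = 0.461      # 46.1% of all jobs

total_jobs = at_risk_jobs / at_risk_share
print(f"Implied total jobs in Scotland: {total_jobs:,.0f}")   # ~2.6 million
```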

Scotland has a clear gap in training and learning for people who have already started their careers, with recent years seeing a greater focus on younger people and on full-time provision. Employers are not plugging this gap, and too often pursue a low-skill business model. IPPR Scotland is calling for a new mid-career learning route, called the Open Institute of Technology, to sit alongside apprenticeships and further education, to help train the current workforce for the future challenges Scotland's economy faces, the report concludes.

Russell Gunson, Director of IPPR Scotland, said: "There are more than 2.5 million people already in the workforce today that will still be working by 2030. There are also 1.2m jobs in Scotland at risk of automation over the same time. Scotland urgently needs to design a skills system better able to work with people already into their careers, to help them to retrain, reskill and respond to the world of work of 2030.

"Scotland has a really strong record on skills in many ways, and in this report we find that Scotland is the highest skilled nation in the UK. However, our system has a clear gap in that we don't have enough provision for people who have already started their careers, and employers are not investing to fill this gap. To respond to the huge changes facing Scotland around demographic, technological and climate change – and of course Brexit – we're going to have to focus on retrofitting the current workforce to provide them with the skills they need, to deliver the inclusive economic growth we wish to see.

"Our report makes a number of recommendations to help Scotland plot a path through these challenges, to reform the skills system in Scotland, to help to secure an economy that delivers fairness and reduces inequality. Without reform of the skills system we could see changes to the economy harm whole sections of population, and whole communities, leaving many behind."


6th May 2017

Fusion reactor achieves first plasma

A British company, Tokamak Energy, has achieved first plasma in the ST40, its latest prototype design for a fusion reactor. The machine is planned to reach 100 million degrees C in 2018, the temperature required for fusion.

 

[Image: The ST40 reactor. Credit: Tokamak Energy]

A new prototype fusion reactor has been turned on for the first time and officially achieved first plasma. The reactor aims to produce a plasma temperature of 100 million degrees C – a record for a privately-funded venture. This is seven times hotter than the centre of the Sun and the temperature necessary for controlled fusion.

The tokamak reactor, called the 'ST40', was built by Tokamak Energy, one of the world's leading private fusion energy ventures. The Oxfordshire-based company grew out of the Culham Centre for Fusion Energy and was established in 2009 to design and develop small fusion reactors. Tokamak Energy's aim is to put fusion power into the grid by 2030.

With the ST40 up and running, the next steps are to complete the commissioning and installation of the full set of magnetic coils which are crucial to reaching temperatures required for fusion. This will allow the ST40 to produce a plasma temperature of 15 million degrees C – as hot as the Sun's core – in autumn 2017.

 

[Image: The Sun's core temperature. Credit: NASA's Goddard Space Flight Center]

Following the 15 million degree milestone, the next goal is for the ST40 to produce plasma temperatures of 100 million degrees in 2018. This will be a record-breaking milestone, as the plasma will reach a temperature never before achieved in a privately owned and funded fusion reactor. 100 million degrees is an important threshold, as only at or above this temperature can the charged particles which naturally repel each other be forced together to induce a controlled fusion reaction. This will also prove the vital point that commercially viable fusion can be produced in compact spherical tokamaks.
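
The temperature milestones are simple multiples of the Sun's core temperature quoted above, as a quick check confirms:

```python
# Comparing the ST40's target plasma temperatures with the Sun's core (illustrative only).
sun_core = 15e6        # degrees C, the figure quoted above
target_2017 = 15e6     # autumn 2017 milestone: as hot as the Sun's core
target_2018 = 100e6    # 2018 milestone: the threshold for controlled fusion

print(f"2017 milestone vs Sun's core: {target_2017 / sun_core:.0f}x")         # 1x
print(f"2018 milestone vs Sun's core: {target_2018 / sun_core:.1f}x hotter")  # ~6.7x, i.e. "seven times"
```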

Tokamak Energy's journey to generating fusion energy is moving at a rapid pace; the company has already reached the half-way point of its long-term plan to deliver fusion power. It is focused on a smaller reactor design – the compact, spherical tokamak – that enables quicker development of devices, thereby speeding up progress towards its ultimate targets: producing first electricity by 2025 and commercially viable fusion power by 2030. Tokamak Energy's research has also shown that this route to fusion power can be much faster than the development of conventional large-scale devices.

 

[Image: Credit: Tokamak Energy]

Dr David Kingham, CEO of Tokamak Energy, commented: "Today is an important day for fusion energy development in the UK, and the world. We are unveiling the first world-class controlled fusion device to have been designed, built and operated by a private venture. The ST40 is a machine that will show fusion temperatures – 100 million degrees – are possible in compact, cost-effective reactors. This will allow fusion power to be achieved in years, not decades."

"We will still need significant investment, many academic and industrial collaborations, dedicated and creative engineers and scientists, and an excellent supply chain. Our approach continues to be to break the journey down into a series of engineering challenges, raising additional investment on reaching each new milestone. We are already half-way to the goal of fusion energy; with hard work we will deliver fusion power at commercial scale by 2030."

 

 

 


5th May 2017

First soft synthetic retina for the visually impaired

The first synthetic retina using soft biological tissues has been created by a student at the University of Oxford.

 

[Image: Synthetic retina. Credit: Oxford University]

A synthetic, soft tissue retina developed by an Oxford University student could offer fresh hope to visually impaired people. Until now, all artificial retinal research has used only rigid, hard materials. However, new research by Vanessa Restrepo-Schild, a 24-year-old DPhil student and researcher at Oxford University's Department of Chemistry, is the first to successfully use biological, synthetic tissues developed in a laboratory. The study could revolutionise the bionic implant industry and the development of new, less invasive technologies that more closely resemble human body tissues, helping to treat degenerative eye conditions.

Just as photography depends on camera pixels reacting to light, our vision relies on the retina performing the same function. The retina sits at the back of the human eye, and contains protein cells that convert light into electrical signals that travel through the nervous system, triggering a response from the brain, ultimately building a picture of the scene being viewed.

Restrepo-Schild led the team in developing a new synthetic, double layered retina that closely mimics the natural human retinal process. The retina replica consists of soft water droplets (hydrogels) and biological cell membrane proteins. Designed like a camera, the cells act as pixels, detecting and reacting to light to create a greyscale image. Restrepo-Schild explains: "The synthetic material can generate electrical signals, which stimulate the neurons at the back of our eye – just like the original retina."

 


The study, published in Scientific Reports, shows that unlike existing artificial retinal implants, the cell cultures are created from natural, biodegradable materials and do not contain foreign bodies or living entities. In this way, the implant is less invasive than a mechanical device, and is less likely to have an adverse reaction on the body. Miss Restrepo-Schild adds: "The human eye is incredibly sensitive, which is why foreign bodies like metal retinal implants can be so damaging – leading to inflammation and/or scarring. But a biological synthetic implant is soft and water based, so much more friendly to the eye environment."

Of the motivation behind her ground-breaking study, Miss Restrepo-Schild says: "I have always been fascinated by the human body, and want to prove that current technology could be used to replicate the function of human tissues, without having to actually use living cells.

"I have taken the principals behind vital bodily functions, e.g. our sense of hearing, touch and the ability to detect light, and replicated them in a laboratory environment with natural, synthetic components. I hope my research is the first step in a journey towards building technology that is soft and biodegradable instead of hard and wasteful."

Although at present the synthetic retina has only been tested in laboratory conditions, Miss Restrepo-Schild is keen to build on her initial work and explore potential uses with living tissues. This next step is vital in demonstrating how the material performs as a bionic implant.

Restrepo-Schild has filed a patent for the technology and the next phase of work will expand the replica's function to include recognising colours and potentially even shapes and symbols. Looking further ahead, the team will begin to include animal testing and then a series of clinical trials in humans.


5th May 2017

Biggest X-ray laser in the world generates its first light

The European X-ray Free Electron Laser (XFEL) in Germany has produced its first beams of X-rays.

 

[Image: The European XFEL. Credit: European XFEL / Heiner Müller-Elsner]

In Hamburg, Germany, the European XFEL – the biggest X-ray laser in the world – has reached its last major milestone before the official opening in September. The 3.4 km long facility, most of which is located in underground tunnels, has generated its first X-ray laser light. This has a wavelength of just 0.8 nanometres (nm) – about 500 times shorter than that of visible light. At first lasing, this laser had a repetition rate of one pulse per second, which will later be increased to 27,000 per second, compared to the previous record of 120 per second.
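
Both headline ratios can be checked directly. The 400 nm figure for visible light is our assumption, taken as the short end of the visible spectrum; the article says only that the wavelength is about 500 times shorter:

```python
# Verifying the headline ratios for the European XFEL (illustrative only).
xfel_wavelength_nm = 0.8   # first lasing, as quoted above
visible_nm = 400.0         # assumption: short end of the visible spectrum
print(f"~{visible_nm / xfel_wavelength_nm:.0f}x shorter than visible light")  # ~500x

design_rate = 27_000       # pulses per second, design value
previous_record = 120      # pulses per second, previous record
print(f"{design_rate / previous_record:.0f}x the previous record rate")       # 225x
```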

The beams of the XFEL are extremely intense and a billion times brighter than conventional synchrotron light sources. The achievable light wavelength corresponds to the size of an atom, meaning that the X-rays can be used to make pictures and films of the "nanocosmos" at atomic-scale resolution – such as biomolecules, which could lead to a better understanding of illnesses. Other opportunities include research into chemical processes and new catalytic techniques, with the goal of improving their efficiency or making them more environmentally friendly; materials research; or the investigation of conditions similar to the interior of planets.

 

[Image: First laser light at the European XFEL, recorded by an X-ray detector at the end of the tunnel. Credit: DESY]

Helmut Dosch, Chairman of the DESY Directorate, said: "The European X-ray laser has been brought to life! The first laser light produced today with the most advanced and most powerful linear accelerator in the world marks the beginning of a new era of research in Europe. This worldwide unique high-tech facility was built in record time and within budget. This is an amazing success of science. I congratulate all those involved in the research, development, and construction of this facility with passion and commitment: the employees of DESY, European XFEL, and international partners. They have achieved outstanding results and demonstrated impressively what is possible in international cooperation. The European XFEL will provide us with the most detailed images of the molecular structure of new materials and drugs and novel live recordings of biochemical reactions."

The power and speed of the XFEL will make it possible for scientists to investigate smaller and more limited samples, and to perform their experiments more quickly. The facility will therefore increase the amount of "beamtime" available, as the capacity of other X-ray lasers worldwide has been eclipsed by demand and these facilities are often overbooked.

The X-ray laser should officially open at the start of September. At that point, external users will be able to perform experiments at the first two of the eventual six scientific instruments.


4th May 2017

India will make every new car electric by 2030

India has set itself the goal of making every new car in the country electric by 2030, according to the country's energy minister.

"We are going to introduce electric vehicles in a very big way," explained Piyush Goyal at the Confederation of Indian Industry Annual Session 2017 in New Delhi. "We are going to make electric vehicles self-sufficient. The idea is that by 2030, not a single petrol or diesel car should be sold in the country."

India's electric car industry will need up to three years of government assistance, Mr Goyal believes, but production of the vehicles would be "driven by demand and not subsidy" after that. Improvements in technology, falling costs of batteries and wider availability of charging stations could achieve this.

"The cost of electric vehicles will start to pay for itself for consumers," he said. "We would love to see the electric vehicle industry run on its own."

India's goal may sound overly ambitious and unrealistic – but electric vehicle sales are growing exponentially worldwide, similar to the rapid trends in solar and wind power that are also being observed. Electric car ownership passed the 1 million mark in 2015. European countries have recently announced similar goals: the Netherlands and Norway, for instance, intend to ban all petrol and diesel cars by 2025.

Mr Goyal said the electric car scheme would first target "larger consumer centres, where pollution is at an all-time high", such as Delhi, which has concentrations of particulate matter 13 times the annual limit set by the World Health Organisation.

The latest available figures show that 2.3 million deaths occur in India each year due to air pollution – almost the same as deaths from tobacco use – with 3% of the country's Gross Domestic Product (GDP) being lost due to this problem, making it a public health and economic crisis. India recently overtook China in number of deaths due to outdoor air pollution.

 

[Image: The India Gate monument in Delhi. Credit: Steven TDW White]


3rd May 2017

Robot can perform surgeries in one fiftieth of the time

The University of Utah has revealed a new robotic drill system for greatly speeding up surgical procedures. One type of complex cranial surgery could be done in a fiftieth of the normal time, decreasing from two hours to just two and a half minutes.

 

 

 

A computer-driven automated drill, similar to those used to machine auto parts, could play a pivotal role in future surgical procedures. The new machine can make one type of complex cranial surgery 50 times faster than standard procedures, decreasing the time from two hours to two and a half minutes. Researchers at the University of Utah developed a drill that produces fast, clean and safe cuts – reducing the time the wound is open and the patient is anaesthetised, thereby decreasing the incidence of infection, human error, and surgical cost. The findings are reported in Neurosurgical Focus.

To perform complex surgeries – especially cranial surgeries – surgeons typically use hand drills to make intricate openings, adding hours to a procedure: "It was like doing archaeology," said William Couldwell, study author and neurosurgeon at University of Utah Health. "We had to slowly take away the bone to avoid sensitive structures."

Couldwell saw a need for a device that could alleviate this burden and make the process more efficient: "We knew the technology was already available in the machine world, but no one ever applied it to medical applications."

"My expertise is dealing with the removal of metal quickly, so a neurosurgical drill was a new concept for me," explained A. K. Balaji, associate professor in mechanical engineering. "I was interested in developing a low-cost drill that could do a lot of the grunt work to reduce surgeon fatigue."

 

Credit: University of Utah

 

The team developed the drill from scratch, along with new software to calculate the safest cutting path. First, the patient is imaged using CT scans to gather bone data and identify the exact location of sensitive structures, such as nerves, veins and arteries, that must be avoided. Surgeons then use this information to program a cutting path for the drill: "The software lets the surgeon choose the optimum path from point A to point B, like Google Maps," says Balaji. In addition, the surgeon can program safety barriers along the cutting path, within 1 mm of sensitive structures. "Think of the barriers like a construction zone," says Balaji. "You slow down to navigate it safely."
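
The article does not describe the planning algorithm itself, but the "Google Maps" analogy suggests a standard graph search over the CT-derived voxel grid, with a hard exclusion zone around sensitive structures and higher traversal costs near the barrier. The sketch below is a minimal, hypothetical illustration of that idea (in 2D, using Dijkstra's algorithm) – not the Utah team's actual software.

```python
import heapq

import numpy as np
from scipy.ndimage import distance_transform_edt

def plan_path(occupied, start, goal, margin_mm=1.0, voxel_mm=0.5):
    """Plan a cutting path across a 2D voxel grid.

    `occupied` is a boolean array marking nerves/vessels; `start`
    and `goal` are (row, col) tuples. Cells within `margin_mm` of
    any structure are forbidden outright, and traversal cost rises
    sharply as the path approaches that barrier.
    """
    # Distance (in mm) from each voxel to the nearest structure.
    dist_mm = distance_transform_edt(~occupied) * voxel_mm
    blocked = dist_mm < margin_mm                  # hard 1 mm safety barrier
    cost = 1.0 + 10.0 / np.maximum(dist_mm - margin_mm, 0.1)

    frontier = [(0.0, start)]                      # Dijkstra's algorithm
    came_from = {start: None}
    best = {start: 0.0}
    while frontier:
        d, (x, y) = heapq.heappop(frontier)
        if (x, y) == goal:
            break
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nx < occupied.shape[0] and 0 <= ny < occupied.shape[1]):
                continue                           # off the grid
            if blocked[nx, ny]:
                continue                           # inside the safety barrier
            nd = d + cost[nx, ny]
            if nd < best.get((nx, ny), float("inf")):
                best[(nx, ny)] = nd
                came_from[(nx, ny)] = (x, y)
                heapq.heappush(frontier, (nd, (nx, ny)))

    if goal not in came_from:
        return None                                # no safe route exists
    path, node = [], goal
    while node is not None:                        # walk back to the start
        path.append(node)
        node = came_from[node]
    return path[::-1]
```

Penalising proximity, rather than merely forbidding it, mirrors the "construction zone" behaviour: the planner may pass near a structure when necessary, but prefers routes that keep their distance.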

Translabyrinthine surgery is performed thousands of times a year to expose slow-growing, benign tumours that can form on the auditory nerves. The cut must avoid several sensitive features, including the facial nerves and the venous sinus, a large vein that drains blood from the brain; risks of the surgery include loss of facial movement. The system developed at Utah therefore has an automatic emergency shut-off. During surgery, the facial nerves are monitored for any signs of irritation: "If the drill gets too close to the facial nerve and irritation is monitored, the drill automatically turns off," says Couldwell.
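
In control terms, the shut-off amounts to a simple guard checked on every cycle of the drill's control loop. The sketch below is purely illustrative – the object names and the threshold are invented, and the article does not specify how nerve activity is quantified:

```python
# Hypothetical illustration of the automatic shut-off: the drill
# halts the moment monitored nerve activity crosses an irritation
# threshold. All names and values here are invented for the example.
IRRITATION_THRESHOLD = 0.8    # assumed normalised nerve-activity level

def drill_step(drill, nerve_monitor):
    """Advance the drill one increment, halting on any sign of irritation."""
    if nerve_monitor.read_level() >= IRRITATION_THRESHOLD:
        drill.emergency_stop()   # automatic shut-off near the facial nerve
        return False             # abort: the surgeon takes over
    drill.advance()
    return True
```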

The new drill could reduce the duration of this complex procedure from two hours of hand-drilling by an experienced surgeon to just two and a half minutes – a roughly fifty-fold reduction. The shorter surgery is expected to lower the chance of infection and improve post-operative recovery. It could also substantially reduce the cost of surgery, because it shaves hours from operating room time.

The team has now demonstrated the safety and speed of the drill by performing this complex cut – but Couldwell stresses that it can be applied to many other procedures: "This drill can be used for a variety of surgeries, like machining the perfect receptacle opening in the bone for a hip implant," he said.

The varied application of the drill highlights another factor that drew Balaji to the project: "I was motivated by the fact that this technology could democratise health care by levelling the playing field so more people can receive quality care," he said. The team is now examining opportunities to commercialise the drill to ensure that it is more widely available for other surgical procedures.

---


2nd May 2017

A neurotech future will require new human rights laws

New human rights laws are needed to prepare for advances in neurotechnology that may put the 'freedom of the mind' at risk, according to a paper from the Institute for Biomedical Ethics in Switzerland.

 


 

Writing in the open access journal Life Sciences, Society and Policy, the authors suggest that four new human rights laws could emerge in the near future to protect against exploitation and loss of privacy:

1. The right to cognitive liberty
2. The right to mental privacy
3. The right to mental integrity
4. The right to psychological continuity

Marcello Ienca, lead author and PhD student at the Institute for Biomedical Ethics at the University of Basel, said: "The mind is considered to be the last refuge of personal freedom and self-determination, but advances in neural engineering, brain imaging and neurotechnology put the freedom of the mind at risk. Our proposed laws would give people the right to refuse coercive and invasive neurotechnology, protect the privacy of data collected by neurotechnology, and protect the physical and psychological aspects of the mind from damage by the misuse of neurotechnology."

Advances in neurotechnology, such as sophisticated brain imaging and the development of brain-computer interfaces, have led to these technologies moving away from a clinical setting and into the consumer domain. While these advances may be beneficial for individuals and society, there is a risk that the technology could be misused and create unprecedented threats to personal freedom.

Professor Roberto Andorno, co-author of the research, explained: "Brain imaging technology has already reached a point where there is discussion over its legitimacy in criminal court; for example as a tool for assessing criminal responsibility or even the risk of reoffending. Consumer companies are using brain imaging for 'neuromarketing', to understand consumer behaviour and elicit desired responses from customers. There are also tools such as 'brain decoders' which can turn brain imaging data into images, text or sound. All of these could pose a threat to personal freedom, which we sought to address with the development of four new human rights laws."

The authors explain that as neurotechnology improves and becomes commonplace, there is a risk that the technology could be hacked, allowing a third-party to 'eavesdrop' on someone's mind. In the future, a brain-computer interface used to control consumer technology could put the user at risk of physical and psychological damage caused by a third-party attack on the technology. There are also ethical and legal concerns over the protection of data generated by these devices that need to be considered.

International human rights laws make no specific mention of neuroscience, although advances in biomedicine have become intertwined with such laws – for example, those concerning human genetic data. Similar to the historical trajectory of the genetic revolution, the authors argue, the ongoing 'neurorevolution' will force a reconceptualisation of human rights laws, and may even require the creation of new ones.

Marcello Ienca added: "Science fiction can teach us a lot about the potential threat of technology. Neurotechnology featured in famous stories has in some cases already become a reality, while others are inching ever closer, or exist as military and commercial prototypes. We need to be prepared to deal with the impact these technologies will have on our personal freedom."

---


1st May 2017

Success in 3D bioprinting of cartilage

Researchers at Sahlgrenska Academy – part of the University of Gothenburg, Sweden – have generated cartilage tissue by printing stem cells using a 3D-bioprinter.

The fact that the stem cells survived being printed in this manner is a success in itself. In addition, the research team was able to influence the cells to multiply and differentiate to form chondrocytes (cartilage cells) in the printed structure. The findings are published in Scientific Reports.

This research project was a collaboration with scientists at Chalmers University of Technology who are experts in the 3D printing of biological materials, as well as orthopaedic researchers from Kungsbacka.

The team used cartilage cells from patients who had recently undergone knee surgery. These cells were then manipulated in a laboratory, causing them to rejuvenate and revert into "pluripotent" stem cells, i.e. stem cells that have the potential to develop into many different types of cells. The stem cells were then expanded and encapsulated in a composition of nanofibrillated cellulose and printed into a structure using a 3D bioprinter. Following printing, the stem cells were treated with growth factors that caused them to differentiate correctly, so that they formed cartilage tissue.

 

Credit: Elin Lindström Claessen

 

"In nature, the differentiation of stem cells into cartilage is a simple process, but it's much more complicated to accomplish in a test tube. We're the first to succeed with it, and we did so without any animal testing whatsoever," says Stina Simonsson, Associate Professor of Cell Biology, who led the research team's three-year effort.

Most of their work involved developing a procedure whereby the cells could survive printing, multiply and then differentiate to form cartilage. One of the key insights gained from their study was that it is necessary to use large amounts of live stem cells to form tissue in this manner.

"We investigated various methods and combined different growth factors," Simonsson explains. "Each individual stem cell is encased in nanocellulose, allowing it to survive the process of being printed into a 3D structure. We also harvested mediums from other cells, which contain the signals that stem cells use to communicate with each other. In layman's terms, our theory is that we managed to trick the cells into thinking that they weren't alone. Therefore the cells multiplied before we differentiated them."

The cartilage formed by stem cells in the 3D bioprinted structure was extremely similar to normal human cartilage. Experienced surgeons who examined the artificial bioprinted tissue saw no difference when they compared it to the real thing, and have stated that the material has properties similar to their patients' natural cartilage. Just like normal cartilage, the lab-grown material contains Type II collagen – and under the microscope, the cells appear to be perfectly formed, with structures similar to those observed in samples of human-harvested cartilage.

 


 

This study represents a giant step forward in the ability to generate new, endogenous cartilage tissue. In the not-too-distant future, it should be possible to use 3D bioprinting to generate cartilage from a patient's own, "backed-up" stem cells. This artificial tissue could then be used to repair cartilage damage, or to treat osteoarthritis, in which joint cartilage degenerates and breaks down. The condition is very common – one in four Swedes over the age of 45 suffers from some degree of osteoarthritis.

In theory, this research has created the opportunity to generate large amounts of cartilage, but one major issue must be resolved before the findings can be used in practice to benefit patients.

"The structure of the cellulose we used might not be optimal for use in the human body," adds Simonsson. "Before we begin to explore the possibility of incorporating the use of 3D bioprinted cartilage into the surgical treatment of patients, we need to find another material that can be broken down and absorbed by the body, so that only the endogenous cartilage remains. The most important thing for use in a clinical setting is safety."

---
