We all are astronauts of spaceship earth

These are the latest articles and videos I found most interesting.

  1. How the brain recognizes what the eye sees
  2. Can the brain do back-propagation?
  3. Color & Pattern
  4. Apollo 11
  5. Sustainability

We all are astronauts of spaceship earth

‒ Wubbo Ockels


How the brain recognizes what the eye sees

Salk Institute

New Salk Institute work outlining brain’s visual process could improve self-driving cars and point to therapies for sensory impairments.


Can the brain do back-propagation?

Geoffrey Hinton of Google & University of Toronto

I think it is a good idea to always try to make the data look small by using a huge model; this relies on having plenty of almost-free computation.

We describe a new learning procedure for networks that contain groups of nonlinear units arranged in a closed loop. The aim of the learning is to discover codes that allow the activity vectors in a “visible” group to be represented by activity vectors in a “hidden” group. One way to test whether a code is an accurate representation is to try to reconstruct the visible vector from the hidden vector. The difference between the original and the reconstructed visible vectors is called the reconstruction error, and the learning procedure aims to minimize this error. The learning procedure has two passes. On the first pass, the original visible vector is passed around the loop, and on the second pass an average of the original vector and the reconstructed vector is passed around the loop. The learning procedure changes each weight by an amount proportional to the product of the “presynaptic” activity and the difference in the post-synaptic activity on the two passes. This procedure is much simpler to implement than methods like back-propagation. Simulations in simple networks show that it usually converges rapidly on a good set of codes, and analysis shows that in certain restricted cases it performs gradient descent in the squared reconstruction error.
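
The learning rule described above can be sketched in a few lines of code. Below is a minimal NumPy illustration of the recirculation loop between one visible and one hidden group; the layer sizes, logistic units, toy binary patterns, learning rate, and regression factor lam are assumptions made for illustration, not the paper's exact setup.

```python
# Minimal sketch of the recirculation learning rule (assumed toy setup).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 8, 3
W_vh = rng.normal(0, 0.1, (n_visible, n_hidden))  # visible -> hidden weights
W_hv = rng.normal(0, 0.1, (n_hidden, n_visible))  # hidden -> visible weights
lr, lam = 0.5, 0.75                                # learning rate, regression factor (assumed)

data = rng.integers(0, 2, (50, n_visible)).astype(float)  # toy binary patterns

for epoch in range(200):
    for v0 in data:
        # First pass around the loop: original visible vector -> hidden code -> reconstruction.
        h1 = sigmoid(v0 @ W_vh)
        # Second pass: a weighted average of the original and the reconstruction
        # is sent around the loop again.
        v2 = lam * v0 + (1 - lam) * sigmoid(h1 @ W_hv)
        h3 = sigmoid(v2 @ W_vh)
        # Each weight changes by presynaptic activity times the difference in
        # postsynaptic activity on the two passes.
        W_hv += lr * np.outer(h1, v0 - v2)
        W_vh += lr * np.outer(v2, h1 - h3)

# The squared reconstruction error should shrink as the hidden code improves.
recon = sigmoid(sigmoid(data @ W_vh) @ W_hv)
print("mean squared reconstruction error:", np.mean((data - recon) ** 2))
```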

The brain processes information through many layers of neurons. This deep architecture is representationally powerful, but it complicates learning by making it hard to identify the responsible neurons when a mistake is made. In machine learning, the backpropagation algorithm assigns blame to a neuron by computing exactly how it contributed to an error. To do this, it multiplies error signals by matrices consisting of all the synaptic weights on the neuron’s axon and farther downstream. This operation requires a precisely choreographed transport of synaptic weight information, which is thought to be impossible in the brain. Here we present a surprisingly simple algorithm for deep learning, which assigns blame by multiplying error signals by random synaptic weights. We show that a network can learn to extract useful information from signals sent through these random feedback connections. In essence, the network learns to learn. We demonstrate that this new mechanism performs as quickly and accurately as backpropagation on a variety of problems and describe the principles which underlie its function. Our demonstration provides a plausible basis for how a neuron can be adapted using error signals generated at distal locations in the brain, and thus dispels long-held assumptions about the algorithmic constraints on learning in neural circuits.
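
The random-feedback idea is equally easy to sketch. Below is a small NumPy illustration on a toy regression problem; the network shape, tanh hidden layer, synthetic data, and learning rate are illustrative assumptions rather than the paper's setup. The essential point is that the error is sent backwards through a fixed random matrix B instead of the transpose of the forward weights, as backpropagation would require.

```python
# Minimal sketch of learning with fixed random feedback weights (assumed toy setup).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 10, 30, 5
W1 = rng.normal(0, 0.1, (n_in, n_hidden))   # input -> hidden (learned)
W2 = rng.normal(0, 0.1, (n_hidden, n_out))  # hidden -> output (learned)
B = rng.normal(0, 0.1, (n_out, n_hidden))   # fixed random feedback weights (never updated)

# Toy problem: learn a random linear target mapping.
T = rng.normal(0, 1.0, (n_in, n_out))
X = rng.normal(0, 1.0, (200, n_in))
Y = X @ T

lr = 0.02
for step in range(2000):
    # Forward pass with a tanh hidden layer.
    H = np.tanh(X @ W1)
    Y_hat = H @ W2
    e = Y_hat - Y                        # output error

    # Backward pass: the error is projected through the fixed random matrix B,
    # not through W2.T as backpropagation would require.
    delta_hidden = (e @ B) * (1 - H ** 2)

    W2 -= lr * H.T @ e / len(X)
    W1 -= lr * X.T @ delta_hidden / len(X)

    if step % 500 == 0:
        print(step, "loss:", np.mean(e ** 2))
```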

Speaker Abstract and Bio can be found here.

Learning Representations by Recirculation by Geoffrey E. Hinton and James L. McClelland

Random feedback weights support learning in deep neural networks by Timothy P. Lillicrap, Daniel Cownden, Douglas B. Tweed, and Colin J. Akerman




Color & Pattern

Allen Institute

Great artists have a keen way of tapping into our visual system. They use color and pattern to stimulate our minds and evoke our emotions. But how?

Watch as Allen Institute researchers visit the new exhibit “Color & Pattern” at Seattle’s Pivot Art + Culture to show us how what our brain *expects* to see can have a strong impact on what it *does* see.

To learn more about the illusions in this video, visit:
Cornsweet illusion
Colour Illusion by Lotto and Purves
Checkershadow Description



Apollo 11

1969 – First Man on the Moon – Moon Landing – Neil Armstrong

For All of Mankind – Apollo 11 launched on July 16, 1969, and made the first landing of men on the Moon – Neil Armstrong and Buzz Aldrin touched down on July 20, 1969, at 20:18 UTC and walked on the lunar surface – by NASA



Sustainability

Wubbo Ockels, astronaut aboard the US Space Shuttle Challenger in 1985

We all are astronauts of spaceship earth

The earth is passing by and the most fundamental feeling is to know that you belong there and not in space. You are literally disconnected from your roots, which is flabbergasting and scary at the same time.

Referring back to his 1985 space mission aboard the US Space Shuttle Challenger, astronaut Wubbo Ockels (https://www.ockels.nl) expresses his personal experience of being extraterrestrial. The main conclusion of his flight is that it is not space that is unique, it is the Earth. Observing Earth from a spaceship, he concludes that the atmosphere is in fact the hull of spaceship Earth, protecting us from the dangers of space. Humankind needs to act as the crew of spaceship Earth and commit itself to giving the highest priority to maintaining the quality of this ship in general and the atmosphere in particular. The post-industrial revolution will use technological progress to seek a happier lifestyle, one free of pollution, waste, unfairness, political tensions, and war.

Wubbo Johannes Ockels (28 March 1946 – 18 May 2014) was a Dutch physicist and an astronaut of the European Space Agency (ESA).
We all are astronauts of spaceship earth – A farewell to Wubbo Ockels
Wubbo Ockels
