Wednesday, September 17, 2014

Do you need words to communicate? The Future of Medicine and the Brain-Computer Interface in 2064

This article is part of the 'Think Further' series, sponsored by Fred Alger Management, Inc.  For more 'Think Further' content and videos, visit thinkfurtheralger.com.


Dr. Daniel Kraft on Brain-Computer Interface


Dr. Daniel Kraft on Medicine 2064

Computers: Inputs, Outputs and Form Factors

Ever since the invention of computers, humans have developed progressively easier and more intuitive ways to communicate with them.  We have come a long way, from the era of inputting instructions via punch cards and command-line interfaces to the current generation of multi-touch devices that respond to stylus and voice input.  Computer outputs have also evolved, from dot-matrix printers and cathode-ray tubes to thin flat-panel displays and hand-held screens capable of a full multimedia experience.  Computers are also getting progressively smaller and more wearable, in the form of smart eyewear and smartwatches paired with mobile devices carried in the user's pocket or handbag.

Where will we be in 50 years?

The next big advance in the human-computer interface is likely to be the "Brain-Computer Interface" (BCI) - communicating with computers using thoughts. While this may seem like the stuff of science fiction, sending instructions to computers using thought control is getting closer to reality.  There have been several early demonstrations of this technology, the most famous being at the opening ceremony of the recent FIFA World Cup in Brazil (1).  A young man who was paralyzed from the neck down was fitted with an exoskeleton controlled by a computer on his back.  Just by thinking about it, he was able to "tell" the computer to move the exoskeleton so that he could kick a soccer ball.  Even more amazing was an experiment at the University of Washington, where a researcher was able to control the finger movement of a colleague by transmitting his thoughts over the Internet (2).  Both researchers were wearing "thinking caps" connected via computers to the Internet.  No electrodes were implanted in their brains - this is non-invasive BCI.

These are proofs of concept that this technology can:
1.  Allow us to input instructions to computers using our thoughts.
2.  Allow computers to transmit signals to our brains to move specific body parts.
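In engineering terms, the first capability usually works by recording EEG signals from the scalp (the "thinking cap"), extracting a feature such as power in a frequency band, and classifying it into a command.  The sketch below is a toy Python illustration of that idea only: the sampling rate, band limits, threshold and simulated signals are my own illustrative assumptions, not details from the experiments cited in this post.  It uses the well-known observation that the mu rhythm (roughly 8-12 Hz) over the motor cortex is suppressed when a person imagines moving.

```python
import numpy as np

FS = 250  # sampling rate in Hz (a common EEG rate; illustrative)

def band_power(signal, low, high, fs=FS):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

def classify(epoch, threshold=1000.0):
    """Toy classifier: a strong mu rhythm (8-12 Hz) means 'rest';
    suppressed mu power (as during imagined movement) means 'move'.
    The threshold is an arbitrary value chosen for this simulation."""
    mu = band_power(epoch, 8, 12)
    return "rest" if mu > threshold else "move"

# Simulated one-second epochs: "rest" carries a strong 10 Hz mu rhythm,
# while "imagined movement" shows mu suppression (noise only).
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
rest_epoch = 10 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, FS)
move_epoch = rng.normal(0, 1, FS)

print(classify(rest_epoch))  # prints "rest"
print(classify(move_epoch))  # prints "move"
```

A real non-invasive BCI would of course use many electrodes, careful artifact rejection and a trained classifier rather than a fixed threshold, but the pipeline - record, extract a feature, map it to a command - is the same shape.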

By 2064 non-invasive BCI technology may have several applications:


Patients with movement disorders:

There are several diseases in which a person can think clearly but cannot control his or her body movements.  Patients with strokes are often left with weakness in parts of the body.  The same is seen in patients with spinal cord injuries and in certain presentations of conditions like ALS (Lou Gehrig's disease) and Parkinson's disease.  If these patients wear appropriate exoskeletons or bionic parts connected to a computer, their brains could drive the exoskeletons to perform the intended movements.

Patients with loss of limb or parts of a limb:

People can lose body parts from trauma or through amputations for cancers or infections.  While prosthetic limbs can allow them to perform movements like walking, most current prosthetics do not allow fine motor control.  It is possible that by the year 2064, brain-computer interfaces will allow these prosthetics to replicate the almost natural movements of the hand and fingers.

So far we have talked mostly about controlling movement - motor control - with BCI.  This is done by using thoughts to input instructions into computers.  The converse - having computers output information into the human brain - is more difficult.  This is particularly true of non-invasive BCI, i.e. without implanting electrodes in the brain.  Still, the University of Washington experiment described above showed that this is possible.  Using electrodes implanted in mouse brains, researchers at the Massachusetts Institute of Technology were able to implant false memories in mice (3).  Attempts have also been made to send signals directly from cameras into the brains of blind patients to allow them to see crude images.

By 2064, invasive BCI may have several applications:


Patients with sensory deficits particularly blindness:
Blindness is one of the afflictions that humans fear the most.  Researchers at Cornell University have had success in mapping the signals that travel from the retina to the brain in mice via the optic nerve (4).  They have been successful in producing quite realistic images using this method.  Over the next few decades, it should be possible to use this technology to develop functioning prosthetic eyes.  While this means that electrodes are not directly stimulating the visual cortex, it is still an example of BCI, as the retina and optic nerve are considered extensions of the brain.

Patients with memory deficits:
Caregivers often struggle to help patients with dementia complete some of the required activities of daily living.  These patients may not have motor problems like those described above, but they need help with activities like getting dressed.  It may be possible that, like the University of Washington researchers, caregivers will be able to help their loved ones perform some simple activities by transmitting their thoughts using "thinking caps".


As outlined in these examples, the brain-computer interface has huge potential, and with the amount of ongoing research in this area, we should expect to see some real applications in medicine by the year 2064.

Links to references cited in this post:

  1. GeoBeats News: Paraplegic Man in Robotic Suit Kicks Off World Cup. https://www.youtube.com/watch?v=fZrvdODe1QI
  2. NeuralSystemsLab: Direct Brain-to-Brain Communication in Humans: A Pilot Study. https://www.youtube.com/watch?v=rNRDc714W5I
  3. Ramirez et al.: Creating a False Memory in the Hippocampus. http://www.sciencemag.org/content/341/6144/387
  4. Sheila Nirenberg: A Prosthetic Eye to Treat Blindness, TEDMED 2011. http://www.ted.com/talks/sheila_nirenberg_a_prosthetic_eye_to_treat_blindness


Sunday, August 31, 2014

The Power of the Logo - Judging the Book by the Cover

I read with great interest and hope the story of how teens are moving away from overpriced clothing with large logos:



It seems that during the recent recession, teens began moving away from the high-priced logo clothing of the 3 A's - Abercrombie, American Eagle and Aeropostale.  They moved to cheaper brands with small or no logos.  Now that the recession is receding, this trend still persists.

This is a big change from a few years ago when I was at a The North Face store (not of my own volition) and overheard this conversation:
Mom: Look at this sweater, it is reasonably priced.
Daughter: I really like the color and design.
Mom: Why don't you try it on?
Daughter: Oh, it does not have a North Face logo! Let's look for something else.
The peer pressure and desire to conform are so strong among teens that it is hard to believe this anti-logo trend is anything but a transient or superficial phenomenon.  Even reasonably savvy adults are driven by these forces when they purchase items like smartphones, laptops and cars.

Very early on, we are told not to judge a book by its cover.  But we continue to believe that others will judge us by our covers - the logos we adorn ourselves with.  We create "pseudo" identities on social media.  We assume we can buy the respect or admiration of our peers by purchasing a status symbol, not realizing that all we generate is jealousy, or diminished respect for our decision-making skills or superficiality.  It clearly will not buy us happiness unless the product is something we really need and would buy even if it did not have a logo.

Will we ever get to the point where people stop and think about these words of Thích Nhất Hạnh: "To be beautiful means to be yourself. You don't need to be accepted by others. You need to accept yourself."

Wednesday, August 13, 2014

The Robots are coming! The Robots are coming!

This is a video making waves on YouTube - it was published earlier today and already has more than 400,000 views.

The video caught my attention for multiple reasons:

  • It is a beautifully created mashup.  I did a workshop on creating video mashups (with +Ali Reza Jalali, +Anne Marie Cunningham and +Natalie Lafferty) using publicly available resources, but this one is awesome in its quality.  Our slideshare of the workshop is available here.
  • I have been working on the IBM Watson - Cleveland Clinic collaborative project, helping Watson learn medicine.  The goal is to improve the quality of care we provide to our patients.  I wonder if some day we will look back and wonder about the future we might have helped create.  This very funny video interview with Stephen Hawking does leave one with a vague sense of unease.

  • But most of all, even if this future is somewhat far away, are we preparing our students for it?  What will be the role of humans, and how can we create learning environments that will help our students adapt?  Where will the integration of artificial intelligence, natural language processing, robotics and the Internet of Things leave humans?