Traditional methods of neurological rehabilitation for people who have lost movement in their arms and hands after a stroke or spinal cord injury include exercise programmes and activities intended to improve movement. However, this process can take months or even years, depending on the severity of the injury, and requires frequent hospital visits.

What if there was a way to evolve standard neurological rehabilitation by uniting sophisticated technologies into a single system, enabling patients living with neurological conditions to restore movement in real time, or to independently chart their course of recovery over the long term?

The Synapsuit Project, spearheaded by the Wyss Center, has set out to achieve this aim by uniting brain interfacing technology with AI and a muscle stimulation exosuit to capture brain signals and translate them into intended movements.

The exosuit, which is designed to be worn for multiple years, uses captured brain data for muscle stimulation, effectively serving as a bridge between the brain and the body.

Neurosoft Bioelectronics, which specialises in soft bioelectronic devices used for brain interfacing, spun off five years ago from the Swiss Federal Institute of Technology (EPFL) in Lausanne, Switzerland.

The company recently partnered with the Wyss Center on its Synapsuit Project, alongside other collaborators including the Korea Electronics Technology Institute (KETI) and the Swiss Innovation Agency.

According to GlobalData’s medical device pipeline database, 407 Neurological Diagnostic and Monitoring Equipment devices are in various stages of development globally. GlobalData estimates that digital health technologies in neurology will continue to grow, and forecasts that collaborations and investment will keep driving digital trends.

Medical Device Network sat down with Dr Nicolas Vachicouras, Neurosoft’s CEO and co-founder, to understand more about the soft electrodes the company has been developing and how its device is involved in the Wyss Center’s Synapsuit Project.

Ross Law (RL): What does Neurosoft do?

Nicolas Vachicouras (NV): We develop neuro-interfaces, devices that are in contact with the brain or spinal cord, to record activity from the brain in a similar manner to how an electrocardiogram (ECG) records signals from the heart. Our devices can also electrically stimulate the brain in a similar way to how a pacemaker stimulates the heart.

RL: How has Neurosoft aimed to approach brain interfacing differently?

NV: Many of the brain-interfacing devices available on the market today are made of quite large, stiff platinum discs, used for example to stimulate the spinal cord to address chronic pain, or to detect epileptic activity.

The lab I came from realised these interfaces were not ideal for interfacing with the brain, because the brain and spinal cord are very soft and prone to a lot of movement. The spinal cord is the obvious case – there is a lot of movement, especially at the level of the neck – but even the brain is constantly moving slowly. Introducing a very rigid device into a soft, moving environment risks damaging the brain and can trigger a foreign body reaction, in which the body attacks the device and can eventually cause it to stop working.

These factors motivated the development of what we call soft electrodes: devices which are not only flexible but also stretchable. You can think of our device as a membrane with sensors that sits on the surface of brain tissue and can record from or stimulate the brain.

The elastic properties of our device allow for safer brain interfacing and better contact, with less risk of damaging blood vessels or the brain itself. Our soft electrodes can also be placed in the brain’s valley-like sulci, and some of the regions that encode specific hand movements, or movements of other parts of the body, lie within these sulci.

In the context of the Synapsuit Project, the idea is to leverage our technology to provide high-resolution recordings from the human brain; the other modules come from the project’s different partners.

RL: Can you explain how the different components of the Synapsuit unify with one another?

NV: You start by having a patient who cannot move their arms or hands, and typically this is due to a stroke or spinal cord injury. Their brain still functions. They are still able to think, but the information does not travel past the neck and so nothing happens.

The Synapsuit starts with sensors on our electrodes picking up an intention. If a patient is thinking of moving their arm, for example, even if they are not able to, you still have the activity of the brain that corresponds to ‘I want to move my left arm’.

This information is then transmitted to a head-mounted device developed by the Wyss Center called the Active Brain Implant Live Information Transfer sYstem (ABILITY), which connects to our electrodes to process brain data and transmit it digitally outside of the body. This is the second block.

Another block, also developed by the Wyss Center, is artificial intelligence (AI). Every single sensor on our electrode device provides brain data, which the AI translates into intended movements.

The information interpreted by the AI is then sent to the exoskeleton, which is developed by KETI. The goal here is to use functional electrical stimulation – a non-invasive technique that delivers stimulation through the skin – to activate the right muscles based on the information from the brain. This piece effectively serves as a bridge between the brain and the arms.
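The chain Vachicouras describes – implanted soft electrodes, the ABILITY transmitter, an AI decoder and a functional electrical stimulation exosuit – can be pictured as a simple pipeline. The Python sketch below is purely illustrative: every class, method and channel count is a hypothetical placeholder rather than any partner's actual API, and the "brain data" is random noise standing in for real recordings.

```python
# Minimal, hypothetical sketch of the Synapsuit signal chain described above.
# All names are illustrative placeholders, not the project's real interfaces.

import numpy as np


class SoftElectrodeArray:
    """Stands in for the implanted soft electrode grid."""
    def __init__(self, n_channels: int = 32):
        self.n_channels = n_channels

    def read_frame(self) -> np.ndarray:
        # Placeholder: real data would be cortical surface recordings.
        return np.random.randn(self.n_channels)


class AbilityTransmitter:
    """Stands in for the head-mounted unit that sends data out of the body."""
    def transmit(self, frame: np.ndarray) -> np.ndarray:
        return frame  # digitise and forward (no-op in this sketch)


class IntentDecoder:
    """Stands in for the AI block mapping brain data to intended movement."""
    def __init__(self, weights: np.ndarray, labels: list[str]):
        self.weights = weights  # e.g. a trained linear readout
        self.labels = labels

    def decode(self, frame: np.ndarray) -> str:
        scores = self.weights @ frame
        return self.labels[int(np.argmax(scores))]


class FesExosuit:
    """Stands in for the functional electrical stimulation exosuit."""
    def actuate(self, intent: str) -> None:
        print(f"Stimulating muscles for intent: {intent}")


# Wire the blocks together, mirroring the chain described in the interview.
labels = ["rest", "move_left_arm", "open_right_hand"]
electrodes = SoftElectrodeArray(n_channels=32)
decoder = IntentDecoder(weights=np.random.randn(len(labels), 32), labels=labels)
ability = AbilityTransmitter()
exosuit = FesExosuit()

frame = ability.transmit(electrodes.read_frame())
exosuit.actuate(decoder.decode(frame))
```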

RL: Can you tell us about the clinical trials you took part in last August?

NV: It is important to note that the project is at its beginning, so the building blocks are currently being developed separately and the plan is to merge them eventually. Last year’s clinical study, which took place in Houston, Texas, involved just our device by itself, and there we showed that it was able to safely interface with the human brain and record high-quality brain data. In the test, our device was used to record the brain activity of patients with epilepsy. That work was not directly tied to the Synapsuit Project; it was more about showing that we can record quality data from the human brain.

RL: What are the aims of your next clinical trial?

NV: Our goal in the upcoming weeks is to start recording brain data from patients who are moving their arms. To do this, we are starting with patients who are healthy in the sense that they still have a connection between their brain and their body. We’re going to put a glove on their hand. The glove, which contains sensors, is going to detect all of the movement. We will ask participants to move their hands and elbows for around ten minutes, performing different movements, while we record the activity from the brain. We will know exactly what is happening in real life based on the glove signals, and then we will see how well the AI can decode this information from the brain data.
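To make the planned analysis concrete, here is a minimal, hypothetical sketch of the offline decoding step described above: glove-derived movement labels are paired with brain recordings, a classifier is trained on part of the data, and decoding accuracy is checked on held-out trials. The data below are synthetic and the scikit-learn model is an assumption for illustration; the project's actual AI, developed by the Wyss Center, is not described in detail in the interview.

```python
# Illustrative offline decoding sketch: pair brain data with glove labels,
# train a classifier, and score it on held-out trials. Synthetic data only.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials, n_channels = 300, 32
movements = ["rest", "hand_open", "hand_close", "elbow_flex"]

# Synthetic stand-ins: X = brain features per trial, y = glove-labelled movement.
X = rng.normal(size=(n_trials, n_channels))
y = rng.integers(0, len(movements), size=n_trials)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

decoder = LogisticRegression(max_iter=1000)
decoder.fit(X_train, y_train)

accuracy = decoder.score(X_test, y_test)
print(f"Offline decoding accuracy on held-out trials: {accuracy:.2f}")
```

With random data this lands near chance level; the point of the real study is to see how far above chance the decoder can get on genuine recordings.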

The other technical challenge lies in how well we can connect our device to the Wyss Center’s ABILITY system, from a mechanical and electrical standpoint, in a way that is reliable long term, because the idea is that you implant this device not just for a few months but for 20 years.

In turn, KETI are currently showing that they can stimulate different muscles to activate, say, the right finger or the left hand, in a completely independent way. Eventually, we’ll combine both technologies and control their wearable parts with the inputs gathered from our device in the brain.

RL: What are your expectations following this next trial?

NV: In around two years, we want to have demonstrated at least one prototype of the full system working together. After that, we can hopefully launch bigger studies, on larger populations of patients, with the full system.