In the future, our Avatar will be a ‘guinea pig’ to test therapies and interventions

by time.news

In the near future it will be our ‘Avatar’, a digital twin, that goes under the knife first. Then, once the best procedure has been established and the unforeseen complications ruled out, it will be our turn. It is a revolution that could change the way patients are treated, combining Artificial Intelligence (AI), ‘deep learning’ and personalized medicine. “From the effects of drugs to the best therapies, through remote assistance and the end of experiments on animal guinea pigs, the Avatar will allow us to intervene in many aspects of medicine and improve the quality of the final outcome for the patient.” So Giuseppe Saggio, engineer and associate professor of Electronics at the University of Rome Tor Vergata, explains to time.news Salute. He has been developing wearable-technology projects for years; during the Covid pandemic he launched VoiceWise, an application capable of diagnosing pathologies through voice analysis.

How does a ‘virtual twin’ capable of helping a doctor come about? “It was born from my close, long-standing collaboration with doctors. I am an engineer and there are aspects of medicine I do not know, but I interact with physicians continuously, and so the techniques are refined, also to meet their requests – says Saggio –. Twenty years ago I made a sensorized glove for the Viterbo local health authority that could measure a patient’s dexterity after surgery, all thanks to a virtual model that could be viewed on an early iPhone model. Today we have reached the point of creating virtual simulations of parts of organs, or of an entire organ, but the bottleneck in correctly feeding the ‘machine learning’ algorithms is having a database, and therefore sources to draw it from.”

“Today – he continues – to create the digital twin we work with models of wearable sensors and even invasive ones that can be inserted inside the human body. Then there are pervasive sensors, found in the environment, that measure people’s movements, and finally opportunistic sensors that carry out analyses thanks to the ‘Wi-Fi’ networks now found in every environment. All these sources form the database that feeds the AI.”
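As a rough illustration of the pipeline Saggio describes, the sketch below (in Python) shows one way readings from wearable, implanted, environmental and Wi-Fi-based sensors could be aligned in time and merged into the kind of database that feeds a machine-learning model of the digital twin. The field names, sources and windowing rule are hypothetical, not part of the actual system.

```python
# Minimal sketch (not Saggio's actual pipeline): merge readings from
# heterogeneous sources -- wearable, implanted, ambient and Wi-Fi-based
# sensors -- into time-aligned feature rows for a machine-learning model.
# All channel and source names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Reading:
    timestamp: float   # seconds since monitoring started
    source: str        # "wearable", "implant", "ambient" or "wifi"
    channel: str       # e.g. "heart_rate", "joint_angle", "gait_speed"
    value: float

def build_training_rows(readings, window_s=5.0):
    """Group readings into fixed time windows, one feature dict per window."""
    rows = {}
    for r in readings:
        window = int(r.timestamp // window_s)
        rows.setdefault(window, {})[f"{r.source}:{r.channel}"] = r.value
    return [rows[w] for w in sorted(rows)]

if __name__ == "__main__":
    sample = [
        Reading(0.8, "wearable", "heart_rate", 71.0),
        Reading(1.2, "ambient", "room_motion", 0.3),
        Reading(3.9, "wifi", "gait_speed", 1.1),
        Reading(6.1, "implant", "pressure", 12.4),
    ]
    for row in build_training_rows(sample):
        print(row)
```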

“Together with Paolo Roselli, a mathematician here at Tor Vergata, we are developing specific algorithms derived from Clifford algebra, which has the advantage of not being a closed box with input and output data, as some of today’s algorithms are: each variable is given a specific weight and new algorithms are created.” Creating these super-algorithms requires enormous computing capacity: “computers based on quantum mechanics are needed, but they already exist as very early prototypes”, he points out. “We have to imagine that these algorithms will be able to interpret the gestures of a surgeon using a robot in the operating room and to optimize the best gesture for a specific need; the procedure – he observes – will be obtained as a synthesis of the gestures of the best surgeons in the world.”
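To give a concrete, if simplified, sense of the objects Clifford (geometric) algebra provides, the sketch below builds 3D rotors – the rotation elements of the algebra, stored here with the same four coefficients as a unit quaternion – from two hypothetical “surgeon gestures”, blends them with explicit per-gesture weights and applies the result to an instrument axis. It is only an illustration of the formalism, not the algorithms Saggio and Roselli are developing; the gestures, weights and blending rule are invented for the example.

```python
# Illustrative sketch of Clifford-algebra rotors for 3D rotations,
# applied to made-up "surgeon gestures". Not the actual algorithms
# discussed in the interview.
import math

def cross(u, v):
    """Cross product of two 3D vectors."""
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

class Rotor:
    """Rotation element of the even subalgebra of Cl(3,0).

    Stored with the same four coefficients as a unit quaternion
    (scalar part a, bivector part b, c, d); the sign convention for
    the bivector basis is immaterial for this sketch.
    """
    def __init__(self, a, b, c, d):
        self.a, self.b, self.c, self.d = a, b, c, d

    @classmethod
    def from_axis_angle(cls, axis, angle):
        """Rotor for a rotation of `angle` radians about `axis`."""
        x, y, z = axis
        n = math.sqrt(x*x + y*y + z*z)
        s = math.sin(angle / 2.0) / n
        return cls(math.cos(angle / 2.0), x*s, y*s, z*s)

    def rotate(self, v):
        """Apply the rotation to a 3D vector (the sandwich product R v ~R)."""
        w, u = self.a, (self.b, self.c, self.d)
        t = cross(u, v)
        t = (t[0] + w*v[0], t[1] + w*v[1], t[2] + w*v[2])
        uxt = cross(u, t)
        return (v[0] + 2*uxt[0], v[1] + 2*uxt[1], v[2] + 2*uxt[2])

    @classmethod
    def blend(cls, rotors, weights):
        """Crude weighted blend (normalized weighted sum), valid only for
        nearby rotations; sketches the idea of giving each gesture a weight."""
        a = sum(w * r.a for r, w in zip(rotors, weights))
        b = sum(w * r.b for r, w in zip(rotors, weights))
        c = sum(w * r.c for r, w in zip(rotors, weights))
        d = sum(w * r.d for r, w in zip(rotors, weights))
        n = math.sqrt(a*a + b*b + c*c + d*d)
        return cls(a/n, b/n, c/n, d/n)

if __name__ == "__main__":
    # Two hypothetical wrist rotations about slightly different axes,
    # blended with expert-specific weights, then applied to an instrument axis.
    g1 = Rotor.from_axis_angle((0.0, 0.0, 1.0), math.radians(30))
    g2 = Rotor.from_axis_angle((0.1, 0.0, 1.0), math.radians(40))
    blended = Rotor.blend([g1, g2], [0.7, 0.3])
    print(blended.rotate((1.0, 0.0, 0.0)))
```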

But is there not a risk of dehumanizing the medical act? “No, because the specialist will always be the one in control – replies Saggio –. We are not intervening in the doctor-patient relationship; we are providing an extra tool, just as when CT scans or the first robots arrived in the operating room.” Getting to the patient’s avatar also requires economic resources: is there interest from companies on this front? “Yes, but often they want the prototype straight away – concludes the engineer –. They already want something to offer to the market, but that is not always possible. It takes time, research and experimentation to arrive at a result; you cannot have everything immediately. The system should understand this and support research.”
