Google announced on Thursday its latest plans for using smartphones to monitor heart and eye health, saying it would test whether capturing heart sounds and eyeball images could help people identify issues from home. The company is investigating whether the smartphone’s built-in microphone can detect heartbeats or murmurs when placed over a person’s chest.
Google also revealed that it is studying how the sound waves produced by blood flowing through a vital organ travel toward the device and are picked up by its sensitive microphones.
“This is not a diagnosis but it’s at the level of knowing whether there are elevated risks,” Corrado said, noting that questions remain about accuracy.

The eye research, meanwhile, focuses on diagnosing diseases, such as those related to diabetes, from photographs. Google has reported promising early results using tabletop cameras in clinics, and it will now examine whether smartphone images can work, too. Its longer-term goal appears to be an app that helps people with health conditions better understand their condition, combining input from doctors with machine-learning algorithms that generate predictions from past data.
Google has also announced that it will test whether its artificial intelligence software can analyze ultrasound screenings taken by less-skilled technicians, as long as those workers follow a set pattern. If successful, this could address shortages of higher-skilled workers and allow birthing parents to be evaluated at home without an attending physician present.
Still, Google’s ambitions to transform the healthcare industry may come up short on revenue and uptake. “We’re not generating enough,” Corrado said, noting that launching capabilities is just one step and that adoption will take time, though he believes it can eventually lead to major changes across all industries related, directly or indirectly, to health care.