
AI can analyze heart scans faster and more accurately than humans


According to a study, a form of artificial intelligence can analyze heart ultrasound tests more quickly and more accurately than board-certified echocardiographers. Researchers at the University of California, San Francisco trained computers and tested them against skilled human technicians at assessing the most common echocardiogram views. Using real-world echo images for the advanced machine learning, they found that the computers correctly assessed echocardiogram videos 91.6 percent to 97.8 percent of the time, compared with 70.3 percent to 83.6 percent when humans reviewed them.
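The study's exact architecture isn't detailed here, but view classification of this kind is typically framed as image classification with a convolutional neural network. Below is a minimal PyTorch sketch under assumed parameters: grayscale 64x64 frames and a hypothetical set of 15 view labels (the real study's inputs and class count may differ).

```python
import torch
import torch.nn as nn

NUM_VIEWS = 15  # hypothetical number of echo view classes; the study's count may differ

class EchoViewClassifier(nn.Module):
    """Small CNN mapping a single grayscale echo frame to a view label."""
    def __init__(self, num_views: int = NUM_VIEWS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_views)  # assumes 64x64 inputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)  # (N, 64, 8, 8) for 64x64 inputs
        return self.classifier(x.flatten(1))

# One training step on a dummy batch standing in for preprocessed echo frames.
model = EchoViewClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

frames = torch.randn(8, 1, 64, 64)          # stand-in grayscale frames
labels = torch.randint(0, NUM_VIEWS, (8,))  # stand-in view labels
optimizer.zero_grad()
loss = loss_fn(model(frames), labels)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```

In practice, per-frame predictions for a video clip would be aggregated, for example by averaging, to produce one view label per clip.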

As the results suggest, this approach could be used to help echocardiographers improve their efficiency, accuracy, and workflow. It could also provide a foundation for better analysis of echocardiographic data. In an echo, numerous video clips, still images, and heart recordings are captured from more than a dozen different angles, or views, many of which have only subtle differences. Interpreting medical images, including echocardiograms, typically requires extensive training.

Although deep learning has been used to detect abnormalities in medical fields such as pathology, radiology, and dermatology, it has not been widely applied to echocardiograms because of the complexity of their multi-view, multi-modality format. Compared with earlier machine learning methods applied to echocardiography, deep learning has the significant advantage of more adaptable training.

The researchers used images from UCSF Medical Center patients aged 20 to 96. Eighty percent were used for training, while the remainder were used for validation and testing. Each board-certified echocardiographer taking part in the study was given randomly selected images. The researchers also found that file sizes could be reduced without losing accuracy, allowing for smaller storage requirements and easier transmission. They accomplished this by removing color and standardizing the sizes and shapes of the videos and still images.
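The paper's preprocessing pipeline isn't reproduced here, but the steps described (removing color, standardizing sizes and shapes, and an 80/20 training/hold-out split) might look roughly like the following sketch; the echo_frames and processed directory names and the 64x64 target size are illustrative assumptions.

```python
import random
from pathlib import Path
from PIL import Image

TARGET_SIZE = (64, 64)  # illustrative; the study's exact dimensions aren't given here

def preprocess(path: Path) -> Image.Image:
    """Drop color and standardize shape, shrinking files without hurting accuracy."""
    img = Image.open(path).convert("L")  # "L" = single-channel grayscale
    return img.resize(TARGET_SIZE)

# 80/20-style split: 80% for training, the rest held out for validation and testing.
paths = sorted(Path("echo_frames").glob("*.png"))  # hypothetical directory of echo stills
random.seed(42)
random.shuffle(paths)
cut = int(0.8 * len(paths))
train_paths, heldout_paths = paths[:cut], paths[cut:]
val_cut = len(heldout_paths) // 2
val_paths, test_paths = heldout_paths[:val_cut], heldout_paths[val_cut:]

Path("processed").mkdir(exist_ok=True)
for p in train_paths:
    preprocess(p).save(Path("processed") / p.name)
```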

