GPUs in the Era of Artificial Intelligence


The first graphics processing unit (GPU) was developed in 1999 by NVIDIA and was called the GeForce 256. This GPU could process 10 million polygons per second and had more than 22 million transistors. The GeForce 256 was a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines, along with drawing and BitBLT support.

Conventionally, computing power has been tied to the number of CPUs and the cores per processing unit. As Wintel platforms entered the enterprise data center, application performance and data throughput were directly proportional to the number of CPUs and the available RAM. While these factors remain important to the performance of enterprise applications, a new processor began to gain attention: the graphics processing unit, or GPU.

In the era of machine learning and AI, however, GPUs have found a new role that makes them as relevant as CPUs. Deep learning is an advanced machine learning technique that is heavily used in AI and cognitive computing. It powers many scenarios, including autonomous cars, computer vision, cancer diagnosis, and speech recognition, among many other intelligent use cases.

GPUs became more popular as the demand for graphics applications expanded. Eventually, they became not just an enhancement but a necessity for optimal computer performance. This purpose-built logic chip enables rapid graphics and video processing. Typically, a discrete GPU connects to the CPU but sits apart from the motherboard, and its random access memory (RAM) is attached through the accelerated graphics port (AGP) or the PCI Express (PCIe) bus. Some GPUs are instead integrated into the northbridge on the motherboard and use the main memory as a digital storage area, but these integrated GPUs are slower and offer poorer performance.
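The difference between a discrete board with its own memory and an integrated GPU sharing main memory can be observed programmatically. Here is a minimal sketch using NVIDIA's CUDA runtime API (an assumed toolkit for illustration; the post itself names no specific one) that lists each GPU and reports whether it is integrated:

#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    int count = 0;
    cudaGetDeviceCount(&count);

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);

        printf("GPU %d: %s\n", i, prop.name);
        printf("  Device memory: %zu MiB\n",
               prop.totalGlobalMem / (1024 * 1024));
        // 'integrated' is nonzero for GPUs that share the host's
        // main memory instead of carrying their own, as noted above.
        printf("  Integrated:    %s\n", prop.integrated ? "yes" : "no");
    }
    return 0;
}

On a machine with a discrete card, the reported device memory is the card's own RAM; on an integrated GPU it is carved out of main memory.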

Most GPUs devote their transistors to 3D computer graphics. However, some have increased memory for mapping vertices, as used in geographic information system (GIS) applications. More modern GPU technology supports programmable shaders that implement textures, mathematical vertices, and accurate color formats. Applications such as computer-aided design (CAD) can process over 200 billion operations per second and deliver up to 17 million polygons per second. Many scientists and engineers also use GPUs for in-depth numerical studies that exploit their vector and matrix capabilities.
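As a concrete illustration of those vector capabilities, the following is a minimal CUDA sketch (again an assumed toolkit, not one the post specifies) that adds two large vectors, with each GPU thread handling one element so that thousands of additions run in parallel:

#include <stdio.h>
#include <cuda_runtime.h>

// Each thread computes one element of c = a + b.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;               // one million elements
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    // Unified memory keeps the sketch short; explicit
    // cudaMalloc/cudaMemcpy transfers would also work.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %.1f\n", c[0]);       // expected: 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

The same thread-per-element pattern scales up to the matrix operations at the heart of deep learning workloads.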
