Know a new breed of data analytics solutions using GPU

Graphics processing units (GPUs) were originally built by chip companies mainly for the video game industry. A GPU is a single-chip processor that recalculates the 3D rendering every time the scene changes. This work could be done on the central processing unit (CPU), but such mathematically intensive tasks would put a large strain on it. GPUs are built to perform these calculations in parallel and render the images to the monitor.
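As a rough sketch of what that parallel work looks like, the CUDA kernel below applies one 4x4 transformation matrix to every vertex of a model, one GPU thread per vertex. It is illustrative only: names are made up for this example, real engines go through graphics APIs and shaders, and the host-side memory allocation and copies are omitted here.

// Illustrative sketch: one GPU thread per vertex applies the same
// 4x4 transform (row-major matrix m) to that vertex.
__global__ void transformVertices(const float4 *in, float4 *out,
                                  const float *m, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;                       // guard threads past the last vertex

    float4 v = in[i];
    out[i].x = m[0]*v.x  + m[1]*v.y  + m[2]*v.z  + m[3]*v.w;
    out[i].y = m[4]*v.x  + m[5]*v.y  + m[6]*v.z  + m[7]*v.w;
    out[i].z = m[8]*v.x  + m[9]*v.y  + m[10]*v.z + m[11]*v.w;
    out[i].w = m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w;
}

// Launch enough 256-thread blocks to cover every vertex:
// transformVertices<<<(n + 255) / 256, 256>>>(d_in, d_out, d_matrix, n);

Every thread runs the same small piece of arithmetic on its own vertex, which is exactly the pattern the GPU is built to execute by the thousand at once.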

The graphics cards shipped with laptops and desktops come with one or more GPUs integrated into them. Nvidia calls the CPU the brains of the computer and the GPU its soul, and defines GPU computing as the use of a GPU together with a CPU to accelerate scientific, analytics, engineering, consumer, and enterprise applications.

Since most graphical computation involves matrix and vector operations, used to render polygons and to rotate and transform polygons and vertices, the same hardware can be leveraged for other massively parallel computational algorithms. Image processing is highly demanding, and a CPU cannot handle the strain of performing such computations in real time, so data scientists started using the GPU for these calculations. It turns out to be a perfect marriage: image processing needs highly parallel computing, and the GPU works on parallel thread processing, so every pixel in an image can be put through the same function at the same time. This is very efficient and delivers large gains in processing time, and image processing on the GPU has been highly successful. Below are a couple of examples of the gains achieved.

[Figure: 10x speed-up on image detection using neural networks. Dr. Dan Ciresan, Swiss AI Lab IDSIA, Switzerland]

[Figure: World's largest artificial neural networks trained with GPUs. Adam Coates et al., Stanford AI Lab, U.S.A.]
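To make the per-pixel parallelism concrete, here is a minimal CUDA sketch (names are illustrative and host-side allocation and copies are omitted) that converts an RGB image to grayscale with one thread per pixel:

// Illustrative sketch: the same function applied to every pixel
// in parallel, one thread per pixel.
__global__ void rgbToGray(const unsigned char *rgb, unsigned char *gray,
                          int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;    // guard threads outside the image

    int i = y * width + x;
    // Standard luminance weights for R, G, B.
    gray[i] = (unsigned char)(0.299f * rgb[3*i]
                            + 0.587f * rgb[3*i + 1]
                            + 0.114f * rgb[3*i + 2]);
}

// Launch with one 16x16 thread block per image tile:
// dim3 block(16, 16);
// dim3 grid((width + 15) / 16, (height + 15) / 16);
// rgbToGray<<<grid, block>>>(d_rgb, d_gray, width, height);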

Now it is time for big data, machine learning, and deep learning scientists to put this technology to work. GPUs have shown tremendous performance gains in deep learning: training runs that used to take days or even months have been reduced to hours in some instances. The latest trend around the world is to use GPUs to build the supercomputers of the future.
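Much of the arithmetic in training a neural network is dense matrix multiplication, which is why it maps so well to the GPU. Production frameworks hand this off to heavily tuned libraries such as cuBLAS and cuDNN; the naive CUDA kernel below is only a sketch of the underlying parallel pattern, one thread per output element.

// Illustrative sketch: C = A x B, with A of size MxK, B of size KxN,
// C of size MxN. One thread computes one element of C.
__global__ void matmul(const float *A, const float *B, float *C,
                       int M, int N, int K)
{
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row >= M || col >= N) return;

    float sum = 0.0f;
    for (int k = 0; k < K; ++k)               // dot product of one row of A
        sum += A[row * K + k] * B[k * N + col]; // with one column of B
    C[row * N + col] = sum;
}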

Not every algorithm will perform optimally on a GPU, so choosing the right algorithm is crucial. Crunching large data sets for quick analytics and visualization, however, maps well to the hardware. Big data scientists find this a very effective approach and see significant gains in performance. Databases built to run on the GPU can host, slice, and dice billions of records in milliseconds and render the resulting charts just as easily.
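As a hedged sketch of how such a database can parallelize a simple analytic query, the CUDA kernel below computes the equivalent of SELECT SUM(amount) WHERE amount > threshold over one column: each thread scans a stride of rows, and atomicAdd combines the per-thread partial sums. The column name and threshold are illustrative, not taken from any particular product.

// Illustrative sketch: filtered sum over one numeric column.
// *result must be zeroed before the kernel is launched.
__global__ void filteredSum(const float *amount, int n,
                            float threshold, float *result)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    int stride = blockDim.x * gridDim.x;

    float local = 0.0f;
    for (int row = i; row < n; row += stride) // grid-stride loop over the rows
        if (amount[row] > threshold)
            local += amount[row];

    atomicAdd(result, local);                 // combine partial sums into the total
}

// Launch over the whole column, e.g.:
// filteredSum<<<1024, 256>>>(d_amount, n, 100.0f, d_result);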

The norm today is to use a server for the ETL layer and storage, and a client to visualize the results. Aggregation, filtering, sorting, and other functions are performed on the server, and the data is sent to the client. The other common approach is a desktop application backed by an in-memory database. Creating and maintaining this infrastructure is complex and needs a dedicated team. These methods are proven and have stood the test of time, but as data volumes grow and simpler solutions are needed, a GPU-based solution can provide similar or better functionality.

Applications such as finite element analysis (FEA), multibody dynamics, and other math-intensive industrial workloads can also use the GPU for better and faster performance. I have personally combined game physics with FEA for collision detection and response, and will shortly post my experience on the advantages of using game physics in developing industrial simulations.
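As a simple, hypothetical sketch of the per-body parallelism involved (structure and field names are made up for this example, and host setup is omitted), the CUDA kernel below integrates rigid-body motion and resolves a ground-plane collision with one thread per body, the kind of game-physics step that can run alongside a larger simulation:

// Illustrative sketch: explicit Euler integration plus a simple
// ground-plane collision, one thread per body.
struct Body { float3 pos, vel; };

__global__ void stepBodies(Body *bodies, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    Body b = bodies[i];
    b.vel.y -= 9.81f * dt;          // gravity
    b.pos.x += b.vel.x * dt;        // advance position by one time step
    b.pos.y += b.vel.y * dt;
    b.pos.z += b.vel.z * dt;

    if (b.pos.y < 0.0f) {           // collision with the ground plane
        b.pos.y = 0.0f;
        b.vel.y = -0.5f * b.vel.y;  // crude restitution on impact
    }
    bodies[i] = b;
}

// stepBodies<<<(n + 255) / 256, 256>>>(d_bodies, n, 1.0f / 60.0f);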
