Future of big data: What are the leading technology trends?




Big data has emerged in response to the exponential growth of data, which is itself the result of a combination of technology trends. These include (but are not limited to) the ubiquity of mobile devices, the widespread use of social media, and the rise of the Internet of Things (IoT).

Leading big data technology trends

Listed below are the leading big data technology trends, as identified by GlobalData.

Edge computing

Edge computing covers use cases where more data processing is done at the edge of the network, nearer to the data source, keeping processing and analytics close to the points of collection. The growth of edge computing is therefore closely associated with the IoT. The proliferation of enterprise IoT initiatives and consumer IoT offerings (such as automated home devices) will drive demand for edge computing solutions. The deployment of 5G cellular technologies will be a major stimulus for both IoT and edge computing.
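The pattern described above can be sketched in a few lines: instead of streaming every raw reading to the cloud, an edge device aggregates data locally and sends only a compact summary upstream. This is an illustrative sketch, not code from the article; the sensor readings and summary fields are hypothetical.

```python
def summarise_readings(readings):
    """Aggregate raw sensor readings at the edge into a small summary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# Simulated raw temperature readings collected at the network edge.
raw = [21.0, 21.4, 20.8, 22.1, 21.9, 21.2]

summary = summarise_readings(raw)
print(summary)  # only a handful of values leave the device, not the full stream
```

Sending the summary rather than the raw stream is what keeps bandwidth and latency low when thousands of IoT devices report at once.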

Quantum computing

The race to reach quantum supremacy – the point at which a quantum computer can carry out calculations faster than a classical computer ever could – is well underway. Google, IBM, and Microsoft are leading the pack. IBM unveiled the first quantum computer designed for commercial use, the Q System One, in March 2019. AI, and particularly machine learning, will benefit: quantum computers will complete extremely complex calculations involving large data sets in a fraction of the time. For effort-intensive AI chores such as classification, regression, and clustering, quantum computing opens a new realm of performance and scale.
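One way to see why quantum machines promise this scale: an n-qubit register is described by 2^n complex amplitudes, so each added qubit doubles the state space a classical computer would have to track explicitly. The sketch below simply illustrates that exponential growth; it is not a quantum simulation.

```python
def state_vector_size(n_qubits):
    """Number of complex amplitudes needed to describe an n-qubit state."""
    return 2 ** n_qubits

# Each extra qubit doubles the classical bookkeeping required.
for n in (1, 10, 20, 30):
    print(n, state_vector_size(n))
```

Thirty qubits already imply over a billion amplitudes, which hints at why certain large-data-set calculations are out of reach for classical hardware but natural for a quantum device.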

AI chips

Central processing units (CPUs) have powered data centres for decades, but new workloads stemming from technologies such as AI and the IoT are pushing CPU architectures to their limits. Graphics processing units (GPUs), once used primarily for gaming, can process many threads in parallel, making them ideal for training and modelling large predictive data models. As the criterion for data centres shifts from calculation speed to search speed, GPUs are moving into data centres. However, while GPUs are ideally suited to training neural networks, field programmable gate arrays (FPGAs) show signs of being better at execution.
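The parallelism the section attributes to GPUs can be sketched with a matrix-vector product, the core operation of neural-network training: each row's dot product is independent of the others, so all rows can be computed concurrently. A thread pool on a CPU only approximates what a GPU does across thousands of cores, and the matrix here is purely illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def dot(row, vec):
    """Dot product of one matrix row with the input vector."""
    return sum(a * b for a, b in zip(row, vec))

matrix = [[1, 2], [3, 4], [5, 6]]
vector = [10, 20]

# Each row's result is independent, so the rows can be computed in
# parallel -- exactly the shape of workload GPUs excel at.
with ThreadPoolExecutor() as pool:
    result = list(pool.map(lambda row: dot(row, vector), matrix))

print(result)  # [50, 110, 170]
```

A CPU would typically walk these rows one after another; hardware that runs many such independent threads at once is what makes large model training tractable.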


Article Credit: Verdict