Big data technologies are vital because of their ability to handle extremely large, diverse data sets that, when analysed, can reveal patterns, trends, and associations, especially relating to human behaviour and interactions.

Data Trends

Listed below are the key data trends impacting the big data theme, as identified by GlobalData.

Central governance

Many big data vendors have had to contend with a growing market perception that data governance, security, and management have taken a back seat to accessibility and speed. In response, most companies are now accepting the challenge and openly prioritising data governance. This is expected to result in multiple disparate solutions being replaced by single data management platforms. It may also result in more active partnerships with outside providers or even acquisitions.

Data democratisation

The transformative value of data-driven business insights has led to market demands that data be made available to the widest applicable base of users, enabling them to draw insights through a self-service analytics model. This driver began with an emphasis on data consumers and has now expanded to target data producers, with new tools supporting data analysis and the creation of visualisations.

This trend has already begun to transform the publishing industry, with data journalism going mainstream.

Downstream analytics

Rapid decision-making is a key priority for battlefield commanders, and such decisions cannot wait for information to be uploaded to the cloud. For instance, Microsoft has begun to revamp its own platforms to push critical Internet of Things (IoT) analytics functions, such as predictive artificial intelligence (AI) algorithms, downstream to devices.

This will allow armed forces to disseminate information amongst field users, providing a critical utility that can be accessed rapidly.
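As a minimal sketch of the idea, the Python code below scores sensor readings on the device itself, so a decision is available immediately and only flagged events need to travel upstream. The window size and threshold are illustrative assumptions, not taken from any vendor's platform.

```python
# Minimal sketch of downstream (edge) analytics: score sensor
# readings on the device rather than uploading raw data to the
# cloud. WINDOW and ALERT_THRESHOLD are illustrative assumptions.
from collections import deque

WINDOW = 50            # rolling window of recent readings
ALERT_THRESHOLD = 3.0  # flag readings more than 3 standard deviations out

readings = deque(maxlen=WINDOW)

def score_on_device(value):
    """Return True if the reading looks anomalous.

    Runs entirely on the edge device, so the decision is available
    immediately; only flagged events need to reach the cloud.
    """
    readings.append(value)
    if len(readings) < WINDOW:
        return False  # not enough history yet
    mean = sum(readings) / len(readings)
    variance = sum((x - mean) ** 2 for x in readings) / len(readings)
    std = variance ** 0.5 or 1.0  # guard against a zero-variance window
    return abs(value - mean) / std > ALERT_THRESHOLD

# Simulated stream: steady readings followed by one spike.
for value in [1.0] * 60 + [9.5]:
    if score_on_device(value):
        print(f"anomaly detected locally: {value}")  # only this goes upstream
```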

Data integration

Owing predominantly to market demands for data democratisation, enterprise buyers now need data integration and preparation tools that can connect disparate data sources without sacrificing data quality or security. SnapLogic's Intelligent Integration platform, for example, can replace extract, transform, load (ETL) processes and recommend solutions to an organisation's data scientists.
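To make the ETL pattern concrete, here is a small, generic sketch using Python's standard library that extracts records from two disparate sources, normalises them into one schema, and loads them into a single store. The file and field names are hypothetical, and this is not SnapLogic's API; commercial platforms add connectors, scheduling, lineage, and governance on top of this basic shape.

```python
# Generic ETL sketch: extract from two disparate sources (CSV and
# JSON), transform to a shared schema, load into one store.
# File and field names are hypothetical.
import csv
import json
import sqlite3

def extract():
    # Source 1: a CSV export (hypothetical file and columns).
    with open("sales.csv", newline="") as f:
        yield from csv.DictReader(f)
    # Source 2: a JSON dump (hypothetical file and keys).
    with open("orders.json") as f:
        yield from json.load(f)

def transform(record):
    # Normalise divergent field names and types into one schema.
    return (
        str(record.get("customer_id") or record.get("cust")),
        float(record.get("amount") or record.get("total") or 0.0),
    )

def load(rows):
    con = sqlite3.connect("warehouse.db")
    con.execute("CREATE TABLE IF NOT EXISTS sales (customer_id TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    con.commit()
    con.close()

load(transform(r) for r in extract())
```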

Data privacy and data protection

Increased regulation around the storage and processing of data is highly likely. It is already underway in Europe in the form of the General Data Protection Regulation (GDPR), which came into force in May 2018.

AI for data quality

One of the benefits of using AI is that it can greatly improve data quality. This improvement is needed in any analytics-driven organisation, where the proliferation of personal, public, cloud, and on-premises data has made it nearly impossible for IT to keep up with user demand.

AI-based data visualisation tools, such as Qlik's Sense platform and Google Data Studio, are enabling enterprises to identify the critical data sets that need attention for business decision-making.
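As an illustration of the underlying technique rather than any vendor's implementation, the short Python sketch below uses scikit-learn's isolation forest to flag records whose values look inconsistent with the rest of a data set, so analysts can review them before they feed business decisions. The sample data is invented.

```python
# Sketch of ML-assisted data quality screening with an isolation
# forest; illustrative only, not how Qlik or Google implement it.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
clean = rng.normal(loc=100.0, scale=5.0, size=(200, 1))  # plausible values
dirty = np.array([[1000.0], [-50.0]])                    # likely entry errors
values = np.vstack([clean, dirty])

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(values)  # -1 marks suspected outliers

print("records flagged for review:", values[labels == -1].ravel())
```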

AI-ready data

To speed time-to-market for custom-built AI tools, technology vendors are introducing pre-enriched, machine-readable data specific to given industries. For example, the IBM Watson Data Kit for food menus includes 700,000 menus from across 21,000 US cities and covers menu dynamics such as price, cuisine, and ingredients.
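The sketch below shows what consuming such pre-enriched, machine-readable data can look like. The JSON schema is invented for illustration and is not IBM's actual Watson Data Kit format; the point is that enrichment, such as cuisine labels, arrives ready for analysis with no cleaning step.

```python
# Consuming pre-enriched menu data (hypothetical schema, not the
# real Watson Data Kit format): enriched fields such as cuisine
# labels arrive ready for analysis, so no cleaning step is needed.
import json
from statistics import mean

sample = '''
[
  {"city": "Austin", "cuisine": "tex-mex", "item": "tacos", "price": 8.5},
  {"city": "Austin", "cuisine": "tex-mex", "item": "queso", "price": 6.0},
  {"city": "Boston", "cuisine": "seafood", "item": "chowder", "price": 9.75}
]
'''

menus = json.loads(sample)

# Average price per cuisine, straight from the enriched fields.
by_cuisine = {}
for row in menus:
    by_cuisine.setdefault(row["cuisine"], []).append(row["price"])

for cuisine, prices in sorted(by_cuisine.items()):
    print(cuisine, round(mean(prices), 2))
```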

Global data centre build-out

The overlapping demands of IoT, cloud computing, streaming media, gaming, and AI are creating a bottleneck in the data centre market, which internet giants such as Amazon and Microsoft are rushing to alleviate. As data sovereignty gains momentum, companies are also looking to invest in edge data centres located closer to the data source or the end user, to avoid issues pertaining to data privacy regulations.

Hyperscale data centres

Internet giants such as Amazon, Microsoft, Google, Facebook, Alibaba, Baidu, and Tencent have been bringing hyperscale data centres on stream over the last several years. These are data centres running upwards of 100,000 servers, their software-defined equivalents, or vast arrays of cheap bare-metal servers. Industrial and commercial giants such as GE, Toyota, Fanuc, and Goldman Sachs are going hyperscale as well.

Cloud computing

As computing moves from in-house corporate data centres to third-party cloud data centres, corporations need to buy less of their own networking gear. Moreover, the big cloud data centres run by the likes of Amazon and Microsoft are increasingly using custom networking kit from smaller suppliers such as Arista Networks. The losers may be the proprietary networking equipment makers such as NETGEAR and Belkin, unless they adapt quickly to the software-led environment that is coming.

This is an edited extract from the Big Data in Defense – Thematic Research report produced by GlobalData Thematic Research.