Big Data News

Strengthening digital infrastructure and data sovereignty in Europe

A new report, published by EIT Digital, provides a scenario-based framework for the development of digital infrastructure and data policy instruments to strengthen European...

Supercomputer identifies hepatitis drug for COVID-19 treatment

Using the MOGON II supercomputer, researchers from Johannes Gutenberg University Mainz have identified potential drugs to be used to treat COVID-19. Whilst searching for a...

Solving the storage conundrum to accelerate innovation in life sciences

The Innovation Platform speaks to Dale Brantly, Director of Worldwide Storage Systems Engineering at Panasas, about accelerating innovation in life sciences. An inability to process...

How can Big Data help combat catastrophes?

Researchers from Cornell University have led the effort to use Big Data to combat catastrophes by identifying the risk factors for events such as...

Yara and IBM collaborate to encourage farming data exchange

Yara International and IBM have launched The Open Farm & Field Data Exchange, which aims to facilitate collaboration around farm and field data. Yara International,...

Let’s talk about Earth observations

Climate adaptation requires a broad and integrated approach: GEO is well positioned to deliver in this area due to its experience in gathering knowledge...

Water quality parameters and data visualisation in aquaculture

Meet Blue Unit, the company with the world’s most advanced automatic data visualisation system to monitor water quality parameters. The monitoring of changes in water...

Digital transformation: public sector needs a back to basics approach

What does it take for wide-scale digital transformation to succeed in a cash-strapped, highly regulated world, where any...

Cybersecurity and data protection: when mistakes make headlines

Andy Barratt, UK MD at cybersecurity consultancy Coalfire, explores ways to strengthen cybersecurity and data protection policy in the public sector. Everyone makes mistakes. After...

Innovative genomics and bioinformatics research at the Greehey Institute

Dr Siyuan Zheng discusses how the Greehey Institute is continuing to advance the care of children affected by cancer through innovative and translational research. The...

The European Science Cloud initiative: achieving open science in astrophysics and particle physics

The European Union has launched the European Open Science Cloud (EOSC) initiative to support data-driven research in the pursuit of excellent science. Together, astrophysics and...

Megacity data centres: solving infrastructure issues

Neil Cresswell, CEO at VIRTUS Data Centres, explains the challenges facing megacities, highlighting how these issues can be resolved. The prospect of better employment and...

Big Data

Big data refers to larger, more complex data sets, especially those from new data sources. These extremely large data sets can be analysed computationally to reveal patterns, trends, and associations.

These data sets are so voluminous that traditional data processing software simply cannot manage them.
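As an illustration of why scale forces different tooling, a minimal sketch of processing a file too large to load into memory by streaming it in fixed-size batches. The column name `category` and the chunk-based aggregation are hypothetical details for this example, not from any system described above:

```python
import csv
import tempfile
from collections import Counter

def stream_category_counts(path, chunk_size=10_000):
    """Aggregate counts of a 'category' column without loading the whole file."""
    counts = Counter()
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        batch = []
        for row in reader:
            batch.append(row["category"])
            if len(batch) >= chunk_size:  # flush a full batch into the running totals
                counts.update(batch)
                batch.clear()
        counts.update(batch)  # flush the final partial batch
    return counts

# Demo on a small synthetic file standing in for a dataset too big for memory.
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False, newline="") as f:
    f.write("category\n")
    f.writelines(f"{c}\n" for c in ["a", "b", "a", "c", "a"])
    sample_path = f.name

print(stream_category_counts(sample_path, chunk_size=2))
# Counter({'a': 3, 'b': 1, 'c': 1})
```

Because only one batch is held in memory at a time, the same approach scales to files far larger than RAM; real big-data platforms generalise this idea by distributing the batches across many machines.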

These massive volumes of data can be used to address business problems that could not have been tackled before.

Big data makes it possible to obtain more complete answers, because more information is available.

More complete answers mean more confidence in the data—which means a completely different approach to tackling problems.