logistics
Improving product quality and service standards, and pursuing trade connections with multiple countries
big data

Big data refers to data sets that cannot be captured, managed and processed with conventional software tools within a reasonable time frame. It is a massive, fast-growing and diverse information asset that requires new processing models to deliver stronger decision-making power, insight and process-optimization capability.

In Big Data: A Revolution That Will Transform How We Live, Work, and Think, Viktor Mayer-Schönberger and Kenneth Cukier argue that big data means analyzing all available data rather than taking the shortcut of random sampling (sample surveys). IBM has proposed the 5V characteristics of big data: Volume (scale), Velocity (speed), Variety (diversity), Value (low value density) and Veracity (authenticity).

The research and advisory firm Gartner gives this definition: "big data" is a massive, fast-growing and diverse information asset that requires new processing models to deliver stronger decision-making power, insight and process-optimization capability.

The McKinsey Global Institute defines it as a data set so large that it exceeds the capability of traditional database software tools to acquire, store, manage and analyze it. It has four characteristics: massive data scale, rapid data flow, diverse data types and low value density.

The strategic significance of big data technology lies not in possessing huge volumes of data but in the specialized processing of the meaningful data within them. In other words, if big data is compared to an industry, the key to profitability is improving the "processing capability" of data and realizing its "added value" through that processing.

From a technical point of view, big data and cloud computing are as inseparable as the two sides of a coin. Big data cannot be processed on a single computer; it requires a distributed architecture whose hallmark is distributed mining of massive data sets, and it therefore depends on cloud computing's distributed processing, distributed databases, cloud storage and virtualization technology.


With the advent of the cloud era, big data has attracted more and more attention. Analysts note that "big data" is often used to describe the large volumes of unstructured and semi-structured data a company creates, data that would take too much time and money to load into relational databases for analysis. Big data analysis is often associated with cloud computing because analyzing large data sets in real time requires frameworks like MapReduce to distribute the work across dozens, hundreds or even thousands of computers.
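
To make the MapReduce idea above concrete, the following is a minimal single-process Python sketch of the pattern, not a real framework: a map step emits (word, 1) pairs, a shuffle step groups the pairs by key, and a reduce step sums each group. The function names and sample records are illustrative assumptions; an actual MapReduce system runs these same phases in parallel across many worker machines.

from collections import defaultdict
from typing import Iterable, Iterator

def map_phase(record: str) -> Iterator[tuple[str, int]]:
    # Map: split each input record into (word, 1) pairs.
    for word in record.lower().split():
        yield (word, 1)

def shuffle_phase(pairs: Iterable[tuple[str, int]]) -> dict[str, list[int]]:
    # Shuffle: group intermediate values by key, as a framework would
    # when routing pairs to reducer machines.
    groups: dict[str, list[int]] = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key: str, values: list[int]) -> tuple[str, int]:
    # Reduce: aggregate all values observed for one key.
    return (key, sum(values))

if __name__ == "__main__":
    # Illustrative input; in a cluster these records would be
    # partitioned across dozens or thousands of worker nodes.
    records = [
        "big data needs new processing modes",
        "big data analysis relies on distributed processing",
    ]
    intermediate = (pair for record in records for pair in map_phase(record))
    grouped = shuffle_phase(intermediate)
    counts = dict(reduce_phase(key, values) for key, values in grouped.items())
    print(counts)  # e.g. {'big': 2, 'data': 2, 'needs': 1, ...}

This three-phase structure is what lets frameworks such as Hadoop scale horizontally: the map and reduce phases are independently parallel across records and keys, so adding machines adds throughput.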

Big data requires special techniques to process large volumes of data effectively within a tolerable elapsed time. Technologies suited to big data include massively parallel processing (MPP) databases, data mining, distributed file systems, distributed databases, cloud computing platforms, the Internet and scalable storage systems.