There are many challenging E3 topics that you will need to understand in order to successfully pass your E3 exam. For example, Big Data is a subject which many students find difficult to grasp at first!
At Astranti we aim to make difficult subjects easy to understand. All of our materials are designed to ensure that our students are given in-depth knowledge of all the key subjects, but in a format that allows you to learn easily and effectively.
As an example, we'd like to share with you a new and updated section of our E3 Study Text, which explains Big Data in a straightforward and clear way.
Big data and its uses
Big data is a term used to describe sets of data so large that they simply cannot be analysed and interpreted by standard reporting facilities. The value of big data is that it allows you to draw on one enormous pool of varied data rather than many separate sets. As a result, it becomes possible to identify unusual business trends and correlations that would otherwise be impossible to spot.
Big data has the potential for almost universal application; here are some examples of big data being implemented in the real world:
- Some hospitals use it to monitor patient details and the treatment sought, meaning they can assess the likelihood of readmission and, if it is high, make sure the issue is resolved there and then, saving time and money further down the line.
- Consumer goods companies monitor Facebook and Twitter, gaining a key and uninhibited insight into consumer behaviour which they then use in their marketing campaigns.
- Governments can use it to measure crime rates, as big data allows the inclusion of many other factors which, in theory, can help determine why crime rates are increasing or decreasing rather than just the fact that they are.
Gartner's Three Vs
In a 2001 research report, Gartner outlined three key challenges facing organisations with their data. These three elements are:
Volume – increasing volumes of data mean there is a lot more to manage, and it is harder to extract the key information from it.
Velocity – there is an increasing speed of data in and out, which means the data can quickly change. This means that information analysis needs to be quick to spot and react to the latest change.
Variety – the range of data types and sources of data can be varied, making analysis difficult; for example, data held in different IT systems in an organisation can be hard to bring together to analyse linkages.
Gartner then came up with a formal definition of big data related to these three Vs:
"Big data is high-volume, high-velocity and/or high-variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimisation."
The seven stages of the big data process
The seven key stages and challenges that make up the big data process are as follows:
Capture – What kind of data is needed and how is it going to be captured? This is usually from an indirect source (rather than manual data input); a prime example would be the barcode reader in a retail outlet.
Storage – As you might expect, the amount of data we are talking about cannot simply be saved on a laptop hard drive. Big data sets can require physical systems that take up entire rooms or even buildings. In addition to the sheer size needed, both physically and in terms of storage capacity, you will need to make sure the systems are adequately protected, as you may have access to private customer information.
Curation – Once the data has been captured it must then be organised, controlled and maintained in a way that allows it to be usable and re-usable; in effect, the ongoing, day-to-day upkeep of the data. This may involve the way it is structured on the system to enable it to be analysed.
Analysis – The process of interpreting the data. Millions of pieces of information mean nothing unless you can use them to help answer questions, illustrate results and so on. This could be the ability to separate the data out by date, product or customer, or to make linkages between different types of data, e.g. sales made by customer group at different times of the year.
Visualisation – The data which is analysed needs to be illustrated in a clear and digestible format so that it can be used to make decisions. This may take the form of graphs or condensed, simple tables.
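To make the Analysis and Visualisation stages more concrete, here is a minimal sketch in Python using the pandas library. The data, column names (customer_group, quarter, sales) and figures are purely hypothetical and used only for illustration; a real big data system would draw on the captured and curated data store rather than values typed in by hand.

```python
import pandas as pd

# Hypothetical sales records: in a real big data system these would come
# from the captured and curated data store, not be typed in by hand.
sales = pd.DataFrame({
    "customer_group": ["Retail", "Retail", "Trade", "Trade", "Online", "Online"],
    "quarter": ["Q1", "Q3", "Q1", "Q3", "Q1", "Q3"],
    "sales": [12000, 18500, 9400, 11200, 15300, 22100],
})

# Analysis: separate the data out by customer group and time of year,
# linking the two so that seasonal buying patterns become visible.
summary = sales.pivot_table(
    index="customer_group",
    columns="quarter",
    values="sales",
    aggfunc="sum",
)

# Visualisation: present the result as a condensed, simple table.
print(summary)
```

The same summary could just as easily be passed to a charting tool to produce the graphs mentioned above; the point is that the raw records are reduced to a form a decision maker can digest at a glance.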
Search – When you have as much data as a big data system can compile, you must find a way to search across the vast data landscape to find the information you want. An example of a search system would be Google, which can accurately search through billions of web pages based on a few key search terms. Each big data system needs its own 'Google'-type search facility to access the relevant data and help users find relevant information.
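As a simplified illustration of what such a search facility does, the sketch below filters a small, hypothetical set of records down to those matching a few key terms. Real big data search engines rely on indexing and ranking rather than a simple scan, but the basic idea of matching search terms against a vast collection is the same.

```python
# A toy keyword search over a small, hypothetical collection of records.
records = [
    "Invoice 1042: retail customer, winter promotion, store 12",
    "Invoice 1043: trade customer, bulk order, warehouse 3",
    "Invoice 1044: online customer, winter promotion, web store",
]

def search(items, terms):
    """Return the items that contain every search term (case-insensitive)."""
    terms = [t.lower() for t in terms]
    return [item for item in items if all(t in item.lower() for t in terms)]

# Find every record mentioning the winter promotion.
print(search(records, ["winter", "promotion"]))
```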
Data Sharing and Transfer – Data must be shared with those who need it, so that the relevant people can access the information produced and, indeed, so that relevant information is proactively sent to the people who can best use it.
Big data as a strategic resource
Big data is increasingly becoming of strategic importance. As an example, retailers that understand their customers and their needs better by analysing big data are able to produce better products, target marketing campaigns more effectively and price products in a way that attracts more custom based on past buying patterns. Together, this can provide firms that use big data effectively with a competitive advantage.
Astranti Financial Training.