Since the phrase “Big Data” went viral, everything related to data has sprung up: web scraping, web harvesting, web mining, data analysis, data mining, and so on. These terms are often used interchangeably, which makes the realm of data even more confusing for many people. A clear understanding of them is necessary for businesses to stay well informed in the cutthroat marketing industry.

What is Data Harvesting?

Data harvesting means collecting data and information from online sources. It is often used interchangeably with web scraping, web crawling, and data extraction. Harvesting is an agricultural term for gathering ripe crops from the fields, an act of collection and relocation. Likewise, data harvesting is the process of extracting valuable data from target websites and putting it into your database in a structured format.

To conduct data harvesting, you need an automated crawler that parses the target websites, captures valuable information, extracts the data, and finally exports it into a structured format for further analysis. Data harvesting, therefore, doesn’t involve machine learning or statistics. Instead, it relies on programming languages such as Python, R, or Java. Above all, data harvesting is about accuracy.
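To make that workflow concrete, here is a minimal sketch in Python using the requests and BeautifulSoup libraries. The URL, CSS selectors, and field names are placeholders invented for illustration, not a real target site.

```python
# A minimal web-harvesting sketch: fetch a page, parse it, and export
# structured rows to CSV. The URL and CSS selectors below are
# placeholders -- adjust them to the layout of your target site.
import csv

import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"  # hypothetical target page

response = requests.get(URL, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

rows = []
for item in soup.select("div.product"):   # hypothetical container selector
    name = item.select_one("h2.title")    # hypothetical field selectors
    price = item.select_one("span.price")
    if name and price:
        rows.append({"name": name.get_text(strip=True),
                     "price": price.get_text(strip=True)})

# Export to a structured format (CSV) for further analysis.
with open("products.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "price"])
    writer.writeheader()
    writer.writerows(rows)

print(f"Harvested {len(rows)} records")
```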

There are many data extraction tools and service providers that can conduct web harvesting for you. Data BI Design provides full-service web harvesting, usually at a much better cost than do-it-yourself software.

What is Data Mining?

Data mining is often misunderstood as a process for obtaining data. There are substantial differences between collecting data and mining data, even though both involve extraction. Data mining is the process of discovering fact-based patterns in a large set of data. Rather than just getting the data and making sense of it, data mining is interdisciplinary, integrating statistics, computer science, and machine learning.

As an example, take the famous Cambridge Analytica scandal: the firm collected information on over 60 million Facebook users and isolated those who were uncertain about their votes based on their identity and activity on Facebook. Cambridge Analytica then employed a “psychographic microtargeting” tactic to bombard them with inflammatory messages and shift their votes. It is a typical, yet harmful, application of data mining: it discovers who people are and what they do, and that knowledge in turn serves the goal at hand. Data mining takes data on a massive scale to discover patterns for decision making.

Data mining has five key applications.

Classification

Just like the word implies, data mining is used to put things or people into different categories for further analysis. For example, a bank builds a classification model from loan applications. It gathers millions of applications along with each individual’s bank statements, job title, marital status, education, and so on, then uses algorithms to decide which applications are riskier than others. The moment you fill out the application form and hit submit, the bank already knows which category you belong to and which loan applies to you.
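As a rough sketch of how such a classifier could be built, the Python example below trains a logistic regression model on a handful of made-up applicant features; the data, feature names, and risk labels are invented for illustration, not a real bank dataset.

```python
# Classification sketch: label loan applications as "risky" or "safe"
# from a handful of made-up applicant features. The data is synthetic
# and purely illustrative.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Each row: [income (k$), years employed, existing debt (k$)]
X = [
    [85, 10, 5], [30, 1, 40], [60, 4, 10], [25, 0, 35],
    [95, 12, 2], [40, 2, 30], [70, 6, 8],  [20, 1, 45],
]
y = [0, 1, 0, 1, 0, 1, 0, 1]  # 0 = safe, 1 = risky

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)

# A new application is scored the moment it is submitted.
new_applicant = [[50, 3, 20]]
print("Predicted class:", model.predict(new_applicant)[0])
print("Risk probability:", model.predict_proba(new_applicant)[0][1])
```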

Regression

Regression is used to predict trends based on numerical values in the datasets. It is a statistical analysis of the relationship between variables. For example, you can predict how likely crime is to occur in a specific area based on historical records.
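A minimal sketch of this idea in Python, assuming made-up yearly incident counts for a single area, might look like the following; the numbers are invented purely to show the mechanics.

```python
# Regression sketch: fit a trend line to (made-up) yearly incident counts
# for one area and extrapolate one year ahead. Numbers are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

years = np.array([[2016], [2017], [2018], [2019], [2020], [2021]])
incidents = np.array([310, 295, 280, 270, 255, 240])  # hypothetical counts

model = LinearRegression().fit(years, incidents)

next_year = np.array([[2022]])
print("Predicted incidents for 2022:", round(model.predict(next_year)[0]))
print("Yearly trend (slope):", round(model.coef_[0], 1))
```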

Clustering

Clustering groups data points based on similar traits or values. For example, Amazon groups similar products together based on each item’s description, tags, and functions so that customers can find them more easily.
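Below is a small illustrative sketch that clusters invented product descriptions by textual similarity, using TF-IDF features and k-means; the descriptions and the choice of three clusters are assumptions made for the example.

```python
# Clustering sketch: group short (invented) product descriptions by
# textual similarity using TF-IDF features and k-means.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

descriptions = [
    "wireless bluetooth headphones with noise cancelling",
    "over-ear wired headphones studio monitor",
    "stainless steel water bottle insulated 1 litre",
    "vacuum insulated travel bottle keeps drinks cold",
    "usb-c fast charging cable braided 2m",
    "lightning to usb charging cable for phone",
]

features = TfidfVectorizer().fit_transform(descriptions)
labels = KMeans(n_clusters=3, random_state=0, n_init=10).fit_predict(features)

for desc, label in zip(descriptions, labels):
    print(label, "-", desc)
```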

Anomaly detection

Anomaly detection is the process of detecting abnormal behaviors, also called outliers. Banks employ this method to spot unusual transactions that don’t fit your normal transaction activity.
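As a rough illustration, the sketch below uses scikit-learn’s IsolationForest to flag unusually large amounts in an invented list of transactions; the amounts and the contamination setting are assumptions for demonstration only.

```python
# Anomaly-detection sketch: flag transactions whose amounts look unusual
# compared with a customer's typical spending. Amounts are invented.
import numpy as np
from sklearn.ensemble import IsolationForest

# Mostly routine amounts, plus a couple of obvious outliers.
amounts = np.array(
    [12.5, 8.0, 15.2, 9.9, 11.4, 14.7, 10.3, 950.0, 13.1, 7.8, 1200.0]
).reshape(-1, 1)

detector = IsolationForest(contamination=0.2, random_state=0).fit(amounts)
flags = detector.predict(amounts)  # -1 = outlier, 1 = normal

for amount, flag in zip(amounts.ravel(), flags):
    status = "FLAGGED" if flag == -1 else "ok"
    print(f"{amount:>8.2f}  {status}")
```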

Association learning

Association learning answers the question “how does the value of one feature relate to that of another?” For example, in grocery stores, people who buy soda are more likely to also buy Pringles. Market basket analysis is a popular application of association rules; it helps retailers identify relationships between the products customers buy.
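Here is a bare-bones sketch of market basket analysis in plain Python, counting how often item pairs appear together and reporting their support and confidence; the transactions are invented for illustration.

```python
# Market-basket sketch: count how often pairs of items are bought
# together and report support and confidence for each pair. The
# transactions are invented for illustration.
from collections import Counter
from itertools import combinations

transactions = [
    {"soda", "pringles", "bread"},
    {"soda", "pringles"},
    {"bread", "butter"},
    {"soda", "pringles", "candy"},
    {"bread", "soda"},
]

n = len(transactions)
item_counts = Counter(item for basket in transactions for item in basket)
pair_counts = Counter(
    pair for basket in transactions for pair in combinations(sorted(basket), 2)
)

for (a, b), count in pair_counts.most_common(3):
    support = count / n                  # how often the pair appears overall
    confidence = count / item_counts[a]  # P(b | a)
    print(f"{a} -> {b}: support={support:.2f}, confidence={confidence:.2f}")
```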

These five applications build the backbone of data mining. Data mining, so to speak, is the core of Big Data. The process of data mining is also known as Knowledge Discovery from Data (KDD). It underpins data science, the study of research and knowledge discovery from data. Data can be structured or unstructured and scattered over the internet. The real power comes when each piece is grouped and separated into categories so we can draw patterns, predict trends, and detect anomalies.