Every year, around 25-30% of data becomes inaccurate, leading to less effective sales and marketing campaigns. In most businesses, about 40% of lead data is reported to be inaccurate in some way, making it virtually impossible for those businesses to achieve their full potential.
The amount of data we produce each day is truly mind-boggling: 2.5 quintillion bytes at our current pace, a figure only set to increase with new developments in the Internet of Things (IoT), quantum, edge and cloud computing. To put that into context, Google processes 40,000 searches every second, with over 5 billion searches per day across all providers. Beyond that, every minute users watch over 4 million YouTube videos, send 456,000 tweets and post 47,000 photos on Instagram. All of this is before we even begin to talk about text messages, emails, Skype calls and Tinder swipes!
On the B2B side, marketers and sales reps face constant churn: contacts change job titles and companies every couple of years, while companies move locations, get acquired or close their doors. So how can you keep track of all those changes?
With all of that in mind, it’s no wonder businesses see inaccuracies, and this is the reason why data quality is becoming a strategic priority.
Data Quality Definition
Data quality can be defined simply as having good data accuracy.
In less ambiguous terms, quality data is useful data: consistent and always clear. This means making sure formats are alike, duplicates are removed, addresses, phone numbers and emails are verified, and typos are corrected so that the stored information is valuable to business activities.
It is difficult to provide a data quality definition that suits all but the main activities revolve around logically rationalizing the information stored within the business. Data is generally thought to be of high quality if it is able to meet the following terms:
Accurate – contact details are up to date so you can get in touch with customers and prospects.
Relevant – there is a use for the data in the business. For example, you may have a government agency in your list of prospect companies when you are selling primarily to tech companies.
Complete – all pertinent information is stored. For example, the full address and not just parts of it.
Easily understood – everybody in the business should be able to interpret the data.
Unique – there is a single customer view of the data, without duplication.
Timely – the data represents reality from a required point in time. In other words, the data is current.
Consistent – the data is consistent across disparate sources, where relevant to the business.
Standardized – certain fields are standardized so that you can pull relevant data with ease rather than spend days mining fields. Examples of standardized fields include industry values, job titles, states and countries.
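Several of the dimensions above (standardized formats, verified emails, deduplication) can be automated with simple scripts. The following is a minimal sketch, assuming contact records with `email`, `state` and `phone` fields; the field names, sample values and `STATE_MAP` lookup are illustrative assumptions, not a prescribed tool:

```python
import re

# Hypothetical contact records (field names are assumptions for illustration).
contacts = [
    {"email": "Jane.Doe@Example.com ", "state": "california", "phone": "555-010-0001"},
    {"email": "jane.doe@example.com",  "state": "CA",         "phone": "5550100001"},
    {"email": "bob@",                  "state": "NY",         "phone": "555-010-0002"},
]

# Standardization lookup for one field type (assumed values).
STATE_MAP = {"california": "CA", "new york": "NY"}

def standardize(record):
    """Normalize formats so equivalent values compare equal."""
    rec = dict(record)
    rec["email"] = rec["email"].strip().lower()
    state = rec["state"].strip().lower()
    rec["state"] = STATE_MAP.get(state, state.upper())
    rec["phone"] = re.sub(r"\D", "", rec["phone"])  # keep digits only
    return rec

def is_valid_email(email):
    # Basic sanity check, not full RFC 5322 validation.
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email) is not None

def deduplicate(records, key="email"):
    """Keep one record per key value: a single customer view."""
    seen, unique = set(), []
    for rec in records:
        if rec[key] not in seen:
            seen.add(rec[key])
            unique.append(rec)
    return unique

cleaned = deduplicate([standardize(c) for c in contacts])
valid = [c for c in cleaned if is_valid_email(c["email"])]
```

Standardizing before deduplicating matters: the first two records differ only in casing and whitespace, so they collapse into one record only after their formats are normalized.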
Whilst there isn’t a single standard data quality definition, organizations that adhere to these factors will be far better positioned to make correct decisions. If you are looking to increase your data quality, it is advisable to create a data quality report to monitor your data and to get a data audit to assess its current state. A free data audit is available from StrategicDB.