Should data quality control be done only at the quality control stage? [We believe that should NOT be the case.]
Data quality control is one of the most pressing concerns in research. Managing data quality even before the formal quality control stage significantly improves data usability and reduces wasted time and effort across the whole project workflow.
- Does your quality control start only when you receive submissions from your shoppers / surveyors?
- Do you often need to validate results for accuracy and validity?
- Are you conducting double surveys to validate the same sample?
If your answer is Yes to all of the above, almost a quarter (if not half) of your total project time frame is already used up, and the budget for a single sample may spill over into your next project.
6 Ways Checker's integrated features help boost data quality before the quality assurance stage
Maintain good quality control throughout the workflow to deliver desirable, useful end results:
Training, testing and certification
- Train, test and certify your shoppers / surveyors before the actual visit / survey, especially for studies that require in-depth sharing of information and insights.
- This step helps ensure that the right data delivery is achievable and that useful feedback can be collected for further analysis.
Respondent grouping and tagging
- Separate your respondents into groups based on their characteristics, or tag them according to, for example, their past performance or activity level.
- This is useful for profile matching during the assignment stage, where only desirable profiles pass through the profile filters.
- Tag poorly performing profiles as blacklisted to avoid repeated unsatisfactory deliveries.
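To make the grouping and tagging step concrete, here is a minimal Python sketch. The tag names, score field and thresholds are illustrative assumptions, not Checker's actual data model:

```python
# Illustrative respondent tagging; tag names and the performance
# threshold are assumptions, not Checker's actual API.

def tag_respondent(profile: dict, min_score: float = 3.5) -> dict:
    """Attach quality tags to a respondent profile based on past performance."""
    tags = set(profile.get("tags", []))
    if profile.get("avg_score", 0.0) >= min_score:
        tags.add("good-performer")
    else:
        tags.add("blacklisted")  # avoid repeated unsatisfactory deliveries
    if profile.get("jobs_last_90_days", 0) >= 5:
        tags.add("active")
    return {**profile, "tags": sorted(tags)}

respondents = [
    {"name": "A", "avg_score": 4.2, "jobs_last_90_days": 7},
    {"name": "B", "avg_score": 2.1, "jobs_last_90_days": 1},
]
tagged = [tag_respondent(r) for r in respondents]
```

Once tags like these exist in the database, the later assignment stage only has to filter on them rather than re-evaluate each respondent from scratch.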
Controlled job assignment
- Control job assignments so that jobs go only to qualified respondent profiles / shoppers, matching each job with the people who fit its requirements.
- Prioritise those who have passed the training and certification stages or have a good performance record in your database. Allow unsuited profiles to apply for the job only if they pass a qualifying test and can meet the profile requirements.
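The eligibility logic described above can be sketched as a simple filter. The field names (`certified`, `passed_qualifying_test`, `required_tags`) are hypothetical, used here only to show the shape of the rule:

```python
# Illustrative assignment filter; field names and the qualifying-test
# rule are assumptions, not Checker's actual feature set.

def eligible_for_job(profile: dict, job: dict) -> bool:
    """A profile qualifies if it is certified (or has passed a qualifying
    test), is not blacklisted, and carries the job's required tags."""
    tags = set(profile.get("tags", []))
    if "blacklisted" in tags:
        return False
    if not (profile.get("certified") or profile.get("passed_qualifying_test")):
        return False
    return set(job.get("required_tags", [])) <= tags

job = {"required_tags": ["mystery-shopping"]}
profiles = [
    {"name": "A", "certified": True, "tags": ["mystery-shopping"]},
    {"name": "B", "certified": False, "tags": ["mystery-shopping", "blacklisted"]},
]
qualified = [p["name"] for p in profiles if eligible_for_job(p, job)]
# qualified == ["A"]
```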
Built-in validation rules
- Include important validation rules (logic, conditions, mandatory questions, proof attachments) when building your questionnaire. These rules are essential for fault control during the data collection stage later on.
- Validation rules can take the form of alert messages or pop-ups triggered whenever quality is compromised: for example, when an answer is inconsistent with a related question, a question is left unanswered, a text answer is incomplete, or a media proof is missing. Unless the respondent fixes the issue at that very moment, they cannot proceed to the next question / step.
- Validation rules help ensure completeness and conformity of required data during the fieldwork stage.
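The checks above can be sketched as a small rule function. The question names, the consistency pairing and the 20-character minimum are assumptions for illustration, not Checker's rule engine:

```python
# Hedged sketch of questionnaire validation rules; question names,
# messages and thresholds are illustrative assumptions.

def validate_response(answers: dict) -> list[str]:
    """Return alert messages; an empty list means the respondent may proceed."""
    alerts = []
    # Mandatory question left unanswered
    if not answers.get("visit_date"):
        alerts.append("Visit date is required.")
    # Consistency between related questions: a purchase needs a media proof
    if answers.get("made_purchase") and not answers.get("receipt_photo"):
        alerts.append("Please attach the receipt photo as proof of purchase.")
    # Completeness of free-text answers
    if len(answers.get("comments", "")) < 20:
        alerts.append("Please give more detail in your comments (min. 20 characters).")
    return alerts

# An incomplete submission triggers all three alerts
alerts = validate_response({"made_purchase": True, "comments": "Good service"})
```

In a fielded questionnaire, a non-empty alert list would block the "next" button until the respondent resolves every issue on the spot.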
Scheduled reminders and notifications
- Schedule reminders and notifications to keep your shoppers / surveyors aware of their job assignments and submission deadlines.
- This helps fieldwork finish on time, and ample lead time gives fieldworkers the chance to prepare properly and perform better when the actual survey takes place.
Enforcement of GPS stamping
- Make use of location-based features such as GPS detection during onsite data collection on mobile with Checker Field Surveyor.
- This records the actual location and time of data collection for both text and media responses, backing up every response's authenticity as freshly gathered data rather than data re-used from past surveys.
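Conceptually, GPS stamping just attaches a capture location and timestamp to each response record. The record shape below is an assumption for illustration, not Checker Field Surveyor's actual format:

```python
# Illustrative GPS / time stamping of a response record; the record
# shape is an assumption, not Checker Field Surveyor's format.
from datetime import datetime, timezone

def stamp_response(response: dict, lat: float, lon: float) -> dict:
    """Attach capture location and UTC time so the response can later be
    verified as freshly gathered onsite, not re-used past data."""
    return {
        **response,
        "gps": {"lat": lat, "lon": lon},
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }

stamped = stamp_response(
    {"answer": "Store was clean", "photo": "front.jpg"},
    lat=3.1390, lon=101.6869,
)
```

At quality control time, the stamp lets reviewers cross-check that the recorded location and time match the assigned outlet and visit window.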
By applying the quality-driven processes above, the data transferred to the quality control department will be more complete, conform to basic requirements and, most importantly, be valid and useful for further analysis, on top of the quality assurance standards applied before final approval for report generation.
Regardless of whether you are an early adopter of digital research solutions or have already partially or fully automated your work processes, does your current process help you manage data quality effectively?