Ensuring the accuracy of information or data contributed by the crowd is among the key challenges in crowdsourcing initiatives. Because of their openness, crowdsourcing initiatives also receive data that do not meet the criteria set by the crowdsourcer. There is therefore a need to ensure that only valid data are captured before the data are processed further. However, manually validating the data is impractical given the high volume of data involved in crowdsourcing and their unpredictable nature. In this research, an automated algorithm to validate crowdsourced data was therefore developed. The objective was to identify the processes needed for the validation of crowdsourced data to be performed automatically. Two types of validation were included: task validation and worker validation. The Kuder-Richardson Formula 20 (KR-20) was used to compute task validity, while the mean, converted to a percentage, was used to compute worker validity. The algorithm was implemented by embedding it in a prototype crowdsourcing application called AsnafCircle, which crowdsources information on eligible asnaf (alms recipients) from the public. Evaluation showed that the algorithm was able to automatically compute the values that determine task and worker validity. Evaluation by experts also confirmed the necessity of the processes that constitute the algorithm. This algorithm helps ensure the validity of contributed data in crowdsourcing initiatives, thereby improving their reliability.
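To illustrate the task-validity measure named above, the standard KR-20 computation can be sketched as follows. This is a minimal sketch, not the paper's implementation: the function name, the dichotomous 0/1 encoding of worker responses, and the use of population variance are assumptions for illustration only.

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) item responses.

    responses: one answer vector per worker, e.g. [[1, 0, 1], [1, 1, 1], ...]
    Returns the KR-20 reliability coefficient:
        KR20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total scores))
    """
    k = len(responses[0])   # number of items in the task
    n = len(responses)      # number of workers

    # p_i: proportion of workers answering item i with 1; q_i = 1 - p_i
    p = [sum(r[i] for r in responses) / n for i in range(k)]
    pq_sum = sum(pi * (1 - pi) for pi in p)

    # population variance of the workers' total scores
    totals = [sum(r) for r in responses]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n

    return (k / (k - 1)) * (1 - pq_sum / var_total)
```

A coefficient closer to 1 indicates more internally consistent responses for the task; the threshold at which a task is deemed valid would be set by the crowdsourcer.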