Norms are dynamic entities that exist in open multi-agent systems to regulate agents' behavior. For norms to be effective in a domain, agents must first detect that the norms exist. Recent work in this area has proposed mechanisms to endow agents with a norm detection capability. However, existing norm detection approaches require agents to consider only a single norm per event, whereas in reality more than one norm may apply to the same event in a domain. Indeed, a more recent study of norm detection reports that multiple norms can co-exist in a society. We therefore argue that existing approaches do not help agents decide which norms to adopt when multiple norms are detected. To address this problem, we propose the concept of norms trust, which helps agents decide which of the detected norms are trustworthy among the set of norms that exist in an environment. The proposed trust-based norms evaluation consists of a two-tier assessment: a credible agent evaluation, which assesses the trust of neighboring agents, and a norms trust evaluation, which aggregates the norm adoption ratio, norm adoption risk, and norm salience. A working scenario then illustrates the evaluation process.
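The two-tier assessment described above can be sketched in code. This is a minimal illustration, not the paper's actual mechanism: the credibility threshold, the equal weighting of the three components, and the treatment of adoption risk as a penalty (inverted before aggregation) are all assumptions introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One neighboring agent's report about a candidate norm."""
    neighbor_trust: float  # tier-1 trust score of the reporting agent (assumed in [0, 1])
    adopted: bool          # did the neighbor adopt the norm?
    violated: bool         # was the norm violated / sanctioned in the neighbor's view?
    salient: bool          # did the norm appear salient in the observed event?

def norm_trust(observations, credibility_threshold=0.5, weights=(1/3, 1/3, 1/3)):
    """Two-tier norms-trust score (hypothetical aggregation).

    Tier 1: keep only observations from credible neighbors.
    Tier 2: aggregate adoption ratio, adoption risk, and salience
            into a single score in [0, 1].
    """
    # Tier 1: credible agent evaluation — filter by neighbor trust.
    credible = [o for o in observations if o.neighbor_trust >= credibility_threshold]
    if not credible:
        return 0.0  # no credible evidence about this norm

    n = len(credible)
    adoption_ratio = sum(o.adopted for o in credible) / n
    adoption_risk = sum(o.violated for o in credible) / n
    salience = sum(o.salient for o in credible) / n

    # Tier 2: weighted aggregation; risk is inverted so lower risk raises trust.
    w_a, w_r, w_s = weights
    return w_a * adoption_ratio + w_r * (1.0 - adoption_risk) + w_s * salience
```

Under this sketch, an agent would compute `norm_trust` for each detected norm and adopt the one with the highest score; only reports from sufficiently trusted neighbors influence the result.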