The ability of rational agents to work in a team is one of the special characteristics that helps them achieve shared goals. However, under certain conditions conflict may arise in an agent society, especially when selfish agents are present. A selfish agent places the achievement of its own goal as the top priority in its actions. To strike a balance between the achievement of an agent's own goal and the shared goal, strong moral values should be instilled in software agent environments. Sincerity is one of the strong human moral values that has been shown to benefit human society and reduce selfish attitudes. By adapting sincere human behaviour, conflicts in a software agent society could be mitigated. Sincere behaviour can also make a software agent more rational in its actions and more sensitive to its working environment. In our work, we study the situations in which software agents demonstrate sincere behaviour in task performance. This paper proposes algorithms for the sincere actions of rational agents when completing a given task.
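The balance described above, between an agent's own goal and the team's shared goal, can be sketched as a weighted task-selection rule. This is only an illustrative sketch, not the paper's actual algorithm: the `Task` fields, the `sincerity` parameter, and the scoring function are all hypothetical assumptions introduced here.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    own_utility: float     # benefit to the agent's own goal (assumed measure)
    shared_utility: float  # benefit to the team's shared goal (assumed measure)

def choose_task(tasks, sincerity):
    """Pick the task maximising a sincerity-weighted utility.

    sincerity in [0, 1] is a hypothetical parameter:
    0 = purely selfish (only the agent's own goal counts),
    1 = fully sincere (the shared goal is weighted equally with the own goal).
    """
    def score(t):
        return t.own_utility + sincerity * t.shared_utility
    return max(tasks, key=score)

tasks = [
    Task("finish own subtask", own_utility=5.0, shared_utility=0.0),
    Task("assist teammate",    own_utility=1.0, shared_utility=6.0),
]

selfish_choice = choose_task(tasks, sincerity=0.0)  # scores: 5.0 vs 1.0
sincere_choice = choose_task(tasks, sincerity=1.0)  # scores: 5.0 vs 7.0
```

Under this sketch, the purely selfish agent picks its own subtask, while the sincere agent assists its teammate, reflecting the conflict-mitigating role the abstract attributes to sincerity.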