The smart grid produces a tremendous volume of data traffic, with each application treated according to different quality of service (QoS) requirements. High-priority applications are allowed to transmit first, followed by medium- and low-priority applications when bandwidth is available. However, continuously granting the high-priority queue regardless of requests from the other queues causes the high-priority queue to monopolize the available bandwidth. In this paper, the universal dynamic bandwidth allocation (UDBA) algorithm is implemented in a smart grid environment to study its performance. The study is conducted via LabVIEW simulations, and the results show that the granted bandwidth improves by at least 10%, since the algorithm utilizes the excess bandwidth left unused by other queues.
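The excess-bandwidth reuse idea described above can be sketched as follows. This is a minimal illustration under assumed semantics, not the authors' actual UDBA implementation: each queue submits a request, grants proceed in priority order, and whatever a higher-priority queue leaves unused flows down to the lower-priority queues instead of being wasted. The function name `allocate` and the dict-based interface are hypothetical.

```python
# Minimal sketch (NOT the authors' UDBA implementation): grant each queue
# its request in priority order; bandwidth left over by a queue that asks
# for less than is available flows to the lower-priority queues.

def allocate(total_bw, requests):
    """requests: dict mapping queue name -> requested bandwidth,
    ordered from highest to lowest priority (hypothetical interface)."""
    grants = {}
    remaining = total_bw
    for queue, req in requests.items():
        # Grant at most what remains; any excess a queue leaves behind
        # stays in `remaining` for the queues that follow.
        grant = min(req, remaining)
        grants[queue] = grant
        remaining -= grant
    return grants, remaining

grants, leftover = allocate(100, {"high": 40, "medium": 50, "low": 30})
# high is served fully (40), medium fully (50), and low receives the
# remaining 10 units rather than those units going unused.
```

Note that a pure strict-priority scheme with no cap on the high-priority queue would starve the lower queues whenever high-priority demand equals the link capacity; a dynamic scheme such as UDBA addresses this by redistributing unused grants across queues.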