Abstract
This work proposes a TDMA jitter minimization algorithm to improve the performance of real-time applications in wireless sensor networks. The proposed algorithm can significantly reduce jitter in a sensor network that uses a TDMA-based Medium Access Control protocol. It consists of two parts. The first part is jitter minimization, which minimizes jitter at every node for dynamically arriving message streams. The second part is acceptance ratio optimization, which optimizes slot assignment and leads to fewer dropped streams. The performance of the proposed algorithm is evaluated through simulations and analyzed theoretically. The results show that the proposed algorithm outperforms existing greedy and uniform algorithms in terms of acceptance ratio and jitter.