2011 IEEE Symposium on Computers and Communications (ISCC)

Abstract

This work proposes a jitter minimization algorithm to improve the performance of real-time applications in wireless sensor networks that use a TDMA-based Medium Access Control (MAC) protocol. The proposed algorithm consists of two parts. The first part, jitter minimization, minimizes jitter at every node for dynamically arriving message streams. The second part, acceptance ratio optimization, optimizes slot assignment so that fewer streams are dropped. The performance of the proposed algorithm is evaluated through simulation and analyzed theoretically. The results show that, compared with existing greedy and uniform algorithms, the proposed algorithm achieves a higher acceptance ratio and lower jitter.
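To give a flavor of the jitter-minimization idea, the sketch below spreads a stream's TDMA slots as evenly as possible across the frame, which minimizes the variance of inter-slot gaps. This is an illustrative assumption only; the paper's actual slot-assignment and admission-control algorithm is not reproduced here, and the function names are hypothetical.

```python
# Illustrative sketch, not the paper's algorithm. Assumption: per-stream jitter
# is measured as the variance of gaps between the stream's slots in one frame,
# so an even spread of slots minimizes it.

def assign_slots(frame_len, demand):
    """Assign `demand` slots in a frame of `frame_len` slots,
    spacing them as evenly as possible across the frame."""
    ideal = frame_len / demand  # ideal spacing between consecutive slots
    return [round(i * ideal) % frame_len for i in range(demand)]

def jitter(slots, frame_len):
    """Variance of inter-slot gaps, wrapping around the frame boundary."""
    s = sorted(slots)
    gaps = [(s[(i + 1) % len(s)] - s[i]) % frame_len for i in range(len(s))]
    mean = sum(gaps) / len(gaps)
    return sum((g - mean) ** 2 for g in gaps) / len(gaps)

# Evenly spread slots yield zero gap variance; a greedy assignment of
# consecutive slots (e.g. [0, 1, 2, 3]) yields a large one.
even = assign_slots(12, 4)      # [0, 3, 6, 9]
print(jitter(even, 12))         # 0.0
print(jitter([0, 1, 2, 3], 12)) # 12.0
```

This also illustrates the comparison drawn in the abstract: for the same demand, a greedy assignment packs slots together and inflates gap variance, while even spacing drives it to zero.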
