One important component of my live chat is the queue system. If I’m not online or am too busy, web users are automatically put in a queue, and changes in their queue position are continuously reported to them on a best-effort basis.
Of course, I am only one dude today, but the live chat software is built with utility in mind. It is a complete live chat solution that can handle a huge number of users on both ends of the server, communicating in all directions. The domain model is centered around one key entity: the Conversation. A subject-based conversation is also a resource that a user can stand in line for if there’s no more room for strangers. The live chat software can be deployed as a pure help desk solution for companies to lure customers in, as a peer-to-peer chat application for friends and colleagues, or as an online chat service for strangers who want to hook up and participate in conversations based on subject. Or as any combination thereof.
One of the challenges I faced was putting things in a queue based on a multitude of different resources. To solve that problem I wrote a ConcurrentDequeManager that transparently manages deques so that client code doesn’t have to. The number of deques grows and shrinks on demand, and elements (for example, web users) can have their position automatically reported to them as it changes. Best of all, it is lock-free and superfast. Looking up the size of a deque is almost a constant-time operation. In short, client code doesn’t have to worry one bit about concurrency.
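To give a feel for the "deques that grow and shrink on demand" idea, here is a minimal sketch using only plain JDK primitives. The class and method names below are my own illustration, not the actual API of ConcurrentDequeManager, and the sketch leaves out position reporting and the trickier race where an element is added to a deque that is concurrently being removed:

```java
import java.util.Deque;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedDeque;
import java.util.concurrent.ConcurrentMap;

// Illustrative sketch only (not the library's real API): one deque per
// resource key, created lazily and removed again once it becomes empty.
public class DequeManagerSketch<K, E> {
    private final ConcurrentMap<K, Deque<E>> deques = new ConcurrentHashMap<>();

    /** Appends an element to the deque of the given key, creating the deque on demand. */
    public void addLast(K key, E element) {
        deques.computeIfAbsent(key, k -> new ConcurrentLinkedDeque<>())
              .addLast(element);
    }

    /** Removes and returns the head of the key's deque, or null if there is none. */
    public E pollFirst(K key) {
        Deque<E> deque = deques.get(key);
        if (deque == null)
            return null;
        E head = deque.pollFirst();
        // Shrink on demand: drop the mapping if the deque looks empty.
        // Simplified — a concurrent addLast between isEmpty() and remove()
        // could lose an element, which the real manager must guard against.
        if (deque.isEmpty())
            deques.remove(key, deque);
        return head;
    }

    /** Number of deques currently alive (a cheap ConcurrentHashMap lookup). */
    public int dequeCount() {
        return deques.size();
    }
}
```

The design choice worth noting is ConcurrentHashMap.computeIfAbsent: it lets many threads race to enqueue on the same resource while guaranteeing that exactly one deque is created per key, with no explicit locking in client code.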
Read more and download: https://github.com/MartinanderssonDotcom/ConcurrentDequeManager