Concurrent
Using Future and ConcurrentMap for a lazy cache
In this example I demonstrate how to use ConcurrentMap along with Future to build a lazy cache. Concurrent maps are a good choice in situations where absolute atomicity, and some control over the ordering of events, can be traded off for performance.
Let's assume for the sake of example that we have an object that is heavy to create. Further, it has to be created on a single thread because it calls into a single-threaded maths library. With this in mind we go on to design a lazy, queue-based cache, sketched below.
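A minimal sketch of that kind of cache, under stated assumptions: HeavyObject is a hypothetical placeholder for the expensive object, and a single-threaded executor stands in for the requirement that creation happens on one thread. The FutureTask/putIfAbsent idiom ensures each value is created at most once:

import java.util.concurrent.*;

// Lazy cache: each value is created at most once, and always on a single
// dedicated thread (standing in for the single-threaded maths library).
public class LazyCache {

    private final ConcurrentMap<Integer, Future<HeavyObject>> cache =
            new ConcurrentHashMap<Integer, Future<HeavyObject>>();

    // All creation work is queued onto this one thread
    private final ExecutorService creationThread = Executors.newSingleThreadExecutor();

    public HeavyObject get(final int key) throws InterruptedException, ExecutionException {
        Future<HeavyObject> future = cache.get(key);
        if (future == null) {
            FutureTask<HeavyObject> task = new FutureTask<HeavyObject>(
                    new Callable<HeavyObject>() {
                        public HeavyObject call() {
                            return new HeavyObject(key); // expensive creation
                        }
                    });
            // Only one of two racing threads wins the putIfAbsent, and only the
            // winner queues the task, so each value is created exactly once.
            Future<HeavyObject> existing = cache.putIfAbsent(key, task);
            if (existing == null) {
                creationThread.execute(task);
                future = task;
            } else {
                future = existing;
            }
        }
        return future.get(); // blocks until the creation thread has produced the value
    }

    /** Hypothetical heavyweight object. */
    public static class HeavyObject {
        HeavyObject(int key) { /* imagine a call into a single-threaded maths library */ }
    }
}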
Simple uses of CountDownLatch
CountDownLatch provides a means of waiting for a number of asynchronous events before proceeding. In order to do this one constructs a latch providing the event count. Then one thread would normally call await whilst the other thread calls countDown. Once the count reaches zero the await call returns and the latch is set. If the call to await happens after the latch is set it returns immediately.
In our example we need to wait for a thread to initialise before proceeding. We achieve this by creating a count down latch with a count of 1. Once the thread has done its work, it calls countDown on the latch. In the meantime the main thread has continued to do its longJob and then called await on the CountDownLatch instance. Calling await blocks until countDown has been called enough times (in this case once).
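A minimal sketch of that description; initialise() and longJob() are placeholders for the real work:

import java.util.concurrent.CountDownLatch;

public class CountDownLatchExample {

    public static void main(String[] args) throws InterruptedException {
        // One event to wait for: the worker thread finishing its initialisation
        final CountDownLatch initialised = new CountDownLatch(1);

        Thread worker = new Thread(new Runnable() {
            public void run() {
                initialise();            // placeholder for the worker's set-up
                initialised.countDown(); // signal that initialisation is complete
            }
        });
        worker.start();

        longJob();           // main thread gets on with its own work
        initialised.await(); // blocks until countDown has been called once
        System.out.println("Worker initialised, safe to proceed");
    }

    private static void initialise() { /* hypothetical initialisation work */ }

    private static void longJob() { /* hypothetical long-running job on the main thread */ }
}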
Using Semaphore and CyclicBarrier in Java applications
In this example I show the usage of two more concurrent classes, CyclicBarrier and Semaphore, which are both provided in the core Java library. There's a wealth of concurrency classes built directly into the core Java libraries that can really simplify multi-threaded development.
CyclicBarrier to make threads wait for alignment
In the example below, I use a cyclic barrier to make several threads wait for alignment. This is a common scenario: we have many threads and need to wait for all of them to reach the barrier before any proceed:
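A sketch along those lines; the thread count, permit count and simulated work are assumptions for illustration. Each thread briefly takes a Semaphore permit to do its work, then waits at the CyclicBarrier until all threads have arrived:

import java.util.concurrent.CyclicBarrier;
import java.util.concurrent.Semaphore;

public class BarrierExample {

    private static final int THREADS = 4;

    public static void main(String[] args) {
        // Optional barrier action runs once all parties have arrived
        final CyclicBarrier barrier = new CyclicBarrier(THREADS, new Runnable() {
            public void run() {
                System.out.println("All threads aligned");
            }
        });

        // Semaphore limiting how many threads may work on a shared resource at once
        final Semaphore permits = new Semaphore(2);

        for (int i = 0; i < THREADS; i++) {
            final int id = i;
            new Thread(new Runnable() {
                public void run() {
                    try {
                        permits.acquire();            // at most two threads work concurrently
                        System.out.println("Thread " + id + " working");
                        Thread.sleep(100 * (id + 1)); // stand-in for real work
                        permits.release();

                        barrier.await();              // wait until all threads arrive here
                        System.out.println("Thread " + id + " past the barrier");
                    } catch (Exception e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }).start();
        }
    }
}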
BlockingQueue producer consumer example
In this BlockingQueue example we show how to write a very simple producer-consumer with a blocking queue. This example generates SimpleAddition objects, each of which requires an addition of two numbers to be performed on the consumer thread. In this case the two values to be added are generated using java.util.Random's nextInt call. They are stored on the queue as a SimpleAddition transfer object and picked up for processing on the consumer thread.
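A minimal sketch of that producer-consumer; the field names, queue capacity and value range are assumptions:

import java.util.Random;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ProducerConsumerExample {

    /** Transfer object carrying the two values to be added on the consumer thread. */
    public static class SimpleAddition {
        final int left;
        final int right;

        SimpleAddition(int left, int right) {
            this.left = left;
            this.right = right;
        }
    }

    public static void main(String[] args) {
        // Bounded queue: put() blocks when full, take() blocks when empty
        final BlockingQueue<SimpleAddition> queue =
                new ArrayBlockingQueue<SimpleAddition>(10);

        Thread producer = new Thread(new Runnable() {
            public void run() {
                Random random = new Random();
                try {
                    while (true) {
                        queue.put(new SimpleAddition(random.nextInt(100), random.nextInt(100)));
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                try {
                    while (true) {
                        SimpleAddition work = queue.take();
                        System.out.println(work.left + " + " + work.right
                                + " = " + (work.left + work.right));
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        producer.start();
        consumer.start();
    }
}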
Using thread local in Java
This article assumes that you are already familiar with concurrent programming and design. Java 1.5 introduced the concurrent library java.util.concurrent, which provides an extensive set of classes for dealing with concurrency issues; these sit alongside some existing classes that have been around since earlier releases. I've noticed that some of the classes I mention here don't get used as often as they should, especially given that they are quick wins. This article only scratches the surface of the new classes available, but should provide a starting point for further reading.
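As a brief illustration of the thread local mentioned in the heading, the sketch below gives each thread its own SimpleDateFormat, a class that is not thread safe; the date-formatting use case is an assumption for illustration:

import java.text.SimpleDateFormat;
import java.util.Date;

public class ThreadLocalExample {

    // SimpleDateFormat is not thread safe, so give each thread its own instance
    private static final ThreadLocal<SimpleDateFormat> FORMAT =
            new ThreadLocal<SimpleDateFormat>() {
                @Override
                protected SimpleDateFormat initialValue() {
                    return new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
                }
            };

    public static String now() {
        // get() returns this thread's own copy, creating it on first use
        return FORMAT.get().format(new Date());
    }

    public static void main(String[] args) {
        for (int i = 0; i < 3; i++) {
            new Thread(new Runnable() {
                public void run() {
                    System.out.println(Thread.currentThread().getName() + ": " + now());
                }
            }).start();
        }
    }
}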
Concurrent Maps and CopyOnWriteArrayList
Copy-on-write lists provide a very quick win in terms of removing synchronization. I believe a significant use case is providing a thread-safe list for Java's Listener (AKA observer) pattern. Copy-on-write lists assume that the list is not updated frequently and is mainly used for reading; if this is not the case, the overhead may be worse than synchronizing.
java.util.concurrent.CopyOnWriteArrayList implements the List interface and in fact is also a random access container (it implements RandomAccess). Once you've established that copy on write is the correct way to go, you need do no more than change to using this new class, as in the sketch below. That's it, really simple.
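A minimal sketch of the listener use case; the Listener interface and EventSource class are hypothetical:

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class EventSource {

    /** Hypothetical listener interface for the observer pattern mentioned above. */
    public interface Listener {
        void onEvent(String event);
    }

    // Safe to iterate while other threads add or remove listeners; no external locking
    private final List<Listener> listeners = new CopyOnWriteArrayList<Listener>();

    public void addListener(Listener listener) {
        listeners.add(listener);    // copies the backing array (cheap if updates are rare)
    }

    public void removeListener(Listener listener) {
        listeners.remove(listener);
    }

    public void fire(String event) {
        // Iteration works over a snapshot, so concurrent modification cannot
        // throw ConcurrentModificationException
        for (Listener listener : listeners) {
            listener.onEvent(event);
        }
    }
}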