Advanced Synchronization Techniques
Multithreaded applications often use wait handles and monitor objects to synchronize multiple threads. These sections explain how to use the following .NET Framework classes when synchronizing threads: AutoResetEvent, Interlocked, ManualResetEvent, Monitor, Mutex, ReaderWriterLock, Timer, WaitHandle.
Wait Handles
Wait handles are objects that signal the status of one thread to another thread. Threads can use wait handles to notify other threads that they need exclusive access to a resource. Other threads must then wait to use this resource until the wait handle is no longer in use. Wait handles have two states, signaled and nonsignaled. A wait handle that is not owned by any thread is in the signaled state. A wait handle that is owned by a thread is in the nonsignaled state.
Threads request ownership of a wait handle by calling one of the wait methods, such as WaitOne, WaitAny, or WaitAll. The wait methods are blocking calls that are similar to the Join method of an individual thread:
If no other thread owns the wait handle, the call immediately returns True, the wait handle's status is changed to nonsignaled, and the thread that owns the wait handle continues to run.
If a thread calls one of a wait handle's wait methods while the wait handle is owned by another thread, the calling thread either waits for a specified time (if a time-out is specified) or waits indefinitely (if no time-out is specified) for the owning thread to release the wait handle. If the wait handle is released before the time-out expires, the call returns True. If the time-out expires before the wait handle is released, the call returns False, and the calling thread continues to run.
Threads that own a wait handle call the Set method when they are done or when they no longer need the wait handle. Other threads can reset the status of a wait handle to nonsignaled by either calling the Reset method or by calling WaitOne, WaitAny, or WaitAll and successfully waiting for a thread to call Set. AutoResetEvent handles are automatically reset to nonsignaled by the system after a single waiting thread has been released. If no threads are waiting, the event object's state remains signaled.
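Because the original samples in this article are Visual Basic, the following sketch instead uses Python's threading.Event as a rough analogue of a wait handle; the worker function and results list are illustrative names, not part of any .NET API. Event.wait plays the role of WaitOne, returning True when the handle is signaled before the time-out and False when the time-out expires first:

```python
import threading

data_ready = threading.Event()   # a new Event starts nonsignaled
results = []

# No thread is going to signal yet, so this timed wait expires and
# returns False, like a WaitOne call whose time-out elapses.
expired = data_ready.wait(timeout=0.05)

def worker():
    results.append(42)           # produce some data
    data_ready.set()             # move the handle to the signaled state

threading.Thread(target=worker).start()

# This wait returns True because the handle is signaled before the
# time-out; the data appended before set() is then safely visible.
signaled = data_ready.wait(timeout=5)
print(expired, signaled, results)
```

Note that threading.Event behaves like a manual-reset handle: it stays signaled until clear() (the analogue of Reset) is called.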
Mutex Objects
Mutex objects are synchronization objects that can be owned by only a single thread at a time. The name "mutex" is derived from the fact that ownership of mutex objects is mutually exclusive. Threads request ownership of the mutex object when they require exclusive access to a resource. Because only one thread can own a mutex object at any time, other threads must wait for ownership of a mutex object before using the resource.
The WaitOne method causes a calling thread to wait for ownership of a mutex object. If a thread terminates normally while owning a mutex object, the state of the mutex object is set to signaled and the next waiting thread gets ownership.
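The ownership pattern above can be sketched with Python's threading.Lock standing in for a Mutex (unlike a .NET Mutex, a Python Lock is not reentrant and is process-local); the printer-job scenario and names here are invented for illustration. acquire() plays the role of WaitOne, and release() the role of ReleaseMutex:

```python
import threading

printer_mutex = threading.Lock()   # analogue of a Mutex: one owner at a time
log = []

def print_job(name):
    printer_mutex.acquire()        # like Mutex.WaitOne: block until ownership
    try:
        log.append(f"{name} start")
        log.append(f"{name} end")  # no other job's lines can interleave here
    finally:
        printer_mutex.release()    # like Mutex.ReleaseMutex

threads = [threading.Thread(target=print_job, args=(f"job{i}",)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because each job holds the lock for both of its appends, the log always contains the start/end lines of each job as an adjacent pair.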
Synchronization Events
Synchronization events notify other threads that something has occurred or that a resource is available. Despite the term including the word "event," synchronization events are unlike other Visual Basic events—they are really wait handles. Like other wait handles, synchronization events have two states, signaled and nonsignaled.
Threads that call one of the wait methods of a synchronization event must wait until another thread signals the event by calling the Set method. There are two synchronization event classes: ManualResetEvent and AutoResetEvent.
Threads set the status of ManualResetEvent instances to signaled using the Set method and back to nonsignaled using the Reset method. A ManualResetEvent remains signaled, releasing any number of waiting threads, until a thread explicitly calls Reset; control returning from a WaitOne call does not change its state.
Instances of the AutoResetEvent class can also be set to signaled using Set, but they automatically return to nonsignaled as soon as a waiting thread is notified that the event has become signaled.
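Python has no direct AutoResetEvent, but a counting semaphore initialized to zero approximates one: each release wakes exactly one waiter, after which the "event" is effectively nonsignaled again. The waiter names and the 0.2-second settling delay below are illustrative choices, not from the original:

```python
import threading
import time

auto = threading.Semaphore(0)   # AutoResetEvent analogue: starts nonsignaled
woken = []

def waiter(name):
    auto.acquire()              # like WaitOne: blocks until the event is signaled
    woken.append(name)

t1 = threading.Thread(target=waiter, args=("first",))
t2 = threading.Thread(target=waiter, args=("second",))
t1.start()
t2.start()

auto.release()                  # like Set: releases exactly one waiting thread
time.sleep(0.2)                 # give the released thread time to run
after_one_set = len(woken)      # only one of the two waiters has been released

auto.release()                  # a second Set releases the remaining waiter
t1.join()
t2.join()
print(after_one_set, len(woken))
```

By contrast, a threading.Event (the ManualResetEvent analogue) would release both waiters on a single set() and stay signaled until clear().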
Monitor Objects and SyncLock
Monitor objects are used to ensure that a block of code runs without being interrupted by code running on other threads. In other words, code in other threads cannot run until code in the synchronized code block has finished.
Suppose, for example, that you have a program that repeatedly and asynchronously reads data and displays the results. With operating systems that use preemptive multitasking, a running thread can be interrupted by the operating system to allow time for some other thread to run. Without synchronization, it is possible that you could get a partially updated view of the data if the object that represents the data is modified by another thread while the data is being displayed. Monitor objects guarantee that a section of code will run without being interrupted. Visual Basic provides the SyncLock and End SyncLock statements to simplify access to monitor objects. Visual C# uses the lock keyword in the same way.
Access to an object is locked out only if the accessing code is contained within a SyncLock block on the same object instance.
For information on the SyncLock statement, see SyncLock Statement.
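The per-instance locking described above can be sketched in Python, with a reentrant threading.RLock standing in for an object's monitor (Monitor, like RLock, allows the owning thread to re-enter); the DataView class and its methods are invented for illustration:

```python
import threading

class DataView:
    """Illustrative class: 'with self._monitor:' plays the role of
    SyncLock Me ... End SyncLock on this particular instance."""
    def __init__(self):
        self._monitor = threading.RLock()  # per-instance monitor analogue
        self.values = []

    def update(self, v):
        with self._monitor:       # no other thread may enter a block locked
            self.values.append(v) # on this same instance until we leave
            return self.display() # reentrant: same thread may re-acquire

    def display(self):
        with self._monitor:
            return list(self.values)

view_a, view_b = DataView(), DataView()
# Locks are per instance: a thread holding view_a's monitor does not
# block threads working with view_b.
view_a.update(1)
print(view_a.display(), view_b.display())
```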
Interlocked Class
You can use the methods of the Interlocked class to prevent problems that can occur when multiple threads attempt to simultaneously update or compare the same value. The methods of this class let you safely increment, decrement, exchange, and compare values from any thread. The following example shows how to use the Increment method to increment a variable that is shared by procedures running on separate threads.
Sub ThreadA(ByRef IntA As Integer)
    System.Threading.Interlocked.Increment(IntA)
End Sub

Sub ThreadB(ByRef IntA As Integer)
    System.Threading.Interlocked.Increment(IntA)
End Sub
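Python has no Interlocked class, but the same atomic-update idea can be emulated with a lock-guarded helper; the InterlockedBox class below is a toy sketch (Python integers are immutable, so ByRef is simulated with a one-element list), not a real library API:

```python
import threading

class InterlockedBox:
    """Toy analogue of System.Threading.Interlocked for a one-slot 'box'."""
    _lock = threading.Lock()

    @staticmethod
    def increment(box):
        with InterlockedBox._lock:    # the lock makes read-modify-write atomic
            box[0] += 1
            return box[0]

    @staticmethod
    def compare_exchange(box, value, comparand):
        with InterlockedBox._lock:    # store value only if box holds comparand
            original = box[0]
            if original == comparand:
                box[0] = value
            return original           # always returns the original value

shared = [0]

def bump():
    for _ in range(100):
        InterlockedBox.increment(shared)

threads = [threading.Thread(target=bump) for _ in range(50)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# 50 threads x 100 increments; no updates are lost to interleaving.
swapped_from = InterlockedBox.compare_exchange(shared, 0, 5000)
print(shared[0], swapped_from)
```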
ReaderWriter Locks
In some cases, you may want to lock a resource only when data is being written and permit multiple clients to simultaneously read data when data is not being updated. The ReaderWriterLock class enforces exclusive access to a resource while a thread is modifying the resource, but it allows non-exclusive access when reading the resource. ReaderWriter locks are a useful alternative to exclusive locks, which cause other threads to wait, even when those threads do not need to update data.
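Python's standard library has no reader-writer lock, but the shared-read/exclusive-write policy can be sketched with a Condition; this minimal class is an illustrative analogue of ReaderWriterLock (no fairness, no upgrade support), not a production implementation:

```python
import threading

class ReaderWriterLockSketch:
    """Minimal reader-writer lock: many readers OR one writer at a time."""
    def __init__(self):
        self._cond = threading.Condition()
        self._readers = 0
        self._writing = False

    def acquire_reader(self):
        with self._cond:
            while self._writing:          # readers wait only for a writer
                self._cond.wait()
            self._readers += 1

    def release_reader(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:
                self._cond.notify_all()   # a waiting writer may now proceed

    def acquire_writer(self):
        with self._cond:
            while self._writing or self._readers:  # writers need the lock alone
                self._cond.wait()
            self._writing = True

    def release_writer(self):
        with self._cond:
            self._writing = False
            self._cond.notify_all()       # wake both readers and writers

rw = ReaderWriterLockSketch()
rw.acquire_reader()
rw.acquire_reader()     # multiple readers may hold the lock at once
rw.release_reader()
rw.release_reader()
rw.acquire_writer()     # a writer waits until no readers remain
rw.release_writer()
```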
Deadlocks
Thread synchronization is invaluable in multithreaded applications, but there is always the danger of creating a deadlock, where multiple threads wait for each other and the application comes to a halt. A deadlock is analogous to cars stopped at a four-way stop, each driver waiting for another to go first. Avoiding deadlocks is important; the key is careful planning. You can often predict deadlock situations by diagramming multithreaded applications before you start coding.
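One common planning rule that prevents deadlock is to acquire multiple locks in a single global order, so no two threads can hold them in opposite orders. The account-transfer scenario below is an invented Python illustration of that rule; ordering by id() is one arbitrary but consistent choice:

```python
import threading

accounts = {
    "a": {"balance": 100, "lock": threading.Lock()},
    "b": {"balance": 100, "lock": threading.Lock()},
}

def transfer(src, dst, amount):
    # Sort the two locks into one global order before acquiring them.
    # Without this, a->b and b->a transfers could each hold one lock
    # and wait forever for the other: a deadlock.
    first, second = sorted((accounts[src]["lock"], accounts[dst]["lock"]), key=id)
    with first:
        with second:
            accounts[src]["balance"] -= amount
            accounts[dst]["balance"] += amount

threads = []
for _ in range(100):
    threads.append(threading.Thread(target=transfer, args=("a", "b", 1)))
    threads.append(threading.Thread(target=transfer, args=("b", "a", 1)))
for t in threads:
    t.start()
for t in threads:
    t.join()    # completes: consistent lock ordering rules out deadlock
print(accounts["a"]["balance"], accounts["b"]["balance"])
```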