# Data Parallelism (Task Parallel Library)

Data parallelism refers to scenarios in which the same operation is performed concurrently (that is, in parallel) on elements in a source collection or array. In data parallel operations, the source collection is partitioned so that multiple threads can operate on different segments concurrently.

The Task Parallel Library (TPL) supports data parallelism through the System.Threading.Tasks.Parallel class. This class provides method-based parallel implementations of for and foreach loops (For and For Each in Visual Basic). You write the loop logic for a Parallel.For or Parallel.ForEach loop much as you would write a sequential loop. You do not have to create threads or queue work items. In basic loops, you do not have to take locks. The TPL handles all the low-level work for you. For in-depth information about the use of Parallel.For and Parallel.ForEach, download the document Patterns for Parallel Programming: Understanding and Applying Parallel Patterns with the .NET Framework 4. The following code example shows a simple foreach loop and its parallel equivalent.

> **Note**
> This documentation uses lambda expressions to define delegates in TPL. If you are not familiar with lambda expressions in C# or Visual Basic, see Lambda Expressions in PLINQ and TPL.

```csharp
// Sequential version
foreach (var item in sourceCollection)
{
    Process(item);
}

// Parallel equivalent
Parallel.ForEach(sourceCollection, item => Process(item));
```

```vb
' Sequential version
For Each item In sourceCollection
    Process(item)
Next

' Parallel equivalent
Parallel.ForEach(sourceCollection, Sub(item) Process(item))
```


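The `Parallel.For` method works the same way over an index range. The following sketch (the array and the per-index work are illustrative) fills an array in parallel; because each index is visited exactly once and writes only its own slot, the body needs no locks:

```csharp
using System;
using System.Threading.Tasks;

double[] results = new double[1000];

// Parallel.For iterates an index range. Indices run in no guaranteed
// order, potentially on different threads, but each index runs once.
Parallel.For(0, results.Length, i =>
{
    results[i] = Math.Sqrt(i); // independent per-index work, safe without locks
});

Console.WriteLine(results[999]); // each slot has been written exactly once
```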
When a parallel loop runs, the TPL partitions the data source so that the loop can operate on multiple parts concurrently. Behind the scenes, the Task Scheduler partitions the task based on system resources and workload. When possible, the scheduler redistributes work among multiple threads and processors if the workload becomes unbalanced.

> **Note**
> You can also supply your own custom partitioner or scheduler. For more information, see Custom Partitioners for PLINQ and TPL and Task Schedulers.
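As a brief illustration of supplying a partitioner, the `System.Collections.Concurrent.Partitioner.Create` overload that takes a range produces contiguous chunks, so each worker processes a block of indices instead of taking elements one at a time (the array and workload here are illustrative):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

double[] data = new double[10000];

// Range partitioning: each partition is a contiguous [start, end) chunk,
// which reduces per-element overhead for small loop bodies.
var rangePartitioner = Partitioner.Create(0, data.Length);

Parallel.ForEach(rangePartitioner, range =>
{
    // range is a Tuple<int, int>: inclusive start, exclusive end.
    for (int i = range.Item1; i < range.Item2; i++)
    {
        data[i] = i * 0.5;
    }
});

Console.WriteLine(data[9999]);
```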

Both the Parallel.For and Parallel.ForEach methods have several overloads that let you stop or break loop execution, monitor the state of the loop on other threads, maintain thread-local state, finalize thread-local objects, control the degree of concurrency, and so on. The helper types that enable this functionality include ParallelLoopState, ParallelOptions, ParallelLoopResult, CancellationToken, and CancellationTokenSource.
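For example, the `Parallel.For` overload with `localInit` and `localFinally` delegates maintains thread-local state: each worker accumulates a private subtotal without locking, and the subtotals are merged exactly once per task at the end. A minimal sketch (summing the integers 0 through 999):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

long total = 0;

Parallel.For<long>(0, 1000,
    () => 0L,                                  // localInit: per-task subtotal
    (i, loopState, subtotal) => subtotal + i,  // body: no locks needed
    subtotal => Interlocked.Add(ref total, subtotal)); // localFinally: merge once per task

Console.WriteLine(total); // 499500
```

The `loopState` parameter is a `ParallelLoopState`; calling its `Stop` or `Break` method inside the body ends the loop early.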

Data parallelism with declarative, or query-like, syntax is supported by PLINQ. For more information, see Parallel LINQ (PLINQ).
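For comparison, a PLINQ query opts into parallel execution with a single `AsParallel` call and otherwise reads like ordinary LINQ (the data and query here are illustrative):

```csharp
using System;
using System.Linq;

int[] source = Enumerable.Range(1, 100).ToArray();

// AsParallel partitions the source and runs the query on multiple threads.
int evenSquareSum = source.AsParallel()
                          .Where(n => n % 2 == 0)
                          .Select(n => n * n)
                          .Sum();

Console.WriteLine(evenSquareSum); // 171700
```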

| Title | Description |
|-------|-------------|
| How to: Write a Simple Parallel.For Loop | Describes how to write a For loop over any array or indexable IEnumerable&lt;T&gt; source collection. |
| How to: Write a Simple Parallel.ForEach Loop | Describes how to write a ForEach loop over any IEnumerable&lt;T&gt; source collection. |
| How to: Stop or Break from a Parallel.For Loop | Describes how to stop or break from a parallel loop so that all threads are informed of the action. |
| How to: Write a Parallel.For Loop with Thread-Local Variables | Describes how to write a For loop in which each thread maintains a private variable that is not visible to any other threads, and how to synchronize the results from all threads when the loop completes. |
| How to: Write a Parallel.ForEach Loop with Partition-Local Variables | Describes how to write a ForEach loop in which each thread maintains a private variable that is not visible to any other threads, and how to synchronize the results from all threads when the loop completes. |
| How to: Cancel a Parallel.For or ForEach Loop | Describes how to cancel a parallel loop by using a System.Threading.CancellationToken. |
| How to: Speed Up Small Loop Bodies | Describes one way to speed up execution when a loop body is very small. |
| Task Parallel Library (TPL) | Provides an overview of the Task Parallel Library. |
| Parallel Programming | Introduces Parallel Programming in the .NET Framework. |