1. Reasons for using threads
1. Threads can be used to isolate code from other code, improving the reliability of an application.
2. Threads can be used to simplify coding.
3. Threads can be used to achieve concurrent execution.
2. Basic knowledge
1. Process and thread: A process is the basic unit in which the operating system runs a program, and it owns the application's resources. A process contains one or more threads; threads share the process's resources and do not own resources of their own.
2. Foreground and background threads: Threads created with the Thread class are foreground threads by default. Once all foreground threads have exited, any remaining background threads are terminated immediately, without an exception being thrown.
3. Suspend and Resume: Because thread scheduling and program execution order are unpredictable, suspending and resuming threads easily leads to deadlock; use these methods as little as possible in real applications.
4. Blocking a thread: Join blocks the calling thread until the target thread terminates (a sketch of Join, Interrupt, and priority follows this list).
5. Terminating a thread: Abort throws a ThreadAbortException in the target thread to terminate it; an aborted thread cannot be resumed. Interrupt throws a ThreadInterruptedException in a blocked thread; the thread can catch the exception and continue executing.
6. Thread priority: Highest, AboveNormal, Normal, BelowNormal, Lowest; the default is Normal.
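Below is a minimal sketch of Join, Interrupt, and thread priority in a console program (the thread bodies and names are illustrative, not from the original text):

using System;
using System.Threading;

class JoinInterruptDemo
{
    static void Main()
    {
        // Join: wait for a worker thread to finish before continuing
        Thread worker = new Thread(() => Thread.Sleep(500));
        worker.Priority = ThreadPriority.BelowNormal; // the default would be Normal
        worker.Start();
        worker.Join(); // blocks the calling thread until 'worker' terminates

        // Interrupt: wakes a blocked thread by throwing ThreadInterruptedException in it
        Thread sleeper = new Thread(() =>
        {
            try
            {
                Thread.Sleep(Timeout.Infinite); // block indefinitely
            }
            catch (ThreadInterruptedException)
            {
                Console.WriteLine("Interrupted; catching the exception lets the thread continue.");
            }
        });
        sleeper.Start();
        sleeper.Interrupt();
        sleeper.Join();
    }
}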
3. The use of threads
A thread function is passed in as a delegate. It can take no parameter or a single object parameter; to pass several values, wrap them in a class or struct.
using System;
using System.Threading;

namespace Test
{
    class Program
    {
        static void Main(string[] args)
        {
            Thread t1 = new Thread(new ThreadStart(TestMethod));
            Thread t2 = new Thread(new ParameterizedThreadStart(TestMethod));
            t1.IsBackground = true;
            t2.IsBackground = true;
            t1.Start();
            t2.Start("hello");
            Console.ReadKey();
        }

        public static void TestMethod()
        {
            Console.WriteLine("Thread function without a parameter");
        }

        public static void TestMethod(object data)
        {
            string datastr = data as string;
            Console.WriteLine("Thread function with a parameter, the parameter is: {0}", datastr);
        }
    }
}
4. Thread pool
Creating and destroying threads has a certain cost, so using too many threads wastes memory resources. The thread pool was introduced for performance reasons: it maintains a request queue, extracts tasks from the queue, and hands them to threads in the pool for execution. A pool thread is not destroyed immediately after its task finishes, so tasks can run in the background while the overhead of creating and destroying threads is reduced.
Thread pool threads are background threads by default (IsBackground = true).
using System;
using System.Threading;

namespace Test
{
    class Program
    {
        static void Main(string[] args)
        {
            // Queue a work item to the thread pool; a single parameter can be passed here
            ThreadPool.QueueUserWorkItem(TestMethod, "Hello");
            Console.ReadKey();
        }

        public static void TestMethod(object data)
        {
            string datastr = data as string;
            Console.WriteLine(datastr);
        }
    }
}
5. Task class
Starting an asynchronous operation with ThreadPool.QueueUserWorkItem() is very simple, but the biggest problems with this approach are that there is no built-in mechanism to let you know when the operation has completed and no built-in mechanism to obtain a return value afterwards. For that, you can use the Task class from System.Threading.Tasks.
Construct a Task
using System;
using System.Threading.Tasks;

namespace Test
{
    class Program
    {
        static void Main(string[] args)
        {
            Task<Int32> t = new Task<Int32>(n => Sum((Int32)n), 1000);
            t.Start();
            t.Wait();
            Console.WriteLine(t.Result);
            Console.ReadKey();
        }

        private static Int32 Sum(Int32 n)
        {
            Int32 sum = 0;
            for (; n > 0; --n)
                checked { sum += n; } // if the result is too large, an OverflowException is thrown
            return sum;
        }
    }
}
Automatically starting a new task when one task completes
After one task finishes, it can start another. The previous code is rewritten below so that no thread is blocked while waiting.
using System;
using System.Threading.Tasks;

namespace Test
{
    class Program
    {
        static void Main(string[] args)
        {
            Task<Int32> t = new Task<Int32>(n => Sum((Int32)n), 1000);
            t.Start();
            //t.Wait();
            Task cwt = t.ContinueWith(task => Console.WriteLine("The result is {0}", task.Result));
            Console.ReadKey();
        }

        private static Int32 Sum(Int32 n)
        {
            Int32 sum = 0;
            for (; n > 0; --n)
                checked { sum += n; } // if the result overflows, an exception is thrown
            return sum;
        }
    }
}
6. Asynchronous execution of delegates
Asynchronous invocation of a delegate: BeginInvoke() and EndInvoke().
using System;

namespace Test
{
    public delegate string MyDelegate(object data);

    class Program
    {
        static void Main(string[] args)
        {
            MyDelegate mydelegate = new MyDelegate(TestMethod);
            IAsyncResult result = mydelegate.BeginInvoke("Thread Param", TestCallback, "Callback Param");

            // EndInvoke blocks until the asynchronous call completes and returns its result
            string resultstr = mydelegate.EndInvoke(result);
            Console.WriteLine(resultstr);
            Console.ReadKey();
        }

        // thread (worker) function
        public static string TestMethod(object data)
        {
            string datastr = data as string;
            return datastr;
        }

        // asynchronous callback
        public static void TestCallback(IAsyncResult data)
        {
            Console.WriteLine(data.AsyncState);
        }
    }
}
7. Thread synchronization
1) Atomic operations (Interlocked): all of its methods perform an atomic read and write of a variable as a single operation.
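A minimal sketch (the counter, task count, and iteration count are illustrative): Interlocked.Increment keeps a shared counter correct under concurrent updates.

using System;
using System.Threading;
using System.Threading.Tasks;

class InterlockedDemo
{
    static int counter = 0; // shared counter

    static void Main()
    {
        // ten tasks each increment the counter 1000 times
        Task[] tasks = new Task[10];
        for (int i = 0; i < tasks.Length; i++)
        {
            tasks[i] = Task.Run(() =>
            {
                for (int j = 0; j < 1000; j++)
                    Interlocked.Increment(ref counter); // atomic read-modify-write
            });
        }
        Task.WaitAll(tasks);
        Console.WriteLine(counter); // always 10000
    }
}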
2) lock statement: avoid locking on a public type or a publicly accessible instance, because code outside your control could also take that lock; define a private object and lock on it instead.
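For example, a minimal sketch of locking on a private object (the Account class and field names are illustrative):

class Account
{
    // private object used only for locking; outside code cannot lock on it
    private readonly object balanceLock = new object();
    private decimal balance;

    public void Deposit(decimal amount)
    {
        lock (balanceLock) // avoid lock(this), lock(typeof(...)), or locking a public field
        {
            balance += amount;
        }
    }
}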
3) Monitor implements thread synchronization
Monitor.Enter() and Monitor.Exit() acquire and release an exclusive lock. While the lock is held, the resource is exclusive to that thread and no other thread may access it.
There is also a TryEnter method, which does not block when the lock cannot be acquired; you can specify a timeout, and it returns false if the lock is not obtained.
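A small sketch of both patterns (the class, field, and method names are illustrative):

using System;
using System.Threading;

class MonitorDemo
{
    private static readonly object sync = new object();
    private static int value;

    static void Increment()
    {
        Monitor.Enter(sync);        // acquire the exclusive lock
        try
        {
            value++;
        }
        finally
        {
            Monitor.Exit(sync);     // always release, even if an exception is thrown
        }
    }

    static bool TryIncrement()
    {
        // wait at most 100 ms; return false if the lock cannot be obtained
        if (Monitor.TryEnter(sync, TimeSpan.FromMilliseconds(100)))
        {
            try { value++; return true; }
            finally { Monitor.Exit(sync); }
        }
        return false;
    }
}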
4) ReaderWriterLock
When a resource is read frequently but written rarely, a reader-writer lock improves utilization: the read lock is shared, so multiple threads can read the resource concurrently, while the write lock is exclusive and allows only one thread to write at a time.
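A sketch using ReaderWriterLockSlim, the lighter-weight successor to ReaderWriterLock (the cache class and names are illustrative):

using System.Collections.Generic;
using System.Threading;

class Cache
{
    private static readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();
    private static readonly Dictionary<string, string> data = new Dictionary<string, string>();

    public static string Read(string key)
    {
        rwLock.EnterReadLock();       // shared lock: many readers may hold it at once
        try
        {
            data.TryGetValue(key, out string value);
            return value;
        }
        finally { rwLock.ExitReadLock(); }
    }

    public static void Write(string key, string value)
    {
        rwLock.EnterWriteLock();      // exclusive lock: only one writer at a time
        try { data[key] = value; }
        finally { rwLock.ExitWriteLock(); }
    }
}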
5) Event class implements synchronization
An event has two states, signaled and non-signaled. Calling WaitOne succeeds immediately when the event is in the signaled state; Set moves the event into the signaled state (and Reset back into the non-signaled state).
1) AutoResetEvent (automatic reset event)
2) ManualResetEvent (manual reset event)
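A minimal sketch with AutoResetEvent (the thread body and names are illustrative); ManualResetEvent is used the same way, except that once Set is called it stays signaled until Reset:

using System;
using System.Threading;

class EventDemo
{
    // created in the non-signaled state
    private static readonly AutoResetEvent signal = new AutoResetEvent(false);

    static void Main()
    {
        Thread worker = new Thread(() =>
        {
            Console.WriteLine("Waiting for the signal...");
            signal.WaitOne();   // blocks until the event becomes signaled
            Console.WriteLine("Signal received, continuing.");
        });
        worker.Start();

        Thread.Sleep(500);
        signal.Set();           // signal; an AutoResetEvent resets automatically after releasing one waiter
        worker.Join();
    }
}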
6) Semaphore (Semaphore)
The semaphore is an Int32 counter maintained by the kernel object. When the count is 0, threads that wait on the semaphore block; when it is greater than 0, a waiting thread is released and the count is decremented by 1.
A thread decrements the count by 1 with WaitOne and increments it by 1 with Release; usage is very simple.
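A sketch that limits concurrent access to two threads at a time (the counts, sleep times, and names are illustrative):

using System;
using System.Threading;

class SemaphoreDemo
{
    // at most 2 threads may hold the semaphore at once
    private static readonly Semaphore pool = new Semaphore(2, 2);

    static void Main()
    {
        for (int i = 1; i <= 5; i++)
        {
            int id = i;
            new Thread(() =>
            {
                pool.WaitOne();    // count - 1; blocks while the count is 0
                Console.WriteLine("Thread {0} entered", id);
                Thread.Sleep(200);
                Console.WriteLine("Thread {0} leaving", id);
                pool.Release();    // count + 1
            }).Start();
        }
        Console.ReadKey();
    }
}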
7) Mutex (Mutex)
A mutex grants exclusive access to a resource; its usage is similar to Semaphore.
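A short sketch of a Mutex guarding a critical section (names are illustrative):

using System;
using System.Threading;

class MutexDemo
{
    private static readonly Mutex mutex = new Mutex();

    static void UseResource()
    {
        mutex.WaitOne();              // acquire exclusive ownership
        try
        {
            Console.WriteLine("Thread {0} is using the resource", Thread.CurrentThread.ManagedThreadId);
        }
        finally
        {
            mutex.ReleaseMutex();     // release so another thread can enter
        }
    }
}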
8) Cross-process synchronization
System-wide synchronization can be achieved by giving a synchronization object a name; different applications refer to the same kernel object through that name.
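As a sketch, a named Mutex can enforce a single instance across processes (the name "Global\MyAppSingleInstance" is an illustrative choice):

using System;
using System.Threading;

class SingleInstanceDemo
{
    static void Main()
    {
        bool createdNew;
        // every process that opens a mutex with this name refers to the same kernel object
        using (Mutex mutex = new Mutex(true, "Global\\MyAppSingleInstance", out createdNew))
        {
            if (!createdNew)
            {
                Console.WriteLine("Another instance is already running.");
                return;
            }
            Console.WriteLine("Running as the only instance. Press any key to exit.");
            Console.ReadKey();
            mutex.ReleaseMutex();
        }
    }
}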