Sunday 7 July 2013

Multi Threading in C#

Multithreading is managed internally by a thread scheduler, a function the CLR typically delegates to the operating system. A thread scheduler ensures all active threads are allocated appropriate execution time, and that threads that are waiting or blocked (for instance, on an exclusive lock or on user input) do not consume CPU time.

On a single-processor computer, a thread scheduler performs time slicing: rapidly switching execution between each of the active threads. Under Windows, a time slice is typically in the tens-of-milliseconds region, much larger than the CPU overhead of actually switching context between one thread and another (which is typically in the few-microseconds region).

On a multi-processor computer, multithreading is implemented with a mixture of time-slicing and genuine concurrency, where different threads run code simultaneously on different CPUs. It's almost certain there will still be some time-slicing, because of the operating system's need to service its own threads as well as those of other applications.
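To see the scheduler at work, here is a minimal sketch (the class and method names are only illustrative) that starts a worker thread alongside the main thread; on a single processor their output is interleaved by time-slicing, while on multiple processors they may genuinely run at the same time:

    using System;
    using System.Threading;

    class SchedulerDemo
    {
        static void Main()
        {
            // The worker thread and the main thread both print in a loop.
            Thread worker = new Thread(() => PrintLoop("Worker"));
            worker.Start();

            PrintLoop("Main");
            worker.Join();      // wait for the worker before exiting
        }

        static void PrintLoop(string name)
        {
            for (int i = 0; i < 5; i++)
                Console.WriteLine("{0} : iteration {1}", name, i);
        }
    }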


A multithreaded application allows you to run several threads, each running concurrently within the same process. So, theoretically, you can run step 1 in one thread and at the same time run step 2 in another thread. At the same time you could run step 3 in its own thread, and even step 4 in its own thread. Hence steps 1, 2, 3, and 4 would run concurrently. Theoretically, if all four steps took about the same time, you could finish your program in a quarter of the time.
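A minimal sketch of that idea, where Step1 to Step4 are just hypothetical placeholders for the real work:

    using System;
    using System.Threading;

    class FourStepsDemo
    {
        // Step1..Step4 stand in for the four independent pieces of work.
        static void Step1() { Console.WriteLine("Step 1 done"); }
        static void Step2() { Console.WriteLine("Step 2 done"); }
        static void Step3() { Console.WriteLine("Step 3 done"); }
        static void Step4() { Console.WriteLine("Step 4 done"); }

        static void Main()
        {
            // Run each step on its own thread so all four proceed concurrently.
            Thread[] threads =
            {
                new Thread(Step1),
                new Thread(Step2),
                new Thread(Step3),
                new Thread(Step4)
            };

            foreach (Thread t in threads) t.Start();
            foreach (Thread t in threads) t.Join();   // wait for every step to finish

            Console.WriteLine("All steps finished");
        }
    }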

Let's look at a real-world example:

Four persons are staying in the same house. They want to go on an outing, so they decide to do their work in parallel and finish together.



In this example 3, 4, 5, and 6 are the persons' names (i.e. their thread IDs). They all start their work at the same time, in parallel (start time: 09:59:54.46293557), and finish together.

Each person has two pieces of work to finish. Each of them waits for all the others to finish their tasks, and then they all go on the outing.


    using System;
    using System.Threading;

    class Program
    {
        static void Main(string[] args)
        {
            int persons = 4;
            ManualResetEvent resetEvent = new ManualResetEvent(false);
            int toProcess = persons;

            for (int i = 0; i < persons; i++)
            {
                new Thread(delegate()
                {
                    // Print the start time and this person's thread id.
                    Console.WriteLine("\nTime {1}, Person : {0}",
                        Thread.CurrentThread.ManagedThreadId, DateTime.Now.TimeOfDay);
                    Console.WriteLine();

                    // Each person has two pieces of work to process.
                    for (int a = 0; a < 2; a++)
                        Console.WriteLine(" Person {0} : Work {1} Processed",
                            Thread.CurrentThread.ManagedThreadId, a + 1);

                    // If we're the last thread to finish, signal the event.
                    if (Interlocked.Decrement(ref toProcess) == 0)
                        resetEvent.Set();

                }).Start();
            }

            // Wait for all workers to finish.
            resetEvent.WaitOne();
            Console.WriteLine();
            Console.WriteLine("All persons finished their work.");
            Console.Read();
        }
    }

In the above example, Interlocked.Decrement(ref toProcess) atomically decrements the shared toProcess variable by one each time a person finishes their work. When the last person finishes, toProcess reaches zero, the check == 0 succeeds, and that thread signals the event.

Once all the threads have been started, resetEvent.WaitOne() blocks the main thread until the event is signalled, that is, until every thread has finished its work.
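As a side note, the same wait-for-all-workers pattern can also be written with CountdownEvent (available from .NET 4 onwards), which rolls the shared counter and the event into a single object. Here is a minimal sketch along the same lines (the class name CountdownDemo is just for illustration):

    using System;
    using System.Threading;

    class CountdownDemo
    {
        static void Main()
        {
            int persons = 4;

            // The countdown starts at 4 and is signalled once per finished person.
            using (CountdownEvent countdown = new CountdownEvent(persons))
            {
                for (int i = 0; i < persons; i++)
                {
                    new Thread(() =>
                    {
                        Console.WriteLine("Person {0} finished",
                            Thread.CurrentThread.ManagedThreadId);
                        countdown.Signal();   // decrement the internal counter
                    }).Start();
                }

                countdown.Wait();             // blocks until the counter reaches zero
            }

            Console.WriteLine("All persons finished their work.");
        }
    }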

This example clearly illustrates multithreading in C#.


