Java Concurrent Programming — deep understanding of thread pool

Time: 2021-1-12

1. Thread pool

A thread pool is a resource pool that manages a set of homogeneous worker threads.

1.1 benefits of using a thread pool

  • Reduce resource consumption. By reusing already-created threads, the overhead of thread creation and destruction is reduced.
  • Improve response speed. When a task arrives, it can be executed immediately without waiting for a thread to be created.
  • Improve thread manageability. Threads are a scarce resource; creating them without limit not only consumes system resources but also reduces system stability. A thread pool allows unified allocation, tuning, and monitoring.

1.2 executor framework

First, look at the source code of Executor. Executor is the core base interface of the framework, and it declares only one method, execute:

public interface Executor {

    /**
     * Executes the given command at some time in the future.  The command
     * may execute in a new thread, in a pooled thread, or in the calling
     * thread, at the discretion of the {@code Executor} implementation.
     *
     * @param command the runnable task
     * @throws RejectedExecutionException if this task cannot be
     * accepted for execution
     * @throws NullPointerException if command is null
     */
    void execute(Runnable command);
}
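
As the Javadoc notes, the implementation decides whether the command runs in a new thread, a pooled thread, or the calling thread. A minimal sketch of the last case, using a hypothetical DirectExecutor class that simply runs every command in the calling thread:

import java.util.concurrent.Executor;

// A hypothetical Executor (not part of the JDK) that runs each command
// directly in the calling thread, which the contract above permits.
class DirectExecutor implements Executor {
    @Override
    public void execute(Runnable command) {
        command.run();
    }
}

public class DirectExecutorDemo {
    public static void main(String[] args) {
        // The caller depends only on the Executor interface;
        // how the command is executed is up to the implementation.
        Executor executor = new DirectExecutor();
        executor.execute(() ->
                System.out.println("running in " + Thread.currentThread().getName()));
    }
}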

1.3 three parts of executor framework

  • Tasks (Runnable / Callable)

    A task implements either the Runnable interface or the Callable interface.

  • Task execution (Executor)

    The core interface of the task-execution mechanism is Executor. The ExecutorService interface extends Executor, and the two key classes ThreadPoolExecutor and ScheduledThreadPoolExecutor implement the ExecutorService interface.

Many class relationships are involved here, but in practice the class to focus on is ThreadPoolExecutor, since it is the class used most frequently when working with thread pools.


  • Results of asynchronous computation (Future)

    The Future interface and its implementation class FutureTask represent the result of an asynchronous computation.

    When we hand an implementation of the Runnable or Callable interface to a ThreadPoolExecutor or ScheduledThreadPoolExecutor and call the submit() method, a FutureTask object is returned.
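
As a small sketch (the class name and result value are illustrative), a FutureTask can also be created and run directly, since it implements both Runnable and Future:

import java.util.concurrent.FutureTask;

public class FutureTaskDemo {
    public static void main(String[] args) throws Exception {
        // FutureTask wraps a Callable; as a Runnable it can be run by a thread,
        // and as a Future it can be queried for the result.
        FutureTask<String> futureTask = new FutureTask<>(() -> "computed result");
        new Thread(futureTask).start();
        System.out.println(futureTask.get()); // blocks until the Callable finishes
    }
}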

1.4 use of executor framework


  1. The main thread first creates a task object that implements the Runnable or Callable interface.
  2. The created Runnable/Callable object is handed directly to an ExecutorService for execution via ExecutorService.execute(Runnable command), or it is submitted via ExecutorService.submit(Runnable task) or ExecutorService.submit(Callable<T> task) (see the sketch after this list).
  3. If ExecutorService.submit(…) is called, the ExecutorService returns an object implementing the Future interface (this is the difference between the execute() and submit() methods mentioned above: submit() returns a FutureTask object).
  4. The main thread can call FutureTask.get() to wait for the task to complete.

    The main thread can also call FutureTask.cancel(boolean mayInterruptIfRunning) to cancel the execution of the task.
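
A minimal sketch of these four steps, assuming a small fixed-size pool and a trivial Callable (the class name, pool size, and task below are illustrative):

import java.util.concurrent.*;

public class ExecutorFrameworkDemo {
    public static void main(String[] args) throws Exception {
        // Step 2: an ExecutorService implementation that will run the task
        ExecutorService executorService = Executors.newFixedThreadPool(2);

        // Step 1: a task object implementing the Callable interface
        Callable<Integer> task = () -> 1 + 1;

        // Step 3: submit() returns a Future (a FutureTask under the hood)
        Future<Integer> future = executorService.submit(task);

        // Step 4: the main thread waits for the task to complete
        System.out.println("result = " + future.get());

        executorService.shutdown();
    }
}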

1.5 ThreadPoolExecutor class analysis

ThreadPoolExecutor is the core class of the Executor framework.

Let’s take a look at the core constructor of ThreadPoolExecutor

public ThreadPoolExecutor(int corePoolSize, //The number of core threads defines the minimum number of threads that can run at the same time.
                              int maximumPoolSize,//Maximum number of threads: when the tasks stored in the queue reach the capacity of the queue, the number of threads that can run at the same time becomes the maximum number of threads.
                              long keepAliveTime,//When the number of threads is larger than the number of core threads, the maximum survival time of redundant idle threads
                              TimeUnit unit,//Time unit
                              BlockingQueue<Runnable> workQueue,//Task queue
                              ThreadFactory threadFactory,//Thread factory
                              RejectedExecutionHandler handler) {//Rejection strategy
        if (corePoolSize < 0 ||
            maximumPoolSize <= 0 ||
            maximumPoolSize < corePoolSize ||
            keepAliveTime < 0)
            throw new IllegalArgumentException();
        if (workQueue == null || threadFactory == null || handler == null)
            throw new NullPointerException();
        this.acc = System.getSecurityManager() == null ?
                null :
                AccessController.getContext();
        this.corePoolSize = corePoolSize;
        this.maximumPoolSize = maximumPoolSize;
        this.workQueue = workQueue;
        this.keepAliveTime = unit.toNanos(keepAliveTime);
        this.threadFactory = threadFactory;
        this.handler = handler;
    }

The three most important ThreadPoolExecutor parameters:

  • corePoolSize: the number of core threads, which defines the minimum number of threads that can run at the same time.
  • maximumPoolSize: when the tasks stored in the queue reach the queue's capacity, the number of threads that can run at the same time grows up to this maximum.
  • workQueue: when a new task arrives, the pool first checks whether the number of currently running threads has reached corePoolSize; if it has, the new task is stored in this queue.

Other common ThreadPoolExecutor parameters:

  • keepAliveTime: when the number of threads in the pool is greater than corePoolSize, idle threads beyond the core are not destroyed immediately if no new tasks are submitted; they wait until their idle time exceeds keepAliveTime.
  • unit: the time unit of the keepAliveTime parameter.
  • threadFactory: used by the executor when it creates a new thread.
  • handler: the saturation (rejection) strategy.

    ThreadPoolExecutor saturation policies: if the number of currently running threads has reached the maximum and the queue is full of tasks, ThreadPoolExecutor applies one of the following policies:

    1. ThreadPoolExecutor.AbortPolicy: throws a RejectedExecutionException to reject the new task. This is the default rejection policy (see the sketch after this list).
    2. ThreadPoolExecutor.CallerRunsPolicy: runs the rejected task in the thread that called execute(), i.e. the caller executes the task itself. If the executor has been shut down, the task is discarded. This policy slows down the rate at which new tasks are submitted and can affect overall throughput, but it is a reasonable choice if the application can tolerate the delay and every task must eventually be executed.
    3. ThreadPoolExecutor.DiscardPolicy: silently discards the new task without handling it.
    4. ThreadPoolExecutor.DiscardOldestPolicy: discards the oldest unhandled task request in the queue.
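
A small sketch of the default AbortPolicy, using a hypothetical demo class with one worker thread and a queue of capacity one: once the thread and the queue are both occupied, the next execute() call is rejected.

import java.util.concurrent.*;

public class RejectionPolicyDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1,                                    // one core thread, one maximum thread
                0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(1),             // work queue with capacity 1
                Executors.defaultThreadFactory(),
                new ThreadPoolExecutor.AbortPolicy());   // default rejection policy

        Runnable slowTask = () -> {
            try { Thread.sleep(1000); } catch (InterruptedException ignored) { }
        };

        pool.execute(slowTask); // occupies the single worker thread
        pool.execute(slowTask); // waits in the queue
        try {
            pool.execute(slowTask); // thread and queue are both full: rejected
        } catch (RejectedExecutionException e) {
            System.out.println("third task rejected: " + e);
        }
        pool.shutdown();
    }
}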

Putting these parameters together: when a task is submitted, the pool first tries to run it on a core thread; once corePoolSize threads are running, new tasks are placed in workQueue; when the queue is full, extra threads are created up to maximumPoolSize; and when the maximum is also reached, the saturation policy (handler) decides what happens to the task.

1.6 thread pool creation method

1. Create the pool by calling the constructor of ThreadPoolExecutor directly.
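
A hedged sketch of this approach (all parameter values and the class name below are illustrative, not prescriptive):

import java.util.concurrent.*;

public class ConstructorCreationDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                5,                                          // corePoolSize
                10,                                         // maximumPoolSize
                60L, TimeUnit.SECONDS,                      // keepAliveTime and its unit
                new ArrayBlockingQueue<>(100),              // bounded workQueue
                Executors.defaultThreadFactory(),           // threadFactory
                new ThreadPoolExecutor.CallerRunsPolicy()); // rejection handler

        for (int i = 0; i < 10; i++) {
            final int taskId = i;
            executor.execute(() ->
                    System.out.println("task " + taskId + " on " + Thread.currentThread().getName()));
        }
        executor.shutdown();
    }
}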

2. Create the pool by using the static factory methods of the Executors class:

public static ExecutorService newFixedThreadPool(int nThreads) {
        return new ThreadPoolExecutor(nThreads, nThreads,
                                      0L, TimeUnit.MILLISECONDS,
                                      new LinkedBlockingQueue<Runnable>());
    }
public static ExecutorService newSingleThreadExecutor() {
        return new FinalizableDelegatedExecutorService
            (new ThreadPoolExecutor(1, 1,
                                    0L, TimeUnit.MILLISECONDS,
                                    new LinkedBlockingQueue<Runnable>()));
    }
public static ExecutorService newCachedThreadPool() {
        return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                      60L, TimeUnit.SECONDS,
                                      new SynchronousQueue<Runnable>());
    }
 public ScheduledThreadPoolExecutor(int corePoolSize,
                                       ThreadFactory threadFactory) {
        super(corePoolSize, Integer.MAX_VALUE, 0, NANOSECONDS,
              new DelayedWorkQueue(), threadFactory);
    }

From these static factory methods we can see that the different executors each have drawbacks:

  • FixedThreadPool and SingleThreadExecutor: the allowed length of the request queue is Integer.MAX_VALUE, so a large number of requests may pile up and cause an OOM (OutOfMemoryError).
  • CachedThreadPool and ScheduledThreadPool: the allowed number of threads is Integer.MAX_VALUE, so a large number of threads may be created and cause an OOM.

ThreadPoolExecutor class execution process: the same submission flow described above (core threads first, then the work queue, then non-core threads up to maximumPoolSize, and finally the rejection policy).

This work is licensed under a CC agreement. Reprints must credit the author and link to this article.