Jenkins / JENKINS-48882

Build Executor number is not consistent with EXECUTOR_NUMBER

    Details

      Description

      The Build Executor display number is not consistent with the number provided by the EXECUTOR_NUMBER variable.
      The display number should always be EXECUTOR_NUMBER+1, but it randomly shows different values.

      With 4-5 executors per node (master included) the issue seemed easier to reproduce, though not 100% of the time.

      It seems to be an issue only on the display side.
      We use EXECUTOR_NUMBER to keep Maven repository artifacts in separate folders when building in parallel, so the value is very important to avoid "cross contamination".
      We also use this variable to avoid re-downloading artifacts on every build, by mounting /var/lib/jenkins/.m2/repository/EXECUTOR_NUMBER inside a container.
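      As an illustration of the per-executor repository layout described above (the mount target and image name below are our own placeholders, not from the issue):

      ```shell
      #!/bin/sh
      # Compute the per-executor Maven repository path. EXECUTOR_NUMBER is set
      # by Jenkins during a build; default it here so the sketch runs standalone.
      EXECUTOR_NUMBER="${EXECUTOR_NUMBER:-0}"
      REPO="/var/lib/jenkins/.m2/repository/${EXECUTOR_NUMBER}"
      echo "repo=${REPO}"
      # Hypothetical container invocation (image name and mount point made up):
      # docker run -v "${REPO}:/root/.m2/repository" build-image mvn -B package
      ```

      If the displayed executor number and EXECUTOR_NUMBER disagree, two concurrent builds that appear to be on different executors in the UI can still collide on the same repository directory, which is why the mismatch matters here.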

      Jenkinsfile:

      node('master') {
          stage('on master') {
              // Value of EXECUTOR_NUMBER as seen by the Pipeline (Groovy) side
              echo EXECUTOR_NUMBER
              // Keep the executor busy long enough to compare with the UI
              sleep 30
              // Value of EXECUTOR_NUMBER as seen by the shell
              sh 'echo ${EXECUTOR_NUMBER}'
          }
      }
      
      node('docker') {
          stage('on slave') {
              echo EXECUTOR_NUMBER
              sleep 30
              sh 'echo ${EXECUTOR_NUMBER}'
          }
      }
      

      Not sure if there is a way to get the Build Executor display number programmatically, so for comparison I took a screenshot of the nodes page during the build.
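      For reference, one way to inspect executor state without the UI might be the Script Console; a hedged sketch using the Jenkins core API (not verified against the exact version in this report, and the UI ordering being number+1 is our assumption):

      ```groovy
      // Script Console sketch: list each node's executors, their 0-based
      // numbers, and what they are currently running. The nodes page is
      // believed to list executors in this order, i.e. as number+1.
      Jenkins.instance.computers.each { c ->
          c.executors.each { e ->
              println "${c.name} executor #${e.number}: ${e.currentExecutable ?: 'idle'}"
          }
      }
      ```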

            Activity

            p4karl Karl Wirth added a comment:

            If we assume that the executor defines which name is used for the workspace directory then this is causing similar problems for p4-plugin. The number appended to the directory for each executor (e.g. job-name@1) is not consistent with EXECUTOR_NUMBER. This makes it almost impossible to stop contamination of workspaces when concurrent builds occur. Increasing the priority. We see this on 2.138.2.


            Test Details

            Freestyle (p4-jenkins terminology) job. Build on slave only (unix slave with 4 executors, using "_" instead of "@" as the workspace separator). Concurrent jobs allowed.
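            (The "_" separator mentioned above corresponds, to our understanding, to the hudson.slaves.WorkspaceList Java system property, which Jenkins reads for the concurrent-build workspace suffix separator; a configuration sketch, assuming a WAR-based startup:)

            ```shell
            # Assumed startup invocation; hudson.slaves.WorkspaceList is the
            # property believed to control the "@" separator in job-name@N.
            java -Dhudson.slaves.WorkspaceList=_ -jar jenkins.war
            ```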

            Simple sync of stuff, build script is:

               echo "p4_client=$P4_CLIENT"
               echo "executor_number=$EXECUTOR_NUMBER"
               echo "workspace=$WORKSPACE"
               # sleep to see how concurrent builds behave
               sleep 10

            Conclusion: no correlation between EXECUTOR_NUMBER and workspace directory.

            Especially of note:

                build 4 has executor 2 and syncs to workspace/stream-simple-ant_4
                build 5 has executor 2 and syncs to workspace/stream-simple-ant_2

            Test is to "build now" 4 times in quick succession so that all are running at the same time.

            build 1:
            p4_client=jenkins-test-p4-u15.local-stream-simple-ant-1
            executor_number=1
            workspace=/jenkins-slave/workspace/stream-simple-ant

            build 2:
            p4_client=jenkins-test-p4-u15.local-stream-simple-ant-3
            executor_number=3
            workspace=/jenkins-slave/workspace/stream-simple-ant_2

            build 3:
            p4_client=jenkins-test-p4-u15.local-stream-simple-ant-0
            executor_number=0
            workspace=/jenkins-slave/workspace/stream-simple-ant_3

            build 4:
            p4_client=jenkins-test-p4-u15.local-stream-simple-ant-2
            executor_number=2
            workspace=/jenkins-slave/workspace/stream-simple-ant_4

            Wait 5 minutes, build 5:
            p4_client=jenkins-test-p4-u15.local-stream-simple-ant-2
            executor_number=2
            workspace=/jenkins-slave/workspace/stream-simple-ant


              People

              • Assignee: Unassigned
              • Reporter: ncosta Nuno Costa
              • Votes: 1
              • Watchers: 6
