  Jenkins / JENKINS-58085

BlueOcean UI stuck in "Waiting for run to start"

    Details

    • Type: Bug
    • Status: Reopened
    • Priority: Blocker
    • Resolution: Unresolved
    • Component/s: blueocean-plugin
    • Labels:
      None
    • Environment:
      Jenkins 2.180, BlueOcean 1.17.0
    • Released As:
      blue-ocean 1.19.0

      Description

      We recently upgraded BlueOcean from 1.16.0 to 1.17.0 and we started observing a weird behaviour in the BlueOcean pipeline UI.

      Frequently (though not always), the pipeline UI stops updating progress while the pipeline is running and gets stuck at "Waiting for run to start" (see attached screenshot). When this happens, it does not recover until the pipeline execution completes: once completed, the UI is correctly updated (all steps are green).

      We've also noticed that, when this happens, the underlying requests sent by the browser to the endpoint https://jenkins.DOMAIN/blue/rest/organizations/jenkins/pipelines/PROJECT/branches/master/runs/ID/nodes/ID/steps/ always return an empty array "[]" instead of the expected array of steps. By contrast, during the execution of the pipeline the "Console Output" (old Jenkins UI) correctly shows the progress of the pipeline, even while the BlueOcean UI is stuck at "Waiting for run to start".

      This issue seems to disappear if we roll back all BlueOcean plugins from 1.17.0 to 1.16.0.

        Attachments

        1. jenkins_build1.mov (970 kB)
        2. jenkins_build1.png (89 kB)
        3. jenkins_build2.mov (1.07 MB)
        4. jenkins_build2.png (96 kB)
        5. screenshot_2019-06-18_at_14.52.11.png (116 kB)
        6. Screenshot 2019-10-17 at 10.08.17.png (16 kB)

          Issue Links

            Activity

            pracucci Marco Pracucci created issue -
            habanero Diego Rodriguez added a comment -

            I'm running into this same exact issue as well after upgrading to BlueOcean 1.17.0 (currently on Jenkins 2.181)

             

             

            ppepe Pietro Pepe added a comment -

            Also in our infrastructure we get the same issue using BlueOcean 1.17.0 (on Jenkins 2.176.1)

            ppepe Pietro Pepe made changes -
            Field Original Value New Value
            Assignee Pietro Pepe [ ppepe ]
            ppepe Pietro Pepe made changes -
            Assignee Pietro Pepe [ ppepe ]
            jonathanb1 Jonathan B added a comment - edited

            We're also experiencing this after an upgrade to BlueOcean 1.17.0 and Jenkins 2.176.1. I filed https://issues.jenkins-ci.org/browse/JENKINS-58145 about it.

            I tried downgrading BlueOcean back to 1.16.0, but that didn't actually help. I also tried downgrading all the pipeline-model* plugins from 1.39 back to 1.38 (the version we were on when this was working properly), but that did not help either.

            From my hasty testing, my best guess right now is that the issue was introduced by one of workflow-step-api:2.20 (2.19 was fine), workflow-durable-task-step:2.31 (2.30 was fine), or workflow-cps:2.70 (2.69 was fine). Those are tricky to downgrade because there is a complicated web of dependencies that requires us to be on the latest of all of them.

            Since you mention that downgrading BlueOcean to 1.16 fixed it for you, I will have to try that again.

            pracucci Marco Pracucci added a comment -

            Jonathan B Downgrading to BlueOcean 1.16 worked for us. Note that, in our case, to downgrade correctly we had to uninstall all BlueOcean 1.17 plugins first and then re-install each of them individually at version 1.16, without using the "blueocean:1.16" meta package, which kept re-installing the 1.17 versions of the plugins. I don't know if there's an easier way, or whether this downgrade issue was specific to our setup (we didn't investigate much because we were buried with work).

            jonathanb1 Jonathan B made changes -
            Link This issue is duplicated by JENKINS-58145 [ JENKINS-58145 ]
            jonathanb1 Jonathan B added a comment - edited

            Marco Pracucci thank you. I revisited this and downgrading to Blue Ocean 1.16 did resolve the issue here as well. When I tried initially, I had downgraded only the metapackage `blueocean`, but all of the subpackages were still on 1.17.

            reinholdfuereder Reinhold Füreder added a comment - edited

            We are also experiencing this huge problem (though not all the time!?) – but we are hesitant to downgrade, as 1.17 contains one really, really helpful new feature: JENKINS-39203 (Devin Nusbaum: based on the BlueOcean changelog one naively assumes that this issue is actually caused by JENKINS-39203, so I naively linked these issues)

            toho Tobias Honacker added a comment -

            Same here. Blue Ocean is not working.

            Jenkins ver. 2.165 and Blue Ocean 1.17

             

            rmorrise Russell Morrisey added a comment -

            I don't know if it's related, but we started seeing this message show up randomly in our Jenkins build log:
            sh: line 1: 3550 Terminated sleep 3
            I could not trace the "sleep 3" command back to any script in our build pipeline.

             

            reinholdfuereder Reinhold Füreder made changes -
            Link This issue is caused by JENKINS-39203 [ JENKINS-39203 ]
            dnusbaum Devin Nusbaum added a comment - edited

            Based on the BlueOcean changelog one naively assumes that this issue is actually caused by JENKINS-39203

            I'd be surprised if that was related; any problems with that change should just cause the wrong result status for a single stage, so the fact that the API in the issue description is returning an empty array for the steps in a stage makes me think something else is broken. Looking at the changelog, I suspect it is related to the fix for JENKINS-53816, especially given some of the comments on that ticket mentioning that it might have made things worse in some cases.

            Russell Morrisey I think you are running into JENKINS-55308, which is unrelated as far as I know.

            dnusbaum Devin Nusbaum made changes -
            Link This issue relates to JENKINS-53816 [ JENKINS-53816 ]
            reinholdfuereder Reinhold Füreder added a comment -

            Devin Nusbaum Sorry, I guess you are right => I'll adapt the issue links

            reinholdfuereder Reinhold Füreder made changes -
            Link This issue is caused by JENKINS-53816 [ JENKINS-53816 ]
            reinholdfuereder Reinhold Füreder made changes -
            Link This issue is caused by JENKINS-39203 [ JENKINS-39203 ]
            elliotg Elliot Graebert added a comment - edited

            I'm also running into the same issue, which I also commented on here: https://issues.jenkins-ci.org/browse/JENKINS-49131

            This issue is very frustrating, as it makes the entire CI pipeline look like it's hung.

             

            Jenkins 2.187 and Blue Ocean 1.18.0

            rmorrise Russell Morrisey added a comment -

            We updated our plugins over the weekend, but we are still unable to see the input stage to approve PROD deployments.

            This is a show-stopper for us. We are dropping all usage of Blue Ocean (except for new pipeline setup) until it's resolved.

             

            timewalker75a Dmitry Seryogin added a comment -

            With Jenkins 2.164.1, after updating to 1.17.0 and subsequently 1.18.0, we are facing the same issue. We have multiple product versioning and deployment pipelines with when-conditions that alter the stage behaviour. Most pipelines have now been affected by this problem, where stages after when-condition stages just appear dead with the 'waiting' message.

            t3rm1 Unknown Unknown added a comment -

            Same here. No input steps are shown. We are not able to finish a pipeline. This is a disaster.

            t3rm1 Unknown Unknown made changes -
            Priority Major [ 3 ] Blocker [ 1 ]
            t3rm1 Unknown Unknown added a comment -

            After reading through all the comments I have come to the conclusion that Blue Ocean must be dead and abandoned. The issue is over 2 months old and breaks Blue Ocean completely. How is it possible that nobody has fixed this the minute after it was reported?

            dnusbaum Devin Nusbaum made changes -
            Remote Link This issue links to "jenkinsci/blueocean-plugin#2017 (Web Link)" [ 23420 ]
            dnusbaum Devin Nusbaum made changes -
            Assignee Devin Nusbaum [ dnusbaum ]
            dnusbaum Devin Nusbaum made changes -
            Status Open [ 1 ] In Progress [ 3 ]
            dnusbaum Devin Nusbaum added a comment -

            I filed a PR that should fix at least some variants of this issue: https://github.com/jenkinsci/blueocean-plugin/pull/2017. I think the main ways to hit this bug are when the Pipeline's execution path in terms of steps/stages changes from one run to the next (for example if a when condition is activated in one build but not in the next, if you have a Scripted Pipeline that conditionally creates stages using Groovy, or if you changed the Jenkinsfile manually). If anyone has a simple reproducer (just a Jenkinsfile that runs without needing to configure anything special in Jenkins) that does not involve any of those things (or even if it does involve those things), I would be interested to see it to check whether my patch fixes it or not.
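
            For the Scripted Pipeline case, a purely illustrative sketch (the stage names and the even/odd condition are made up, not taken from anyone's actual Jenkinsfile) would be something along these lines, where a stage only exists on some runs:

            node {
                stage('Build') {
                    sleep 10
                }
                // This stage is only created on even-numbered builds, so the flow graph of the
                // current run can differ from the last successful run that Blue Ocean uses to
                // predict the graph.
                if (currentBuild.number % 2 == 0) {
                    stage('Optional') {
                        sleep 10
                    }
                }
                stage('Deploy') {
                    sleep 10
                }
            }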

            dnusbaum Devin Nusbaum made changes -
            Status In Progress [ 3 ] In Review [ 10005 ]
            dnusbaum Devin Nusbaum added a comment -

            Again, any additional information that anyone has on specific Pipelines that reproduce the issue would be welcome so that I can investigate how my proposed changes will affect them.

            elliotg Elliot Graebert added a comment -

            Hey Devin,

            So we were able to consistently reproduce the issue on a single Pipeline (it would fail in this way every time). We weren't able to reproduce it by running the pipeline elsewhere, which is weird. We deleted the job and all its history, and then the issue went away. It sounds like this bug may be connected to something that is stored persistently, which I know isn't super helpful.

            I'm keeping my eye out for a future occurrence of the issue. If we see it again, what can I grab out of the persistent data that would affect the Blue Ocean UI?

            dnusbaum Devin Nusbaum added a comment - edited

            Elliot Graebert Blue Ocean tries to combine in-progress builds with the last successful build of the project so that it can try to predict what the graph will look like, rather than only showing the progress based on the current build. Problems happen when the flow of execution from the last successful build is slightly different than the in-progress build, but similar enough that Blue Ocean still tries to merge the graphs (for example because of a when whose condition was true in the last build but not this build).

            If you see it again, the minimum data to include would be your Jenkinsfile (generally only the overall structure matters, the exact steps you run are not important). Ideally you would be able to upload the build folders of the last successful build and the current build ($JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_NUMBER) so we can compare the FlowNodes in their workflow folders, which contain the exact data that Blue Ocean is using to create the visualization.
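
            If uploading the folders is not possible, a rough Script Console sketch along these lines (the job name and build number are placeholders, not anyone's real values) would print the FlowNode ids and names for a run, which is essentially the same data:

            import jenkins.model.Jenkins
            import org.jenkinsci.plugins.workflow.job.WorkflowJob
            import org.jenkinsci.plugins.workflow.graphanalysis.DepthFirstScanner

            // Placeholders: use your own full job name and build number.
            def job = Jenkins.instance.getItemByFullName('my-folder/my-job/master', WorkflowJob)
            def run = job.getBuildByNumber(42)
            def execution = run?.execution // null if the build never started
            if (execution != null) {
                new DepthFirstScanner().allNodes(execution.currentHeads).each { node ->
                    println "${node.id}\t${node.class.simpleName}\t${node.displayName}"
                }
            }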

            timewalker75a Dmitry Seryogin added a comment - edited

            Whenever the pipeline below (with some steps taken out of earlier stages), which we use to deploy releases into a target environment, runs and Delay is not 0, the Wait step completes successfully but the following Deploy stage then gets affected by the bug in question. If Delay is skipped due to the when-condition (be it a re-run or a 0 min delay), the pipeline works as intended in BlueOcean.

            Edit: The problem surfaces consistently on each and every run against the test environments, where a 5 min delay is enforced after having deployed to predelivery (where the delay is 0).

            pipeline {
            	agent none
            	parameters {
            		gitParameter name: 'release',
            			type: 'PT_TAG',
            			branchFilter: 'origin/(.*)',
            			tagFilter: 'v*',
            			sortMode: 'DESCENDING_SMART',
            			selectedValue: 'TOP',
            			useRepository: 'repo',
            			quickFilterEnabled: true,
            			listSize: '5',
            			description: 'git tag'
            
            		choice name: 'environment',
            			choices: getAdminNodes('admin'),
            			description: 'target deployment environment'
            
            		choice name: 'delay',
            			choices: ['0 min', '1 min', '5 min', '10 min', '15 min'],
            			description: 'delay before starting the deploy phase'
            
            		choice name: 'delta_run',
            			choices: ['Yes','No'],
            			description: 'install only the delta compared to prior release'
            
            		booleanParam name: 'deploy_validation',
            			defaultValue: false,
            			description: 'issue additional sanity checks during deploy phase'
            	}
            	options {
            		buildDiscarder(logRotator(numToKeepStr: '5', daysToKeepStr: '30'))
            		timeout(time: 2, unit: 'HOURS')
            		skipStagesAfterUnstable()
            		timestamps()
            		skipDefaultCheckout true
            	}
            	stages {
            		stage  ('SCM') {
            			agent { label 'rel-sbox-pup-a01' }
            			steps {
            				cleanWs notFailBuild: true
            				checkout([
            					$class: 'GitSCM',
            					branches: [[  name: "refs/tags/${params.release}" ]],
            					userRemoteConfigs: [[
            						credentialsId: 'id',
            						url: 'repo'
            					]]
            				])
            			}
            		}
            		stage ('Prepare') {
            			agent { label params.environment }
            			steps {
            				script {
            					String node = env.environment.split('\\-')[1].toUpperCase()
            					if ('CHI'.equals(node) || 'RHO'.equals(node)) {
            						env.delay = '5 min'
            					} else {
            						env.delay = '0 min'
            						println 'Not a test environment - commence immediate deployment'
            					}
            					readFile('/dsa/versions.ctl').split("\r?\n").each { String line ->
            						if (line.equals(env.release)) {
            							println env.release + ' is a known version for ' + env.environment + ', skip project notification'
            							env.delay = '0 min'
            						}
            					}
            					currentBuild.displayName = "#${BUILD_NUMBER} - " + node + " - ${params.release}"
            				}
            			}
            		}
            		stage ('Package') {
            			agent { label 'rel-sbox-pup-a01' }
            			steps {
            				sh label: 'Gather Artefacts', script:
            				"""
            					if [ -z ${M2_HOME} ]; then
            						echo "Maven not configured, abort further actions"
            						exit 1
            					fi
            					mvn -version
            					env_id=`echo ${environment} | cut -d '-' -f2`
            					case \$env_id in
            						prede) mount_point_id="new";;
            							*) mount_point_id=\$env_id
            					esac
            					mount_point=/dsa-\$mount_point_id
            					${WORKSPACE}/tools/mvn/svc_fetch_delivery.sh \$mount_point
            				"""
            			}
            		}
            		stage ('Preflight') {
            			agent { label params.environment }
            			steps {
            				script {
            					env.DELIVERY_DIR = sh(label: 'Delivery Path',
            						script:
            						"""
            							ver_prefix=`echo ${release} | awk -F'_' '{print \$1 "_" \$2 "_next_"}'`
            							delivery_base="/dsa/infrastructure"
            							if [ ${deploy_validation} = "true" ]; then
            								delivery_dir=\$delivery_base/delivery/${release}
            							else
            								delivery_type="full"
            								if [ ${delta_run} = "Yes" ]; then
            									delivery_type="delta"
            								fi
            								delivery_dir=\$delivery_base/delivery/\$ver_prefix\$delivery_type
            							fi
            							echo \$delivery_dir
            						"""
            						, returnStdout: true
            					).trim()
            				}
            				sh label: 'Gather Prerequisites', script:
            				"""
            					echo ${PWD}
            					cd ${DELIVERY_DIR}
            					${DELIVERY_DIR}/delivery.rec.0-2.prereq.load.sh
            				"""
            			}
            		}
            		stage ('Delay') {
            			agent { label params.environment }
            			steps {
            				sh label: 'Wait', script:
            				"""
            					num=`echo ${delay} | cut -d ' ' -f1`
            					if [ \$num -ne 0 ]; then
            						sleep=\$((num * 60))
            						sleep \$sleep
            					fi
            				"""
            			}
            			when {
            				allOf {
            					not {
            						environment name: 'delay',
            						ignoreCase: true,
            						value: '0 min'
            					}
            					not {
            						isRestartedRun()
            					}
            				}
            			}
            		}
            		stage ('Deploy') {
            			agent { label params.environment }
            			steps {
            				withCredentials([
            				usernamePassword(
            					credentialsId: 'id',
            					passwordVariable: 'FMW_PASSWORD',
            					usernameVariable: 'FMW_USER')
            				]) {
            					sh label: 'Install', script:
            					"""
            						cd ${DELIVERY_DIR}
            						${DELIVERY_DIR}/install.sh
            					"""
            				}
            			}
            		}
            	}
            	post {
            		success {
            			build job: 'ADM.DATA_SYNC', propagate: false
            		}
            	}
            }
            
            dnusbaum Devin Nusbaum added a comment -

            Dmitry Seryogin Thanks! That looks like the problem I described with when, where the value of the condition changes from run to run, so it should be covered by my patch. I think the simplest reproduction of that kind of issue is a Pipeline like this:

            pipeline {
                stages {
                    stage('First') {
                        steps { sleep 10 }
                    }
                    stage('Second') {
                        when { expression { (currentBuild.number % 0) == 0 } } // Run on even builds, skip on odd builds.
                        steps { sleep 10 }
                    }
                    stage('Third') {
                        steps { sleep 10 }
                    }
                }
            }
            

            The "Second" stage should show the bug on every build after the first one.

            whatsdevops Angelo Loria added a comment -

            I am seeing this issue with every pipeline run; all pipelines are Bitbucket Branch Source jobs. 

            Pipeline example w/ all details removed. This pipeline is called by the jenkinsfile in the solution.

             

            // this code allows for entire pipeline to be called from jenkinsfile in solution
            def call(body) {
                // evaluate the body block, and collect configuration into the object
                def params = [:]
                body.resolveStrategy = Closure.DELEGATE_FIRST
                body.delegate = params
                body()

                def deployUtils = new DeployUtils(this)
                def gitUtils = new GitUtils(this)
                def jiraUtils = new JiraUtils(this)

                // sets parameters ahead of pipeline being executed
                node('master') {
                    stage('Gathering Parameters for Build') {
                        switch(env.branch_name) {
                            case ~/hotfix.*/:
                        }
                    }
                }

                pipeline {
                    agent {
                        label "${agentLabel}"
                    }
                    options {
                        timestamps()
                        disableConcurrentBuilds()
                    }
                    stages {
                        stage('Building sln') {
                        }
                        stage('Publishing') {
                        }
                    }
                    post {
                        always {
                        }
                        failure {
                        }
                        success {
                        }
                        cleanup {
                        }
                    }
                }
            }
             

            timewalker75a Dmitry Seryogin added a comment -

            I think the simplest reproduction of that kind of issue is a Pipeline like this

            Given you meant modulo 2 rather than 0, yeah, that does reproduce the problem just right - 3rd step is sitting there with 'waiting for run to start' until the step actually completes and the node renders with the green tick.
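
            For reference, the same reproducer with the modulo corrected (and an agent added so the sketch runs standalone) looks like this:

            pipeline {
                agent any
                stages {
                    stage('First') {
                        steps { sleep 10 }
                    }
                    stage('Second') {
                        // Runs on even builds and is skipped on odd builds, so the stage graph
                        // alternates between consecutive runs.
                        when { expression { (currentBuild.number % 2) == 0 } }
                        steps { sleep 10 }
                    }
                    stage('Third') {
                        steps { sleep 10 }
                    }
                }
            }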

            dnusbaum Devin Nusbaum added a comment -

            Blue Ocean 1.19.0 was just released with a fix for at least some aspects of this issue. Please try it out, and if you are still seeing the problem, post a minimal Jenkinsfile that reproduces the behavior you are seeing.

            dnusbaum Devin Nusbaum made changes -
            Status In Review [ 10005 ] Resolved [ 5 ]
            Resolution Fixed [ 1 ]
            Released As blue-ocean 1.19.0
            bbroere Bas Broere made changes -
            bbroere Bas Broere added a comment - edited

            Sorry, I'm not in a position to post a minimal Jenkinsfile, but the problem still persists in the pipeline added as an attachment.
            Version: Blue Ocean 1.19.0 · Core 2.199 · e743640 · 4th September 2019 01:20 AM

            brianjmurrell Brian J Murrell added a comment -

            Can we please have this re-opened? I can confirm it's still happening here on 1.19.0 as well.

            This, together with stage-view being pretty crippled, means finding logs of currently-running pipelines (i.e. in Pipeline Steps) is pretty cumbersome even for a Jenkins pro, and pretty much unusable for casual users given its lack of collapsibility, etc.

            dnusbaum Devin Nusbaum added a comment -

            Bas Broere Brian J Murrell Do you have a minimal and/or independent Jenkinsfile that reproduces the issue you are seeing, or can you at least post the Jenkinsfile that you do have? Without more details, there isn't really much we can do.

            brianjmurrell Brian J Murrell added a comment -

            https://raw.githubusercontent.com/daos-stack/daos/master/Jenkinsfile often exhibits the problem in the Functional/Functional_Hardware stages.

            ian_ookla Ian Wallace-Hoyt added a comment -

            The issue repros nearly 100% of the time using this pipeline when a "buddy build" runs.

            Key details:

            • Jenkins 2.204.1
            • Blue Ocean 1.21.0
            • Amazon EC2 plugin 1.47
            • ec2-android-toolchain: this is an ec2 instance provisioned using the ec2 plugin
            • Issue repros when a PR is opened, which causes:
              • "Release: build" stage is skipped
              • Stages "Buddy: build apks" and "Buddy: jvm verify" execute in parallel
            • "Buddy: build apks" stage always shows "Queued: Waiting for run to start" in BlueOcean, even though it runs and completes.
            • When "Buddy: build apks" stage completes successfully, BlueOcean updates to show all the correct stage information.
            • "Buddy: jvm verify" stage shows progress correctly in BlueOcean
            def buildAgent = 'ec2-android-toolchain'
            
            pipeline {
                agent none
            
                options{
                    timestamps()
                    timeout(time: 2, unit: 'HOURS')
                    parallelsAlwaysFailFast()
                }
            
                parameters {
                    string(
                            defaultValue: "",
                            description: 'If set, reruns the device test stages only,using artifact from specified build id, skipping all other stages',
                            name: "RERUN_DEVICETEST",
                    )
            
                    booleanParam(
                            defaultValue: false,
                            description: 'Publish release build to crashlytics beta',
                            name: "PUBLISH_BETA",
                    )
                }
            
                environment {
                    //Disable gradle daemon
                    GRADLE_OPTS = "-Dorg.gradle.daemon=false"
                }
            
                stages {
                    stage('Jenkins') {
                        parallel {
                            stage('Release: build') {
                                when {
                                    beforeAgent true
                                    equals expected: PipelineMode.RELEASE, actual: getPipelineMode()
                                    expression { !isDeviceTestOnly() }
                                }
            
                                agent {
                                    label buildAgent
                                }
            
                                environment {
                                    // Number of build numbers to allocate when building the project
                                    // (release builds only)
                                    O2_BUILD_VERSION_BLOCK_SIZE = "64"
            
                                    ORG_GRADLE_PROJECT_publishToBeta = "${params.PUBLISH_BETA ? '1' : '0'}"
                                }
            
                                steps {
                                    installJenkinsSshKey()
                                    checkoutRepo(steps)
                                    sh "./other/jenkins/job_release_build.sh"
                                }
            
                                post {
                                    always {
                                        addArtifacts()
                                        processJvmJunitResults()
                                    }
                                }
                            }
            
                            stage('Buddy: build apks') {
                                when {
                                    beforeAgent true
                                    equals expected: PipelineMode.BUDDY, actual: getPipelineMode()
                                    expression { !isDeviceTestOnly() }
                                }
            
                                agent {
                                    label buildAgent
                                }
            
                                steps {
                                    installJenkinsSshKey()
                                    checkoutRepo(steps)
                                    sh '''#!/bin/bash
            
                                        . "other/jenkins/build_lib.sh"
            
                                        "$ST_ROOT/other/jenkins/initialize_build.sh"
            
                                        execGradle --stacktrace jenkinsBuddyStageBuild
            
                                        "$ST_ROOT/other/jenkins/job_build_upload_sharedartifacts.sh"
                                    '''
                                }
            
                                post {
                                    always {
                                        addArtifacts()
                                    }
                                }
                            }
            
                            stage('Buddy: jvm verify') {
                                when {
                                    beforeAgent true
                                    equals expected: PipelineMode.BUDDY, actual: getPipelineMode()
                                    expression { !isDeviceTestOnly() }
                                }
            
                                agent {
                                    label buildAgent
                                }
            
                                steps {
                                    installJenkinsSshKey()
                                    checkoutRepo(steps)
                                    sh '''#!/bin/bash
            
                                        . "other/jenkins/build_lib.sh"
            
                                        "$ST_ROOT/other/jenkins/initialize_build.sh"
            
                                        execGradle --stacktrace jenkinsBuddyStageVerifyAndTest
                                    '''
                                }
            
                                post {
                                    always {
                                        processJvmJunitResults()
                                    }
                                }
                            }
                        }
                    }
                }
            
                post {
                    failure {
                        notifyUnhealthy()
                    }
            
                    fixed {
                        notifyFixed()
                    }
            
                    unstable {
                        notifyUnhealthy()
                    }
                }
            }
            
            def checkoutRepo(steps) {
                steps.sh '''#!/bin/bash
                git submodule init
                git submodule update
                '''
            }
            
            /**
             * This should only be used on non-ec2 agents. Roles are used to properly authorize
             * agents running in ev2
             *
             * @param cl
             * @return
             */
            @SuppressWarnings("GroovyInfiniteRecursion") // arg list is different and doesn't recurse
            def withAwsCredentials(Closure cl) {
                withAwsCredentials('**********', cl)
            }
            
            /**
             * Installs the jenkins ssh key managed by jenkins
             */
            def installJenkinsSshKey() {
                installSshKey('jenkins-ssh-key',
                        "${env.HOME}/.ssh/id_rsa")
            }
            
            /**
             * Installs an ssh managed by jenkins to the specified path
             *
             * @param credentialId
             * @param destPath
             */
            def installSshKey(String credentialId, String destPath) {
                withCredentials([sshUserPrivateKey(
                        keyFileVariable: 'THEKEY',
                        credentialsId: credentialId)]) {
            
                    sh """
                        cp "$THEKEY" "$destPath"
                        chmod 0600 "$destPath"
                    """
                }
            }
            
            enum PipelineMode {
                BUDDY, // Buddy mode for pipeline
                RELEASE, // Release mode for pipeline
                UNKNOWN // Unknown/unsupported mode. All stages skipped.
            }
            
            /**
             * Each jenkins pipeline execution runs in a single mode
             */
            def getPipelineMode() {
                if (env.CHANGE_ID != null) {
                    return PipelineMode.BUDDY
                } else if (isMainlineBranch()) {
                    return PipelineMode.RELEASE
                } else {
                    return PipelineMode.UNKNOWN
                }
            }
            
            /**
             * Returns true if any of the current branches are a mainline branch, otherwise false
             */
            def isMainlineBranch() {
                def mainlines = [ /^master$/, /^release\/.*$/, /^topic\/.*$/ ]
                return null != scm.branches.find { branch ->
                    mainlines.find { mainlineRegEx ->
                        return branch ==~ mainlineRegEx
                    }
                }
            }
            
            /**
             * Return true if only the device test stages should run
             * @return
             */
            def isDeviceTestOnly() {
                return !params.RERUN_DEVICETEST.isEmpty()
            }
            
            /**
             * Returns true if the device test stage(s) should run
             */
            def shouldRunDeviceTest() {
                return getPipelineMode() != PipelineMode.UNKNOWN
            }
            
            /**
             * Given a steps element, configures it to run the given shard indexes
             */
            def defineStepsForDeviceTest(steps, int totalShards, int...shardIndexes) {
                steps.with {
                    checkoutRepo(steps)
            
                    withAwsCredentials {
                        def url = getArtifactShareUrl("build", deviceTestInputBuildId())
                        sh "./other/jenkins/devicetest_ensure_ready.sh '$url'"
                    }
            
                    script {
                        // PR builds merge the PR HEAD into the target branch. In this case
                        // we just want to pass the PR branch to the manual job. If
                        // it isn't a PR build, the pass the commit.
                        def commitToBuild = env.CHANGE_BRANCH?.trim()
                        if (!commitToBuild) {
                            commitToBuild = env.GIT_COMMIT
                        }
            
                        build job: "/Manual Builds/speedtestnet-android/deviceTest",
                                parameters: [
                                        string(name: 'O2_INPUT_URL', value:
                                                getArtifactShareUrl("build", deviceTestInputBuildId())),
                                        string(name: 'O2_OUTPUT_URL', value:
                                                getArtifactShareUrl("deviceTest", deviceTestInputBuildId())),
                                        string(name: 'O2_COMMIT', value: commitToBuild),
                                        string(name: 'O2_BRANCH', value: env.GIT_BRANCH),
                                        string(name: 'O2_SHARD_COUNT', value: "${totalShards}"),
                                        string(name: 'O2_SHARD_INDEXES', value: shardIndexes.join(','))
                                ],
            
                                // Don't fail the stage if downstream job fails. We want to process
                                // test results from unstable builds in our pipeline
                                propagate: false
                    }
                }
            }
            
            /**
             * Build id to use as input for device test stage
             */
            def deviceTestInputBuildId() {
                return params.RERUN_DEVICETEST.isEmpty()
                        ? env.BUILD_ID
                        : params.RERUN_DEVICETEST
            }
            
            /**
             * Get the artifact share path for the given stage.
             * @param stage the stage with which the path is associated
             * @param buildId (optional) build id to use, defaults to current build id
             */
            def getArtifactShareUrl(String stage, String buildId = env.BUILD_ID) {
                return sh(returnStdout: true, script: "./other/jenkins/share_artifacts.sh "
                        + "--action getShareUrl "
                        + "--jobName ${env.JOB_NAME} --jobBuild ${buildId} --jobStage ${stage}"
                ).trim()
            }
            
            /**
             * Process junit results from the jvm tests
             */
            def processJvmJunitResults() {
                junit '**/build/test-results/**/*.xml'
            }
            /**
             * Add the archived apk's as artifacts
             */
            def addArtifacts() {
                archiveArtifacts artifacts:"Mobile4/build/outputs/apk/**/*.apk", fingerprint:true
            }
            
            def notifyUnhealthy() {
                if (getPipelineMode() != PipelineMode.RELEASE) {
                    return
                }
            
                slackSend(channel: '#team-android-dev',
                        color: 'danger',
                        message: "<${currentBuild.absoluteUrl}|${currentBuild.fullDisplayName}>"
                                + " :jenkins_angry:  is unhealthy.")
            }
            
            def notifyFixed() {
                if (getPipelineMode() != PipelineMode.RELEASE) {
                    return
                }
            
                slackSend(channel: '#team-android-dev',
                        color: 'good',
                        message: "<${currentBuild.absoluteUrl}|${currentBuild.fullDisplayName}>"
                                + ":jenkins: is healthy again.")
            }
            Hide
            dnusbaum Devin Nusbaum added a comment -

            Ian Wallace-Hoyt Thanks for the Jenkinsfile! My best guess is that the unfixed part of the issue has to do with having when expressions on more than one stage, or maybe beforeAgent: true, but I'm not sure. Do you have a screenshot of what the graph looks like for your Pipeline when you have the issue?

            Given the comments, I'll go ahead and reopen the issue. I am not sure whether the fixes in Blue Ocean 1.19.0 fixed a significant amount of the ways this problem could happen, and we are just left with some special cases, or if there are still a lot of outstanding issues.
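
            As a hedged illustration (none of this is code from the attached Jenkinsfiles; the 'linux' label, stage names, and timings are placeholders), a stripped-down pipeline that isolates the suspected pattern - two parallel stages that both carry when conditions with beforeAgent true, one of which gets skipped - could look like the sketch below. If the hypothesis holds, the stage that does run would be expected to sit at "Waiting for run to start" in Blue Ocean even while the classic console output progresses.

            pipeline {
                agent none
                stages {
                    stage('Parallel work') {
                        parallel {
                            stage('Runs') {
                                when {
                                    beforeAgent true
                                    expression { return true }   // this stage executes
                                }
                                agent { label 'linux' }          // placeholder label
                                steps {
                                    // run long enough to watch the Blue Ocean graph while the stage is active
                                    sh 'for i in $(seq 1 30); do echo "step $i"; sleep 2; done'
                                }
                            }
                            stage('Skipped') {
                                when {
                                    beforeAgent true
                                    expression { return false }  // this stage is always skipped
                                }
                                agent { label 'linux' }
                                steps {
                                    echo 'never reached'
                                }
                            }
                        }
                    }
                }
            }

            Dropping beforeAgent true from both stages, or removing the when condition from the skipped stage, would be a cheap A/B check against the same hypothesis.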

            dnusbaum Devin Nusbaum made changes -
            Resolution Fixed [ 1 ]
            Status Resolved [ 5 ] Reopened [ 4 ]
            Assignee Devin Nusbaum [ dnusbaum ]
            Hide
            brianjmurrell Brian J Murrell added a comment -

            Devin Nusbaum A screenshot of the graph for the pipeline when in this state looks just like the existing two screenshots.

            Hide
            dnusbaum Devin Nusbaum added a comment -

            Brian J Murrell Yes, I mean for Ian Wallace-Hoyt's Pipeline in particular; since we have the associated Jenkinsfile, we could use the graph to see which other stages are being shown or not, and in what state. Even better would be a screenshot of the graph from the build with the issue and a screenshot of the graph for the previous build, since the problem relates to how those graphs are being combined.

            ian_ookla Ian Wallace-Hoyt made changes -
            Attachment jenkins_build1.png [ 50304 ]
            ian_ookla Ian Wallace-Hoyt made changes -
            Attachment jenkins_build2.png [ 50305 ]
            ian_ookla Ian Wallace-Hoyt made changes -
            Attachment jenkins_build1.mov [ 50306 ]
            ian_ookla Ian Wallace-Hoyt made changes -
            Attachment jenkins_build2.png [ 50307 ]
            Hide
            ian_ookla Ian Wallace-Hoyt added a comment -

            Devin Nusbaum

             

            I opened a new PR and let it build. When it completed successfully, I rebuilt it. Same thing in both cases: the "Buddy: build apks" stage shows as queued.

             

            I took the screenshots and videos after verifying that the stage was actually building by looking at the node view.

             

            jenkins_build1.png

            jenkins_build1.mov

            jenkins_build2.png

            jenkins_build2.mov

             

            ian_ookla Ian Wallace-Hoyt made changes -
            Attachment jenkins_build2.mov [ 50308 ]
            ian_ookla Ian Wallace-Hoyt made changes -
            Attachment jenkins_build2.png [ 50305 ]
            Hide
            rankylau Ranky Lau added a comment -

            Is there anyone working on this issue right now?

            Hide
            borisivan boris ivan added a comment -

            This one really hurts. Hoping it can be fixed, since it really is a bug and not an enhancement.

            Hide
            rasmuskvoss__ Rasmus Voss added a comment - - edited

            Hi,

            I have the same issue with a stage like the one below.

            Behind the scenes the stage is running without showing any logs in Blue Ocean.

            So eventually the pipeline completes.

            stage('HTML Clients') {
                parallel {
                    stage('Designer - msch21') {
                        when { environment name: 'BRANCH_TYPE', value: 'branch' }
                        steps {
                            sh(script: '''
                                command
                            ''')
                        }
                    }
                    stage('Designer - selfhost') {
                        when { environment name: 'BRANCH_TYPE', value: 'release' }
                        steps {
                            sh(script: '''
                                command
                            ''')
                        }
                    }
                    stage('System-On - msch21') {
                        when { environment name: 'BRANCH_TYPE', value: 'branch' }
                        steps {
                            sh(script: '''
                                command
                            ''')
                        }
                    }
                    stage('System-On - selfhost') {
                        when { environment name: 'BRANCH_TYPE', value: 'release' }
                        steps {
                            sh(script: '''
                                command
                            ''')
                        }
                    }
                }
            }
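
            As an illustrative aside (this is not the actual job configuration), the fragment above can be reduced to a self-contained pipeline for reproducing: wrap it in a pipeline block, hard-code BRANCH_TYPE in an environment block, and replace the site-specific commands with echo/sleep placeholders. Note that this variant uses when without beforeAgent, which may be relevant to the beforeAgent hypothesis discussed earlier.

            pipeline {
                agent any
                environment {
                    // hard-coded so that exactly one of the two parallel stages is skipped
                    BRANCH_TYPE = 'branch'
                }
                stages {
                    stage('HTML Clients') {
                        parallel {
                            stage('Designer - branch') {
                                when { environment name: 'BRANCH_TYPE', value: 'branch' }
                                steps {
                                    // stand-in for the real command; runs long enough to watch the UI
                                    sh 'for i in $(seq 1 20); do echo "building $i"; sleep 3; done'
                                }
                            }
                            stage('Designer - release') {
                                when { environment name: 'BRANCH_TYPE', value: 'release' }
                                steps {
                                    echo 'skipped on branch builds'
                                }
                            }
                        }
                    }
                }
            }

            If the running stage's steps display normally with this reduced pipeline, the differences from the affected job (agent provisioning, shared libraries, number of parallel branches) would be the next things to narrow down.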
            

             

             

            Hide
            d1morto Donald Morton added a comment -

            I'm seeing the same issue on Jenkins 2.235.3 and Blue Ocean 1.23.2.


              People

              • Assignee: Unassigned
              • Reporter: pracucci Marco Pracucci
              • Votes: 29
              • Watchers: 39

                Dates

                • Created:
                • Updated: