Jenkins / JENKINS-41929

Offer "Build with Parameters" on first build when declarative Jenkinsfile found

    Details


      Description

      By default a branch project will automatically run the first build, with no parameters, so params will just pick up any default values. You have the option to suppress the automatic first build, but this does not give you any way to enter parameters for it (at least in the UI; perhaps possible via CLI/REST), since Jenkins does not know what the parameters are going to be until it starts running. But in the case of Declarative we could in principle inspect the Jenkinsfile when the branch project is created (via SCMFileSystem) and determine the parameter definitions by static parsing without actually running.

      More generally, if Declarative is in use and there are properties, we could set all the project properties when the branch project is created, even if the first build is run automatically. (Though I would suggest that the automatic first build should be automatically suppressed if there is a ParametersDefinitionProperty.)
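For illustration, a minimal Declarative Jenkinsfile of the kind the description has in mind (names hypothetical) — everything inside the parameters block is static, so the definitions could in principle be extracted via SCMFileSystem without executing the pipeline:

```groovy
// Hypothetical Jenkinsfile: the parameters block contains only literal
// values, so a static parser could recover the parameter definitions
// without running the build.
pipeline {
    agent any
    parameters {
        string(name: 'DEPLOY_ENV', defaultValue: 'staging', description: 'Target environment')
        booleanParam(name: 'RUN_TESTS', defaultValue: true, description: 'Run the test suite')
    }
    stages {
        stage('Build') {
            steps {
                echo "Deploying to ${params.DEPLOY_ENV}"
            }
        }
    }
}
```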

        Attachments

          Issue Links

            Activity

            jglick Jesse Glick created issue -
            jglick Jesse Glick made changes -
            Field Original Value New Value
            Link This issue relates to JENKINS-41865 [ JENKINS-41865 ]
            jglick Jesse Glick made changes -
            Link This issue is duplicated by JENKINS-42922 [ JENKINS-42922 ]
            abayer Andrew Bayer made changes -
            Link This issue is duplicated by JENKINS-40241 [ JENKINS-40241 ]
            michaelneale Michael Neale added a comment -

If one of the parameters is "launch missile" and defaults to yes... that may be of concern. 

            jglick Jesse Glick made changes -
            Link This issue is duplicated by JENKINS-45454 [ JENKINS-45454 ]
            childnode Marcel 'childNo͡.de' Trautwein made changes -
            Link This issue is duplicated by JENKINS-42922 [ JENKINS-42922 ]
            childnode Marcel 'childNo͡.de' Trautwein made changes -
            Link This issue duplicates JENKINS-40574 [ JENKINS-40574 ]
            abayer Andrew Bayer made changes -
            Link This issue is duplicated by JENKINS-40574 [ JENKINS-40574 ]
            abayer Andrew Bayer made changes -
            Link This issue is duplicated by JENKINS-46594 [ JENKINS-46594 ]
            ssbarnea Sorin Sbarnea added a comment - - edited

            This problem is even worse as it also applies to normal Pipelines (non multi-branch) and also applies to both declarative and scripted pipelines.

This means that after we create the pipeline, we are forced to run the job once before we can see any parameters.

            This is causing other problems because

• a job run with default parameters is likely to fail for various reasons
• the user has no idea what is going to run
• the job duration can be huge (many hours), and we cannot wait for it to finish before we can trigger it with params
• if the job fails or is aborted, the params do not become available

            That's what I would call a critical bug related to pipelines which is forcing us to configure the parameters using the old style (xml) definition, in our case via JJB.

I am wondering if there is a temporary workaround where we can add some pipeline Groovy code that does: if there is no previous build, retrigger itself and return success. If an approach like this worked, it would mean that jobs would quickly self-repair on their first execution.
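A rough sketch of that self-repair idea in scripted Pipeline (untested; the first-build check and the use of the standard build step are assumptions, not something from this thread):

```groovy
// Hypothetical self-repair: on the very first build, re-trigger the job
// (by which time the parameter definitions have been registered) and
// finish this run successfully without doing any real work.
if (currentBuild.number == 1) {
    build job: env.JOB_NAME, wait: false  // queue build #2, which will see the params
    currentBuild.result = 'SUCCESS'
    return                                // skip the rest of the script
}
```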

            michaelneale Michael Neale added a comment -

            Sorin Sbarnea glad you brought it up - this is really annoying yes. 

            I think declarative has the best chance to solve this.. maybe. Would like to talk to Andrew Bayer about this. 

            Keep thinking up any other ideas... 

            abayer Andrew Bayer added a comment -

            So I've thought on this a fair amount and haven't been comfortable enough with any of the Declarative possibilities to pursue one yet...

            mkorejo Murad Korejo added a comment -

How about a special "Refresh Parameters" option for pipeline projects? It would not run the full job, but would "process" the pipeline script by downloading the latest version from source and updating the job config so that new/changed params are reflected.

            michaelneale Michael Neale added a comment -

Murad Korejo the problem with scripted pipeline is that there is no way to evaluate the code to set the properties without running the whole thing, as it is programmatic. We would need a way to identify specific calls that set parameters and hope they are not based on dynamic variables etc. 

            ssbarnea Sorin Sbarnea added a comment -

The only dynamic part I have seen and used so far for parameters was setting the default value based on an environment variable. This is a common practice for allowing a user to specify an override value on a specific run while controlling the default value at the master level.

            When you have 1000+ jobs, you don't want to update all of them because you decided to change the default value of one variable.

Other than this, I don't think they are dynamic, and TBH that would not make sense, because you reach the chicken-and-egg problem: which comes first, the first job execution or the parameter processing?

             

            darwinjs Darwin Sanoy made changes -
            Comment [ Parameters prompting is also skipped when you simply run a declarative pipeline (that contains a parameters block) from the classic UI.  Since everything else about Declarative Pipelines seems to work in Classic UI, I'm thinking this is a bug.

            Perhaps the two are related since "Scan Multibranch Pipeline Now" is only available on Classic UI ? ]
            vonnagy_loggly Ivan von Nagy added a comment -

Any idea on when this might be picked up? It is a major hindrance to our use of pipelines. All PRs will fail an automated test, since each PR results in a new branch, and the pipeline will then fail because the parameters will not be set.

             

            BTW, anyone have a good workaround for now? For example, check for null parameters in pipeline and trigger a complete restart of the pipeline so that the parameters are picked up.

            michaelneale Michael Neale added a comment -

            Ivan von Nagy still have no clue how to implement this, short of changing a fair bit about how pipeline works, given that the parameters are defined as a script. 

Would be interested in talking over other ideas. Perhaps the params could be set up as part of a creation wizard/GUI (before the pipeline exists), but that won't work for existing Jenkinsfiles... 

            ctadeu Carlos Tadeu Panato added a comment - - edited

My workaround: we have a job that triggers our multibranch project. In this trigger we start the job, and after that we check whether that build is the first one; if it is, we kill it and retrigger it with the same parameters.

I did this using a post-build Groovy script.

            vonnagy_loggly Ivan von Nagy added a comment -

            For better or worse, I added this logic to check for a param at the beginning and fail fast while kicking off a new build on the branch.

             

            // this stage checks for null parameters which usually occur when a new branch is discovered. See the following
            // for more details: https://issues.jenkins-ci.org/browse/JENKINS-41929
            stage('Validate parameters') {
              when {
                expression {
                  // Only run this stage if the BUILD_IMAGE is invalid
                  return !(env.BUILD_IMAGE)
                }
              }
              steps {
                withCredentials([string(credentialsId:'jenkins-build', variable:'TOKEN')]) {
                    sh '''
                        set +x
                        RETRY_BRANCH=$(python -c 'import urllib, sys; print urllib.quote(sys.argv[1], sys.argv[2])' "${BRANCH_NAME}" "")
                        curl --user "service@foo.com:$TOKEN" -X POST -H "Content-Type: application/json" "http://localhost:8080/blue/rest/organizations/jenkins/pipelines/My%20Pipeline(s)/branches/${RETRY_BRANCH}/runs/${BUILD_ID}/replay"
                    '''
                }
            
                // Abort the build, skipping subsequent stages
                error("Aborting build since parameters are invalid")
              }
            }
            
            jamesdumay James Dumay made changes -
            Remote Link This issue links to "CloudBees Internal OSS-2277 (Web Link)" [ 18360 ]
            jamesdumay James Dumay made changes -
            Remote Link This issue links to "CloudBees Internal OSS-2103 (Web Link)" [ 18419 ]
            jglick Jesse Glick made changes -
            Link This issue relates to JENKINS-49079 [ JENKINS-49079 ]
            dmitrybond Dmitry Bondarenko added a comment -

In Jenkins 2.107.2 (April 2018) there are some minor problems with parameters.
In particular, parameters sometimes are not recognized from pipeline definitions. I need to start the build a couple of times as-is to make the parameters recognized.
Also, when I added a new "choice" parameter to the existing parameters, it was not recognized at all. So I had to add it in the UI (separately from the pipeline definition). And the pipeline definition was not synced from the UI parameters, so the parameter was defined in the UI for the pipeline but was not added to the pipeline script. Thus, the "pipeline script" and the "pipeline parameters" in the UI editor can be very different.

Another bad thing: it does not provide a way to specify captions for choice values!
For example, I would like to use an INI-file format for such purposes, like this: choice: ['R2010B=Release 2010.2\nR2015=Release 9.0\nR2016=Release 9.1\nR2017=Release 2017.1']. Such a format would be much more useful for the end user: when calling a build script it would use the choice value "R2017", and the "Release 2017.1" text would only be shown in the UI when the user chooses a parameter value for a build.

            kiruahxh Kiruahxh added a comment -

Quoting Murad Korejo:

            How about a special "Refresh Parameters" option for pipeline projects? It won't run the full job but "processes" the pipeline script by downloading latest from source and updates things in the job config such that new/changed params are reflected.

I agree, a button or an option "Refresh parameters" would be OK for scripted pipelines.
The job could take a refreshParameter input variable, and the status of such a run should be discarded.

Another possibility would be to split the Jenkinsfile in two: one file for job properties and parameters, and another for the execution script. E.g. Jenkinsproperties and Jenkinsfile.

            As a workaround, I put a "refresh parameter" option in my jobs.
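One way such a "refresh parameter" option can be approximated (a sketch, not necessarily the poster's exact setup; the parameter name is hypothetical): add a boolean parameter and bail out early when it is set, so a cheap run registers any new parameter definitions:

```groovy
// Hypothetical "refresh only" guard: a run with REFRESH_PARAMETERS=true
// registers the current parameter definitions and then stops without
// doing any real work.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'REFRESH_PARAMETERS', defaultValue: false,
                     description: 'Only reload parameter definitions; do not build')
    }
    stages {
        stage('Refresh only?') {
            steps {
                script {
                    if (params.REFRESH_PARAMETERS) {
                        currentBuild.result = 'NOT_BUILT'
                        error('Parameter refresh run; skipping the real build')
                    }
                }
            }
        }
        // ... real stages follow ...
    }
}
```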

            jtmorton John Morton added a comment -

I would highly suggest implementing this feature. It makes using parameterized Declarative Pipelines a hassle for us and forces us into non-ideal workarounds.

There should just be a button for all pipeline jobs that says "re-pull from SCM". It should pull the Jenkinsfile from SCM and change the job from a default unparameterized build to a parameterized build, without actually running a build.

            i_a_i Daniel Moore added a comment - - edited

I am not familiar with the Jenkinsfile API, but as a workaround for declarative pipelines, would it work to create a decorator param-pipeline method and use it instead of the pipeline method? It could take the same arguments/script blocks/whatever as pipeline, look at the defined parameters, and try to update the job accordingly. If the job already had the right parameter setup, then it would just pass its arguments to the pipeline method. You would still have to run it twice, so not ideal.

            kiruahxh Kiruahxh added a comment -

Also, when using the "properties" step, either to declare parameters or to set other options, the job options are still editable.
It is flexible but very confusing: my colleagues could edit the job's parameters without knowing that they are taken from the Jenkinsfile, then run the job and lose their work.

            jglick Jesse Glick made changes -
            Remote Link This issue links to "CloudBees Internal OSS-2103 (Web Link)" [ 18419 ]
            jglick Jesse Glick made changes -
            Remote Link This issue links to "CloudBees Internal OSS-2277 (Web Link)" [ 18360 ]
            jglick Jesse Glick made changes -
            Remote Link This issue links to "CloudBees-internal CD-559 (Web Link)" [ 20776 ]
            cloudbees CloudBees Inc. made changes -
            Remote Link This issue links to "CloudBees Internal OSS-2103 (Web Link)" [ 20783 ]
            tom_ghyselinck Tom Ghyselinck added a comment -

            Hi all,

More generally, the parameters of the previous run are used.

            We have seen this when playing around with some kind of "dynamic value" for a parameter.
            We have a single (declarative) Pipeline which has a parameter to enable a DEBUG build.
During the continuous CI builds, DEBUG mode is enabled, i.e. the `BUILD_DEBUG` parameter is set to true.
On a nightly build, we set the `BUILD_DEBUG` parameter to false.

            pipeline {
            ...
                parameters {
                    booleanParam(
                        name: 'BUILD_DEBUG',
                        defaultValue: need_debug_build(currentBuild)
                    )
                }
            ...
            }
            

            We now see that:

            • the "nightly build" correctly sets the parameter to false
            • and the "daily builds" correctly set it to true.
            • but the parameter value is only applied in the next build !

            For example:

• The "nightly build" still uses `BUILD_DEBUG=true`
• While the first "daily build" uses `BUILD_DEBUG=false`

I.e. when creating a new branch this applies too
(FYI: we use the Multibranch Pipeline plugin on Subversion repositories):
the first build has no "previous build" and thus no parameters,
which exactly explains what is seen on the first build of a (declarative) pipeline.

            In my opinion the `parameters` must be defined and applied on the current build.

            I hope this information is of any use for you!

            We will look for a workaround for now, but it would be great to see this fixed.
            Thank you in advance for the effort!

            With best regards,
            Tom.

            jglick Jesse Glick added a comment -

            the pipeline will then fail as the parameters will not be set

            No, you just need to use ${params.NAME} rather than ${NAME}. The latter loads a variable defined when the build started. The former will pick up current parameter definitions at the time the expression is used, which may be after properties has (re-)defined the job’s parameters list. For builds triggered by a PR event, this is fine, as there are no parameters coming from the environment—you are getting defaults.
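A small sketch of the distinction (hedged; parameter name is hypothetical):

```groovy
// ${NAME} resolves an environment variable captured when the build
// started; on the very first build of a new branch it may not exist yet.
// ${params.NAME} reads the current parameter definitions, so it falls
// back to the declared default even on that first build.
pipeline {
    agent any
    parameters {
        string(name: 'TARGET', defaultValue: 'staging', description: 'Deploy target')
    }
    stages {
        stage('Show') {
            steps {
                echo "via params: ${params.TARGET}"      // picks up the default on build #1
                sh 'echo "via env: ${TARGET:-<unset>}"'  // may be unset on build #1
            }
        }
    }
}
```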

            asreekumar Adity Sreekumar added a comment - - edited

When was the ${params.Name} syntax introduced? It does not seem to work when I use it in the following context:

parameters {
  string(
    defaultValue: '1.0',
    description: 'Toolchain version',
    name: 'TOOLCHAIN')
}
steps {
  checkout([$class: 'GitSCM', branches: [[name: '${params.TOOLCHAIN}']]])
}

             If I use just TOOLCHAIN, it gives me the error reported earlier where it cannot find the environment variable.

            sverhoef Stefan Verhoeff added a comment - - edited

            Adity Sreekumar I think the issue in your code are the single quotes. Use double quotes instead to make variable interpolation work.

             

[[name: "${params.TOOLCHAIN}"]]
            
            asreekumar Adity Sreekumar added a comment -

            That worked, thanks.

            ip1981 Igor Pashev added a comment - - edited

            There is a similar issue with triggers.

            I do not know how this works, but with declarative syntax the fix could be as simple as parsing the pipeline definition and picking up triggers and parameters.

            After all this works with good old XML configs that are perfectly declarative too.

            ip1981 Igor Pashev added a comment -

At least I'd like it fixed for inline pipelines (not in SCM repositories). Pipelines offer some useful features like parallel steps, multiple SCMs, etc. which are not available in old XML configs. This problem with triggers and parameters is definitely a regression.

            jglick Jesse Glick added a comment -

I'd like it fixed for inline pipelines (not in SCM repositories).

            Just define the trigger, parameters, or other job properties directly on the job, as you would have for freestyle. This works for both inline scripts and scripts from SCM. You only need to use the properties step and thus encounter this issue when you are using multibranch projects (a.k.a. “Pipeline-as-Code”).
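For reference, the properties step variant (which, as noted, you only need for multibranch Pipeline-as-Code) looks roughly like this in scripted Pipeline — a sketch with a hypothetical parameter name:

```groovy
// The properties step (re-)defines the job's parameter list when the
// build runs, which is why the very first run only registers the
// definitions rather than prompting for them.
properties([
    parameters([
        string(name: 'RELEASE_TAG', defaultValue: 'latest',
               description: 'Tag to deploy')
    ])
])

node {
    echo "Deploying ${params.RELEASE_TAG}"
}
```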

            jglick Jesse Glick made changes -
            Labels multibranch
            rodrigc Craig Rodrigues made changes -
            Link This issue is related to JENKINS-52939 [ JENKINS-52939 ]
            ip1981 Igor Pashev added a comment -

            > Just define the trigger, parameters, or other job properties directly on the job, as you would have for freestyle. This works for both inline scripts and scripts from SCM

            Looks like it really works!

             

            ruhkopf Patrick Ruhkopf added a comment - - edited

            There seems to be a more critical issue when using parameterized declarative pipelines with shared libraries in multi-branch projects. When a change is pushed from SCM, the build fails immediately with the following exception:

            java.lang.IllegalArgumentException: Null value not allowed as an environment variable: APPLICATION
             at hudson.EnvVars.put(EnvVars.java:359)
             at hudson.model.StringParameterValue.buildEnvironment(StringParameterValue.java:59)
             at hudson.model.ParametersAction.buildEnvironment(ParametersAction.java:145)
             at hudson.model.Run.getEnvironment(Run.java:2365)
             at org.jenkinsci.plugins.workflow.job.WorkflowRun.getEnvironment(WorkflowRun.java:513)
             at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:106)
             at org.jenkinsci.plugins.workflow.multibranch.SCMBinder.create(SCMBinder.java:120)
             at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:303) 

            When the pipeline is in this state, the only fix is to manually trigger it with "Run with parameters". I can't use any of the workarounds suggested here, because even when I update the Jenkinsfile to not call the shared pipeline and just print a single "echo hello world", it still fails right away. Any suggestions?

            Does this belong here in this issue or should I open a new one?

            abayer Andrew Bayer added a comment -

            Patrick Ruhkopf - open a new ticket and make sure to include your full reproduction case. I assume you don't have a default value set for the parameter in question?

            famod Falko Modler added a comment -

            I am also affected by this:

            • Jenkinsfile from SCM
            • string parameter with defaultValue
            • value is accessed via ${params.[...]}
            • changed defaultValue is only picked up in the second build, not right away
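
            A common mitigation, sketched below with a hypothetical parameter name, is to fall back to the intended default inside the pipeline itself, so a build that runs before the updated defaultValue has been registered still gets a sensible value:

            ```groovy
            pipeline {
                agent any
                parameters {
                    // Hypothetical parameter; a changed defaultValue is only stored on
                    // the job after the first build with the new Jenkinsfile completes.
                    string(name: 'GREETING', defaultValue: 'hello', description: 'Example')
                }
                stages {
                    stage('Demo') {
                        steps {
                            // The Elvis operator guards against a stale or missing value
                            // on the first build after the Jenkinsfile change.
                            echo "value: ${params.GREETING ?: 'hello'}"
                        }
                    }
                }
            }
            ```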
            ifernandezcalvo Ivan Fernandez Calvo added a comment - - edited

            The workaround no longer seems valid on the latest version of Declarative Pipeline: even if you use params to declare the environment variable, the variable is not defined.

            • Jenkins core 2.153
            • Declarative Pipeline 1.3.3

            Steps to replicate the issue:

            • Create a Jenkinsfile like the following in a repo
            • Create a multibranch pipeline that uses this repo and this Jenkinsfile
            • Create a PR
            • Check the pipeline logs and you will see the issue: the variable is not defined in the parallel stages
            #!/usr/bin/env groovy
            
            pipeline {
              agent none
              options {
                timeout(time: 1, unit: 'HOURS')
                buildDiscarder(logRotator(numToKeepStr: '20', artifactNumToKeepStr: '20', daysToKeepStr: '30'))
                timestamps()
                ansiColor('xterm')
                disableResume()
                durabilityHint('PERFORMANCE_OPTIMIZED')
              }
              parameters {
                string(name: 'GO_VERSION', defaultValue: "1.10.3", description: "Go version to use.")
              }
              stages {
                stage('Initializing'){
                  agent { label 'linux && immutable' }
                  options { skipDefaultCheckout() }
                  environment {
                    GO_VERSION = "${params.GO_VERSION}"
                  }
                  stages {
                    stage('It works') {
                      steps {
                        sh "echo '${GO_VERSION}'"
                      }
                    }
                    stage('Test') {
                      failFast true
                      parallel {
                        stage('Fail 01') {
                          steps {
                            sh "echo '${GO_VERSION}'"
                          }
                        }
                        stage('Fail 02') {
                          steps {
                            sh "echo '${GO_VERSION}'"
                          }
                        }
                      }
                    }
                  }
                }
              }
            }
            

            This works

            pipeline {
              agent none
              options {
                timeout(time: 1, unit: 'HOURS')
                buildDiscarder(logRotator(numToKeepStr: '20', artifactNumToKeepStr: '20', daysToKeepStr: '30'))
                timestamps()
                ansiColor('xterm')
                disableResume()
                durabilityHint('PERFORMANCE_OPTIMIZED')
              }
              parameters {
                string(name: 'GO_VERSION', defaultValue: "1.10.3", description: "Go version to use.")
              }
              stages {
                stage('Initializing'){
                  agent { label 'linux && immutable' }
                  options { skipDefaultCheckout() }
                  environment {
                    GO_VERSION = "${params.GO_VERSION}"
                  }
                  stages {
                    stage('It works') {
                      steps {
                        sh "echo '${GO_VERSION}'"
                      }
                    }
                    stage('Test') {
                      failFast true
                      parallel {
                        stage('Fail 01') {
                          environment {
                            GO_VERSION = "${params.GO_VERSION}"
                          }
                          steps {
                            sh "echo '${GO_VERSION}'"
                          }
                        }
                        stage('Fail 02') {
                          environment {
                            GO_VERSION = "${params.GO_VERSION}"
                          }
                          steps {
                            sh "echo '${GO_VERSION}'"
                          }
                        }
                      }
                    }
                  }
                }
              }
            }
            
            sbussetti steve bussetti added a comment -

            Has there been any traction on this? Honestly, if someone could point me at the code responsible for reading the Jenkinsfile from the SCM, I could write a simple plugin that just adds a "Refresh Job Definition" button to the sidebar of a job.

            Most of the folks I see run into this are less concerned with the automatically triggered first build than with being unable to use newly added parameters after updating a job, which forces them to run the Pipeline manually once.
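
            For anyone looking for a starting point: the lightweight-checkout API lives in the scm-api plugin. A rough sketch of reading a Jenkinsfile without a full checkout might look like the following (untested; error handling and permission checks omitted, and whether SCMFileSystem is available depends on the SCM implementation):

            ```java
            import jenkins.scm.api.SCMFileSystem;
            import jenkins.scm.api.SCMHead;
            import jenkins.scm.api.SCMSource;

            public class JenkinsfileReader {
                /**
                 * Reads the Jenkinsfile for one branch via lightweight checkout.
                 * Returns null when the SCM does not support SCMFileSystem.
                 */
                public static String read(SCMSource source, SCMHead head) throws Exception {
                    SCMFileSystem fs = SCMFileSystem.of(source, head);
                    if (fs == null) {
                        return null; // e.g. an SCM without lightweight-checkout support
                    }
                    try {
                        return fs.getRoot().child("Jenkinsfile").contentAsString();
                    } finally {
                        fs.close();
                    }
                }
            }
            ```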

            hkawashi Hideaki Kawashima added a comment - - edited

            Is this bug known to the maintainers? Is anyone already working on countermeasures?

            The ability to specify default values in the parameters block is effectively broken. Various provisional workarounds are listed above, but they fail immediately: as mentioned, re-defining the variable in the environment block does not work for parallel execution, and script compilation fails when the parameters block defines many parameters or the script grows slightly larger. We expect the first build to run with the defaults defined in the parameters block as part of post-commit processing, but instead we have to start the build manually after it fails. (This breaks automated execution of the build pipeline.)

            As you know, a CI environment runs many different builds for post-commit processing. Re-running them one by one manually largely defeats the purpose of a CI environment, which is to save time and effort.

            I'd like to see clear countermeasures or workarounds soon.
            Thank you.

            slonopotamusorama Marat Radchenko made changes -
            Link This issue duplicates JENKINS-40574 [ JENKINS-40574 ]

              People

              • Assignee:
                abayer Andrew Bayer
                Reporter:
                jglick Jesse Glick
              • Votes:
                103 Vote for this issue
                Watchers:
                114 Start watching this issue
