I believe this extends into a more serious issue, and I haven't found a pattern that works with Jenkins yet. The problem applies to "Multibranch" pipeline jobs, but also to simple "Pipeline" jobs.
Imagine a simple publish job that is shared by many different repositories and called from other jobs. To make the build as reproducible as possible, the calling job must provide the commit id of the generic jenkinsfile to use (e.g. $COMMIT = v2.4). This is then used in the "Clone" properties of the job config (through the UI) as "Branch to checkout: $COMMIT".
This works well: Jenkins receives $COMMIT and uses it to check out "a jenkinsfile from the past" and process it. That is, it works well as long as you don't add or remove properties.
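For illustration, the calling side can be sketched with the standard `build` step; the job name `generic-publish` and the parameter name `COMMIT` are placeholders for whatever your setup actually uses:

```groovy
// Caller's jenkinsfile (sketch): pin the publish job to a specific
// version of its generic jenkinsfile by passing the commit/tag to use.
build job: 'generic-publish',
      parameters: [string(name: 'COMMIT', value: 'v2.4')]
```

On the publish job's side, the "Branch to checkout" field in the UI would then reference `$COMMIT` so the pipeline definition itself is checked out at that version.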
If I release a new v2.5 that adds a property, I can no longer support the v2.4 properties: when other jobs call mine with v2.4 or earlier, the new property will be deleted. This affects both the "Build with Parameters" method AND calls via `build job: job_name` from another jenkinsfile.
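To make the failure mode concrete, here is a hedged sketch of the two versions (parameter names are invented for illustration). Because a pipeline's `properties()` step replaces the job's entire property set on every run, executing the older jenkinsfile wipes any parameter that only the newer one declares:

```groovy
// v2.4 jenkinsfile (hypothetical): declares only the original parameter.
properties([
    parameters([
        string(name: 'COMMIT', defaultValue: 'master',
               description: 'Version of the generic jenkinsfile to use')
    ])
])

// v2.5 jenkinsfile (hypothetical): adds NEW_FLAG. The next time any
// caller requests v2.4, the v2.4 properties() block above re-runs and
// NEW_FLAG disappears from the job's configuration.
properties([
    parameters([
        string(name: 'COMMIT', defaultValue: 'master',
               description: 'Version of the generic jenkinsfile to use'),
        booleanParam(name: 'NEW_FLAG', defaultValue: false,
                     description: 'New in v2.5')
    ])
])
```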
I can think of two workarounds:
- Create a new job with a new name each time the properties change
- Leverage the multibranch pipeline so that it discovers a special branch prefix like `jenkins/` that you use to publish new versions of the jenkinsfile
Neither workaround fixes the "first-time run" problem, but the real issue here is that you can't call builds 100% reproducible if Jenkins has to rely on some artifact from a previous build in order to work. This is probably the root of all evil...?
Is there a more general issue I should subscribe to in order to monitor progress on this chicken-and-egg problem?