Bug
Resolution: Unresolved
Minor
None
Hello Everyone,
Due to the size of my pipeline scripts (more than 700 lines), it is not easy to update/edit the pipeline inside the default pipeline editor (a very small text area). So I uploaded the scripts to Bitbucket and reconfigured the pipeline to use "Pipeline script from SCM". However, I encountered two issues which are blockers for me at the moment:
1) I realized that for each stage, Jenkins tries to check out from Bitbucket, which slows down the entire pipeline (each pipeline has more than 10 stages). I would not call this a major problem or a blocker, but it can at least be considered a performance issue.
2) A few stages (sections) in my pipeline must run only on a remote node (slave node) to complete the code installation/release (let's say `yum install MyPackage`). The issue is that these remote nodes do NOT have access to Bitbucket (by design they should NOT have access to the internet, or vice versa), so Jenkins cannot check out the script there and the stage eventually fails.
I can't seem to find any option to configure Jenkins so that it fetches/checks out the pipeline script only on the master. Any ideas on how to resolve this?
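For context, here is a minimal sketch of the structure I am aiming for, assuming Declarative Pipeline syntax. The `skipDefaultCheckout()` option disables the implicit SCM checkout that otherwise runs on every agent a stage is assigned to, and `checkout scm` is then called explicitly only where Bitbucket is reachable. The agent labels and shell commands below are placeholders, not my actual configuration:

```groovy
pipeline {
    agent none
    options {
        // Disable the implicit "checkout scm" that Jenkins otherwise
        // performs on each agent used by the pipeline.
        skipDefaultCheckout()
    }
    stages {
        stage('Build') {
            agent { label 'master' }   // placeholder label
            steps {
                // Explicit checkout only here, where Bitbucket is reachable.
                checkout scm
                sh './build.sh'        // placeholder build step
            }
        }
        stage('Install') {
            agent { label 'remote' }   // placeholder label for the isolated node
            steps {
                // No checkout on the isolated node; it only runs the install.
                sh 'yum install -y MyPackage'
            }
        }
    }
}
```

If something like this is the intended way to keep the checkout off the remote nodes, that would solve both issues at once.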