  JENKINS-49309

All SSH slaves unexpectedly disconnect when one job finishes


    • Type: Bug
    • Resolution: Cannot Reproduce
    • Priority: Major
    • Component: ssh-slaves-plugin
    • Labels: None

      Using SSH-based slaves, when two or more agents run on the same slave machine and concurrent builds are in progress, all of the running jobs unexpectedly fail with SSH disconnections as soon as one of them finishes.

       Example of output from a job that was still running when another job finished:

      Parameter C_MEMSTYLE bound to: 2 - type: integer 
       Parameter C_OPTIMIZATION bound to: 2 - type: integer 
       Parameter C_MEM_INIT_PREFIX bound to: MainDesign_rs_encoder_0_0 - type: string 
       Parameter C_ELABORATION_DIR bound to: ./ - type: string 
       Parameter C_XDEVICEFAMILY bound to: kintex7 - type: string 
       Parameter C_FAMILY bound to: kintex7 - type: string 
       Connection to 127.0.0.1 closed by remote host.
       [Pipeline] }
       [Pipeline] // script
       [Pipeline] }
       [Pipeline] // withEnv
       [Pipeline] }
       [Pipeline] // stage
       [Pipeline] stage
       [Pipeline] { (Deployment)
       Stage 'Deployment' skipped due to earlier failure(s)
      

      This happens regularly and persistently.
      I've tried creating two slaves with one agent each (both pointing to the same physical machine), but the problem persists.
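
      For reference, below is a minimal Pipeline sketch of the kind of concurrent setup that triggers the disconnects. The node label 'build-host', the 'Build' stage, and the shell script names are placeholders for illustration and are not taken from the real job; only the 'Deployment' stage name comes from the log above.

       pipeline {
           // 'build-host' is a placeholder label carried by both SSH agents
           // that run on the same physical machine.
           agent { label 'build-host' }
           stages {
               stage('Build') {
                   steps {
                       // Placeholder for the ~3 hour build step.
                       sh './run_long_build.sh'
                   }
               }
               stage('Deployment') {
                   steps {
                       sh './deploy.sh'
                   }
               }
           }
       }

      Starting two builds like this at the same time, so that each occupies one of the two agents, and letting one finish first is enough for the other to fail with the "Connection to 127.0.0.1 closed by remote host." message.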

      This is a significant issue for us: builds take around 3 hours on high-powered machines, so running them one after another is not feasible; we need to run them in parallel.

      Jenkins ver. 2.73.3
      SSH Slaves plugin 1.24

      Attached is a screenshot of the basic SSH slave setup.

            Assignee: Ivan Fernandez Calvo (ifernandezcalvo)
            Reporter: Dion Gonano (dgonano)