JENKINS-14473: Jenkins server memory leak

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Component/s: core
    • Environment:
      Ubuntu 7.10 running on 32bit dual core. Jenkins with Winstone. Java version "1.6.0_03", Java(TM) SE Runtime Environment (build 1.6.0_03-b05), Java HotSpot(TM) Server VM (build 1.6.0_03-b05, mixed mode)

      Description

      I am not sure I have filed this under the right component.

      We are seeing a steady, gradual increase in memory usage for our Jenkins instance (see attached graphs). It does not result in an OutOfMemoryError, but usually ends with Jenkins using 100% of CPU (one core in our case).

      Attached is a memory histogram report, and the link (see URL above) points to a heap dump generated using the Monitoring Plugin.

      This happens consistently, and our workaround for now is to restart Jenkins each time it gets too bloated. If this doesn't get fixed, we will have to stop using Winstone and switch to Jetty, because there seems to be a connection between the servlet container (Winstone/Jetty) and the leak; we are fairly sure we did not see this when using Jetty.
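
      One way to corroborate the gradual growth described above is to log heap usage over time from inside the Jenkins JVM. The sketch below is a minimal, hypothetical monitor (the class name and the one-minute interval are illustrative, and it is not part of the Monitoring Plugin); it would have to run inside the Jenkins process itself, or the same figures can be read remotely over JMX.

      {code:java}
      import java.lang.management.ManagementFactory;
      import java.lang.management.MemoryMXBean;
      import java.lang.management.MemoryUsage;

      // Hypothetical helper: prints heap usage of the JVM it runs in once a minute.
      // A leak shows up as a "used" figure that keeps rising even across full GCs.
      public class HeapWatcher {
          public static void main(String[] args) throws InterruptedException {
              MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
              while (true) {
                  MemoryUsage heap = memory.getHeapMemoryUsage();
                  System.out.printf("heap used=%d MB, committed=%d MB, max=%d MB%n",
                          heap.getUsed() / (1024 * 1024),
                          heap.getCommitted() / (1024 * 1024),
                          heap.getMax() / (1024 * 1024));
                  Thread.sleep(60000L); // sample once a minute
              }
          }
      }
      {code}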

        Attachments

          Activity

          akarollil Anoop Karollil created issue -
          akarollil Anoop Karollil made changes -
          Field: Description
          Original Value: (the description above, with the sentence "Attached is a memory histogram report and heap dump generated using the Monitoring Plugin.")
          New Value: (the current description, which instead references the heap dump via the URL above)
          akarollil Anoop Karollil added a comment -

          Did some analysis of the attached heap dump using Eclipse's Memory Analyzer Tool. Report zip files and a PDF are attached.
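
          A heap dump like the one analyzed here can also be captured without the Monitoring Plugin via the HotSpot diagnostic MXBean. The sketch below is a minimal example and assumes it runs inside the target (Jenkins) JVM; the output path is purely illustrative:

          {code:java}
          import com.sun.management.HotSpotDiagnosticMXBean;
          import java.lang.management.ManagementFactory;

          // Minimal sketch: write an .hprof heap dump of the current JVM that
          // Eclipse MAT can open. The output path below is just an example.
          public class HeapDumper {
              public static void main(String[] args) throws Exception {
                  HotSpotDiagnosticMXBean diag = ManagementFactory.newPlatformMXBeanProxy(
                          ManagementFactory.getPlatformMBeanServer(),
                          "com.sun.management:type=HotSpotDiagnostic",
                          HotSpotDiagnosticMXBean.class);
                  diag.dumpHeap("/tmp/jenkins-heap.hprof", true); // true = live objects only
              }
          }
          {code}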

          akarollil Anoop Karollil made changes -
          Attachment heapdump_Leak_Suspects.zip [ 22229 ]
          Attachment heapdump_Top_Components.zip [ 22230 ]
          Attachment heapdump_Top_Consumers.zip [ 22231 ]
          Attachment heapdump_top_consumers_eclipse_memory_analyser.pdf [ 22232 ]
          akarollil Anoop Karollil made changes -
          Field: Environment
          Original Value: Ubuntu 7.10 running on 32bit dual core. Jenkins with Winstone.
          New Value: Ubuntu 7.10 running on 32bit dual core. Jenkins with Winstone. Java version "1.6.0_03", Java(TM) SE Runtime Environment (build 1.6.0_03-b05), Java HotSpot(TM) Server VM (build 1.6.0_03-b05, mixed mode)
          liya Liya Katz added a comment -

          Upgrading to 1.492 (from 1.473) has solved this problem for us.

          akarollil Anoop Karollil added a comment -

          Thanks Liya. I did see it a few weeks ago (memory usage crept up to 5 GB of RAM), but things have been okay for the past week with 1.491; I upgraded to 1.492 anyway. Hopefully this has been resolved - I will close this bug once I am sure.

          evernat evernat added a comment -

          @Anoop
          Any news?
          Has it been reproduced recently?

          akarollil Anoop Karollil added a comment -

          No, I think this has been fixed. I do not see it in v1.538.

          akarollil Anoop Karollil made changes -
          Field: Status
          Original Value: Open [ 1 ]
          New Value: Resolved [ 5 ]
          Field: Resolution
          New Value: Fixed [ 1 ]
          rtyler R. Tyler Croy made changes -
          Field: Workflow
          Original Value: JNJira [ 145109 ]
          New Value: JNJira + In-Review [ 191358 ]

            People

            • Assignee: Unassigned
            • Reporter: akarollil Anoop Karollil
            • Votes: 2
            • Watchers: 8
