Jenkins / JENKINS-23706

Plugin Memory Leak : OutOfMemoryError : PermGen space

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Component/s: disk-usage-plugin
    • Environment:
      -Xmx768m -XX:+CMSClassUnloadingEnabled -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/jenkins/memory.dump -XX:+UseCompressedOops -Djava.awt.headless=true

      Description

      I have a situation where I am running out of permgen space on a regular basis. I can literally watch the permgen use grow until the server falls over. I have seen numerous other tickets that point toward this actually being a plugin issue, and that the only way to troubleshoot it is with a memory dump. I am of course able to increase the amount of PermGen, but all this does is delay the inevitable. No matter what level I set permgen to, it eventually falls over.

      I have attached a memory dump.

      WARNING: Disk usage plugin fails during build calculation disk space of job **-*********-config-test
      java.io.IOException: remote file operation failed: /var/lib/jenkins/workspace/**-*********-config-test at hudson.remoting.Channel@236ff568:WSCPSLRDREAPP15
      at hudson.FilePath.act(FilePath.java:916)
      at hudson.FilePath.act(FilePath.java:893)
      at hudson.FilePath.exists(FilePath.java:1325)
      at hudson.plugins.disk_usage.DiskUsageUtil.calculateWorkspaceDiskUsageForPath(DiskUsageUtil.java:292)
      at hudson.plugins.disk_usage.DiskUsageBuildListener.onCompleted(DiskUsageBuildListener.java:60)
      at hudson.plugins.disk_usage.DiskUsageBuildListener.onCompleted(DiskUsageBuildListener.java:23)
      at hudson.model.listeners.RunListener.fireCompleted(RunListener.java:199)
      at hudson.model.Run.execute(Run.java:1783)
      at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:529)
      at hudson.model.ResourceController.execute(ResourceController.java:88)
      at hudson.model.Executor.run(Executor.java:234)
      Caused by: java.io.IOException: Remote call on WSCPSLRDREAPP15 failed
      at hudson.remoting.Channel.call(Channel.java:748)
      at hudson.FilePath.act(FilePath.java:909)
      ... 10 more
      Caused by: java.lang.OutOfMemoryError: PermGen space
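
      For anyone hitting the same leak, one way to see which classes and class loaders are filling PermGen is to inspect the running JVM with the standard JDK tools (an editor's sketch, assuming an Oracle/Sun JDK 6 or 7 with `jps`/`jmap` on the PATH; the PID placeholder and output path are hypothetical, not from this report):

      ```shell
      # Find the Jenkins JVM's PID.
      jps -l

      # Class-loader statistics for the permanent generation: a plugin that
      # leaks PermGen typically shows many "dead" class loaders still
      # holding loaded classes.
      jmap -permstat <jenkins-pid>

      # Capture a binary heap dump for offline analysis (e.g. in Eclipse MAT),
      # equivalent to what -XX:+HeapDumpOnOutOfMemoryError writes on OOM.
      jmap -dump:live,format=b,file=/tmp/jenkins-heap.hprof <jenkins-pid>
      ```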

        Attachments

          Issue Links

            Activity

            dave_hoffman David Hoffman added a comment -

            The memory dump I tried to attach was too large. It can be downloaded from https://dl.dropboxusercontent.com/u/8866678/memory.dump.bz2
            greg Greg horvath added a comment -

            Are you using the TAP plugin? That thing is a beast; we just figured out that it was the cause of multiple Jenkins instances grinding to a halt over the past few days.
            danielbeck Daniel Beck added a comment -

            To clarify, you did set e.g. -XX:MaxPermSize=256M but it didn't help for long?

            FWIW I'd get rid of Global Build Stats. (Won't help for permgen, but still...)
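
            The suggestion above can be sketched as a service configuration (an editor's sketch, assuming a Debian-style install where /etc/default/jenkins sets JAVA_ARGS; RHEL-style installs set JENKINS_JAVA_OPTIONS in /etc/sysconfig/jenkins instead):

            ```shell
            # /etc/default/jenkins -- hypothetical excerpt, not the reporter's actual config.
            # Adds an explicit PermGen cap alongside the flags quoted in the Environment field.
            # Note: -XX:+CMSClassUnloadingEnabled only takes effect when CMS is the active
            # collector, so -XX:+UseConcMarkSweepGC is included here as well.
            JAVA_ARGS="-Xmx768m -XX:MaxPermSize=256m \
              -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled \
              -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/jenkins/memory.dump \
              -XX:+UseCompressedOops -Djava.awt.headless=true"
            ```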
            lvotypkova Lucie Votypkova added a comment -

            I was not able to determine the cause of the OOM, so I cannot rule out disk-usage, but I do not think it is the root cause. If common usage of disk-usage caused this error, I think more people would have the same problem. I cannot identify the root cause from the dump, and your log only proves that disk-usage took the last byte, not who took the most. Please, do you still have this problem? Which version of Jenkins were you using?
            danielbeck Daniel Beck added a comment -

            Lucie Votypkova This issue is almost a year old with no responses from the reporter; I think this can safely be resolved as Incomplete. It doesn't show an issue in Disk Usage Plugin after all.

              People

              • Assignee: Unassigned
              • Reporter: dave_hoffman David Hoffman
              • Votes: 0
              • Watchers: 4
