S3 Plugin


Plugin Information

Plugin ID: s3
Latest Release: 0.7 (archives)
Latest Release Date: Dec 19, 2014
Required Core: 1.526
Dependencies: maven-plugin (version: 1.526), copyartifact (version: 1.21)
Source Code: GitHub
Issue Tracking: Open Issues
Pull Requests: Pull Requests
Maintainer(s):
  Doug MacEachern (id: dougm)
  Richard Dallaway (id: d6y)
  Long Ho (id: longlho)
  Michael Watt (id: mikewatt)
  David Beer (id: dmbeer)
Usage Installations

  2014-May: 1662
  2014-Jun: 1676
  2014-Jul: 1797
  2014-Aug: 1902
  2014-Sep: 2040
  2014-Oct: 2170
  2014-Nov: 2200
  2014-Dec: 2224
  2015-Jan: 2377
  2015-Feb: 2486
  2015-Mar: 2639
  2015-Apr: 2712

Upload build artifacts to Amazon S3

Making artifacts public

If you'd like to have all of your artifacts be publicly downloadable, see http://ariejan.net/2010/12/24/public-readable-amazon-s3-bucket-policy/.

Usage with IAM

If you used IAM to create a separate pair of access credentials for this plugin, you can lock down its AWS access to simply listing buckets and writing to a specific bucket. Add the following custom policy to the user in the IAM console, replacing occurrences of "my-artifact-bucket" with your bucket name, which you'll have to create first:

{
  "Statement": [
    {
      "Action": [
        "s3:ListAllMyBuckets"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": ["arn:aws:s3:::my-artifact-bucket", "arn:aws:s3:::my-artifact-bucket/*"]
    }
  ]
}
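
The s3:* grant above gives the credentials full control of the bucket, including deletes. If you would rather grant least privilege, a tighter variant (a sketch; keep only the actions your jobs actually use) looks like:

```json
{
  "Statement": [
    {
      "Action": ["s3:ListAllMyBuckets"],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Action": ["s3:ListBucket"],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::my-artifact-bucket"
    },
    {
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::my-artifact-bucket/*"
    }
  ]
}
```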

Version History

Version 0.5 (Aug 09, 2013)

  • Added Regions Support (issue #18839)
  • Update AWS SDK to latest version

Version 0.4 (Jul 12, 2013)

  • Added storage class support
  • Added arbitrary metadata support
  • Fixed the problem where the plugin messes up credential profiles upon concurrent use (issue #14470)
  • Plugin shouldn't store S3 password in clear (issue #14395)

Version 0.3.1 (Sep 20, 2012)

  • Prevent OOME when uploading large files.
  • Update Amazon SDK

Version 0.3.0 (May 29, 2012)

  • Use AWS MimeType library to determine the Content-Type of the uploaded file.

Labels

plugin-upload
  1. Jan 10, 2012

    Michael Rooney says:

    FYI, it says the required Jenkins core version is 1.434, but I've built it fine with 1.424 and have it running successfully on Jenkins 1.424.1 LTS.

  2. May 01, 2012

    mshields says:

    Is it possible to publish an entire folder recursively?

    1. Dec 09, 2012

      ringerc says:

      You can create an archive with a command line tool like `zip -r` or `tar cvzf` then publish that. Alternately, use the ant wildcard **.

  3. Aug 08, 2012

    milangs says:

    How can I get the build artifacts to go inside the S3 bucket within a subfolder named with the build number or a date time stamp? Currently the artifacts just go into the root of the S3 bucket and overwrite the previous builds.

    Also, I noticed that when you enter the bucket name on the Job Configuration page, the help text says that the bucket will be created if it does not exist. However, this caused a build failure as an exception was raised due to the bucket not existing. I had to manually create this bucket before the build succeeded.

    1. Dec 09, 2012

      ringerc says:

      I'm publishing stamped builds by creating the local file with the desired name before uploading. For example:

      tar cvzf "regress_install-$BUILD_TAG.tar.gz" regress_install

      You can also use the output of the `date` command, other Jenkins-set env vars, etc.

      It'd be nice if the S3 plugin allowed you to specify a destination file name or name prefix, but this works well enough.

  4. Oct 15, 2012

    mako says:

    Hi,

    Lately I've been having a problem with this plugin; it happens when the upload task is started by a timer and the artifact is bigger than 124 MB:

    Microsoft Windows [Version 6.1.7601]
    Copyright (c) 2009 Microsoft Corporation.  All rights reserved.
    
    C:\Users\mako>echo %JAVA_OPTS%
    -Djava.awt.headless=true -Xms300m -Xmx600m -XX:PermSize=256m -XX:MaxPermSize=512m -XX:+DisableExplicitGC
    FATAL: Java heap space
    java.lang.OutOfMemoryError: Java heap space
    	at org.apache.http.util.ByteArrayBuffer.expand(ByteArrayBuffer.java:62)
    	at org.apache.http.util.ByteArrayBuffer.append(ByteArrayBuffer.java:92)
    	at org.apache.http.util.EntityUtils.toByteArray(EntityUtils.java:102)
    	at org.apache.http.entity.BufferedHttpEntity.<init>(BufferedHttpEntity.java:62)
    	at com.amazonaws.http.HttpRequestFactory.newBufferedHttpEntity(HttpRequestFactory.java:246)
    	at com.amazonaws.http.HttpRequestFactory.createHttpRequest(HttpRequestFactory.java:122)
    	at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:224)
    	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:166)
    	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:2198)
    	at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:958)
    	at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:843)
    	at hudson.plugins.s3.S3Profile.upload(S3Profile.java:75)
    	at hudson.plugins.s3.S3BucketPublisher.perform(S3BucketPublisher.java:119)
    	at hudson.tasks.BuildStepMonitor$2.perform(BuildStepMonitor.java:27)
    	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:717)
    	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:692)
    	at hudson.model.Build$BuildExecution.post2(Build.java:183)
    	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:639)
    	at hudson.model.Run.execute(Run.java:1509)
    	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
    	at hudson.model.ResourceController.execute(ResourceController.java:88)
    	at hudson.model.Executor.run(Executor.java:236)
    1. Dec 09, 2012

      ringerc says:

      Looks like the S3 plugin isn't bright enough to stream the artifact progressively; it must be loading the whole lot into RAM at once. Consider patching it to use a read/write loop or stream copying and submit the patch to Jenkins' JIRA.

  5. Dec 09, 2012

    ringerc says:

    Unlike most of the other artifact publishers, the S3 publisher doesn't seem to attach a link to the artifact to the build record for easy retrieval.

    I'm considering implementing that, but I'd like to know if anyone else has done it first, or if there's some reason I'm not aware of that'd make it harder than would be expected. I'm quite new to Jenkins and very new to plugin development in Jenkins, so I'd love some pointers.

    What I'm thinking of doing if it's possible is storing the S3 bucket and object ID in the build record, and generating a signed URL to the object on demand whenever the build page is displayed. If I can't generate the URL on the fly when the build page is displayed I'd instead generate and store a long-expiry signed URL when the artifact is uploaded.
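
For reference, the signed-URL idea can be sketched in Python. This mirrors the legacy S3 Signature-v2 query-string authentication scheme (current SDKs use Signature v4, and the bucket/key names here are made up), but the principle is the same: sign the expiry time and object path with the secret key.

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote

def presign_get(bucket, key, access_key, secret_key, expires_in=3600, now=None):
    """Build a pre-signed GET URL for an S3 object (legacy Signature v2
    query-string auth): sign the expiry and object path with the secret key."""
    expires = int((time.time() if now is None else now) + expires_in)
    string_to_sign = "GET\n\n\n%d\n/%s/%s" % (expires, bucket, key)
    signature = base64.b64encode(
        hmac.new(secret_key.encode(), string_to_sign.encode(),
                 hashlib.sha1).digest()
    ).decode()
    return "https://%s.s3.amazonaws.com/%s?AWSAccessKeyId=%s&Expires=%d&Signature=%s" % (
        bucket, quote(key), access_key, expires, quote(signature, safe=""))
```

Storing the bucket and object ID in the build record and calling something like this on page render would give the on-demand variant; the long-expiry stored URL is the same call with a large expires_in.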

  6. Feb 28, 2013

    jillrochelle says:

    I've installed the S3 Plugin in Hudson to copy war files that will be used for deployments to S3 buckets.

    I've set up 2 different S3 profiles in Hudson, one for production and one for test (2 different AWS accounts).

    My instance of hudson is running on an EC2 instance inside the Test AWS account.

    Inside the build for the project I've indicated to use my production profile.

    The copy from Hudson to S3 will fail due to access denied. Unless I give the bucket permission to the Test AWS account. But then, the object in the bucket does not have the correct permissions for the Production account to get the object out of the bucket to use.

    I thought that Hudson would use the keys provided in the S3 profile for authorization for the copy to the bucket but it doesn't appear that way.

    I know I could just use the Test account keys in the Production environment to get the object, but I was hoping to keep the keys contained to just that single environment and not have to do any cross authorization or usage.

    Any thoughts anyone? 

  7. Mar 15, 2013

    grayaii says:

    So if I understand this correctly, there is no way that this plugin can upload a folder and all its sub-directories to S3 while preserving the directory structure?

    In other words, if I have this:
    $WORKSPACE/foo/bar/index.html

    and I want to copy "foo" and all its sub-directories to S3, so that it looks exactly like it does in my workspace, this plugin can NOT do this?

    Thanks!

    FYI, it looks like the answer is "No", according to this StackOverflow question: http://stackoverflow.com/questions/5407742/how-can-i-publish-static-web-resources-to-amazon-s3-using-hudson-jenkins-and-mav

  8. Sep 11, 2013

    Dave Johnston says:

    Looks like the latest version of the plugin (0.5) requires Jenkins core version 1.526.

    Has anyone built or tested this with the LTS version of Jenkins (1.509.3)?

    Cheers

  9. Mar 28, 2014

    scolestock says:

    There is a problem, I believe, with how this plugin interacts with the Promoted Builds plugin.  I'm not sure which plugin is the cause of this issue.  I reported this on the Google group as well, but here it is:

    I'm finding that if you add a "publish to s3" step to a build promotion process - and you have multiple build promotion processes defined (say, one for Dev, Test, and Production) - you get a very strange interaction.

    I wanted to publish to a different S3 bucket for each of Dev, Test, and Production - and wanted to wire that into the three different build promotion definitions.

    However, upon Save, the configuration got very strange:  Every build promotion process I had defined now had every S3 step from all promotions.

    In other words I had defined:
    Dev Promo
        Publish to s3 dev bucket
    Test Promo
        Publish to s3 test bucket
    Prod Promo
        Publish to s3 prod bucket

    but upon Save it became:

    Dev Promo
        Publish to s3 dev bucket
        Publish to s3 test bucket
        Publish to s3 prod bucket
    Test Promo
        Publish to s3 dev bucket
        Publish to s3 test bucket
        Publish to s3 prod bucket
    Prod Promo
        Publish to s3 dev bucket
        Publish to s3 test bucket
        Publish to s3 prod bucket

    and every subsequent save actually multiplied the S3 configs.

    Happy to provide more information if I can.
    Scott

  10. Dec 03

    neil says:

    I've been using the plugin successfully to upload to an S3 bucket in the US_WEST_1 region.  I tried to use the plugin with another project to upload to a bucket in the US_WEST_2 region and I'm getting the exception copied below.  Wonder if it is related to this issue: https://issues.jenkins-ci.org/browse/JENKINS-18839.  We're using v0.6 of the plugin and v1.562 of Jenkins.  Anyone else have any experience uploading to US_WEST_2?

    Publish artifacts to S3 Bucket Using S3 profile: API
    Publish artifacts to S3 Bucket bucket=deployment-artifacts, file=ROOT.war region=US_WEST_2, upload from slave=false managed=false
    ERROR: Failed to upload files
    java.io.IOException: put Destination [bucketName=deployment-artifacts, objectName=ROOT.war]: com.amazonaws.services.s3.model.AmazonS3Exception: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint. (Service: Amazon S3; Status Code: 301; Error Code: PermanentRedirect; Request ID: 7703E745610E74FA), S3 Extended Request ID: pI1hxDpmS6Hp8H3kErwjiIp6rUEHQS0R01V+URHqXLlkT3k3QmcZJV+I8yYFvzsisDVICVLaA68=
    at hudson.plugins.s3.S3Profile.upload(S3Profile.java:140)
    at hudson.plugins.s3.S3BucketPublisher.perform(S3BucketPublisher.java:174)
    at hudson.tasks.BuildStepMonitor$2.perform(BuildStepMonitor.java:32)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:745)
    at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:709)
    at hudson.model.Build$BuildExecution.post2(Build.java:182)
    at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:658)
    at hudson.model.Run.execute(Run.java:1729)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    at hudson.model.ResourceController.execute(ResourceController.java:88)
    at hudson.model.Executor.run(Executor.java:231)
    Build step 'Publish artifacts to S3 Bucket' changed build result to UNSTABLE

    Thanks,

    Neil
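
    The PermanentRedirect (HTTP 301) above typically means the client sent the request to the global endpoint while the bucket lives in another region; the request has to be addressed to the bucket's regional endpoint. A minimal sketch of that mapping (hostname pattern as it was at the time; region names are examples):

```python
def s3_endpoint(region):
    """Return the S3 endpoint host for a region.
    us-east-1 (the old 'US Standard' region) used the global hostname;
    other regions used the s3-<region> form."""
    if region in (None, "", "us-east-1", "US"):
        return "s3.amazonaws.com"
    return "s3-%s.amazonaws.com" % region
```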

  11. Dec 05

    tle_strut says:

    It seems that the plugin isn't saving the Proxy Host and Port configuration in the 'Amazon S3 profile'.

    Anyone else had this problem or have a workaround in place?

    Thanks!

    1. Feb 10

      deepthip04 says:

      Any luck on proxy host and port configuration wiping out issue?

      Thanks!