Amazon AWS S3 Operations Manual
Install the SDK
The recommended way to use the AWS SDK for Java in your project is to consume it from Maven. Import the BOM and specify the SDK Maven modules that your project needs in the dependencies.
Importing the BOM
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-bom</artifactId>
      <version>1.11.63</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
Using the SDK Maven modules
<dependencies>
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-ec2</artifactId>
  </dependency>
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
  </dependency>
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-dynamodb</artifactId>
  </dependency>
</dependencies>
See the AWS SDK for Java developer guide for more information about installing the SDK through other means.
Features
Provides easy-to-use HTTP clients for all supported AWS services, regions, and authentication protocols.
Client-Side Data Encryption for Amazon S3 - Helps improve the security of storing application data in Amazon S3.
Amazon DynamoDB Object Mapper - Uses Plain Old Java Objects (POJOs) to store and retrieve Amazon DynamoDB data.
Amazon S3 Transfer Manager - With a simple API, achieve enhanced throughput, performance, and reliability by using multi-threaded Amazon S3 multipart calls (see the sketch after this list).
Amazon SQS Client-Side Buffering - Collect and send SQS requests in asynchronous batches, improving application and network performance.
Automatically uses IAM instance profile credentials on configured Amazon EC2 instances.
And more!
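For example, a minimal Transfer Manager sketch (not part of the official sample) might look like the following; the bucket name, key, and file path are placeholders, and credentials are assumed to come from the default ~/.aws/credentials profile.

import java.io.File;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.Upload;

public class TransferManagerSketch {
    public static void main(String[] args) throws InterruptedException {
        // TransferManager splits large uploads into multipart calls and runs
        // them on an internal thread pool.
        TransferManager tm = new TransferManager(new ProfileCredentialsProvider());
        try {
            Upload upload = tm.upload("my-example-bucket", "MyObjectKey",
                    new File("/path/to/large-file.bin"));
            // Block until the multipart upload finishes (or fails).
            upload.waitForCompletion();
        } finally {
            // Release the Transfer Manager's threads when done.
            tm.shutdownNow();
        }
    }
}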
Building From Source
Once you check out the code from GitHub, you can build it using Maven. To disable the GPG-signing in the build, use:
mvn clean install -Dgpg.skip=true
Supported Versions
1.11.x - Recommended.
1.10.x - Approved. Only major critical bugs will be fixed. To get new features, upgrade to the 1.11.x version of the SDK.
/*
* Copyright 2010-2016 Amazon.com, Inc. or its affiliates. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License").
* You may not use this file except in compliance with the License.
* A copy of the License is located at
*
* http://aws.amazon.com/apache2.0
*
* or in the "license" file accompanying this file. This file is distributed
* on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either
* express or implied. See the License for the specific language governing
* permissions and limitations under the License.
*/
import java.io.BufferedReader;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.util.UUID;
import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.Bucket;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ListObjectsRequest;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.S3ObjectSummary;
/**
* This sample demonstrates how to make basic requests to Amazon S3 using the
* AWS SDK for Java.
* <p>
* <b>Prerequisites:</b> You must have a valid Amazon Web Services developer
* account, and be signed up to use Amazon S3. For more information on Amazon
* S3, see http://aws.amazon.com/s3.
* <p>
* Fill in your AWS access credentials in the provided credentials file
* template, and be sure to move the file to the default location
* (~/.aws/credentials) where the sample code will load the credentials from.
* <p>
* <b>WARNING:</b> To avoid accidental leakage of your credentials, DO NOT keep
* the credentials file in your source directory.
*
* http://aws.amazon.com/security-credentials
*/
public class S3Sample {
public static void main(String[] args) throws IOException {
/*
* The ProfileCredentialsProvider will return your [default]
* credential profile by reading from the credentials file located at
* (~/.aws/credentials).
*/
AWSCredentials credentials = null;
try {
credentials = new ProfileCredentialsProvider().getCredentials();
} catch (Exception e) {
throw new AmazonClientException(
"Cannot load the credentials from the credential profiles file. " +
"Please make sure that your credentials file is at the correct " +
"location (~/.aws/credentials), and is in valid format.",
e);
}
AmazonS3 s3 = new AmazonS3Client(credentials);
Region usWest2 = Region.getRegion(Regions.US_WEST_2);
s3.setRegion(usWest2);
String bucketName = "my-first-s3-bucket-" + UUID.randomUUID();
String key = "MyObjectKey";//key可以以⽬录的形式出现a/b,则会在a⽬录下创建b⽂件
System.out.println("===========================================");
System.out.println("Getting Started with Amazon S3");
System.out.println("===========================================\n");
try {
/*
* Create a new S3 bucket - Amazon S3 bucket names are globally unique,
* so once a bucket name has been taken by any user, you can't create
* another bucket with that same name.
*
* You can optionally specify a location for your bucket if you want to
* keep your data closer to your applications or users.
*/
System.out.println("Creating bucket " + bucketName + "\n");
/*
* List the buckets in your account
*/
System.out.println("Listing buckets");
for (Bucket bucket : s3.listBuckets()) {
System.out.println(" - " + Name());
}
System.out.println();
/*
* Upload an object to your bucket - You can easily upload a file to
* S3, or upload directly an InputStream if you know the length of
* the data in the stream. You can also specify your own metadata
* when uploading to S3, which allows you set a variety of options
* like content-type and content-encoding, plus additional metadata
* specific to your applications.
*/
System.out.println("Uploading a new object to S3 from a file\n");
s3.putObject(new PutObjectRequest(bucketName, key, createSampleFile()));
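/*
 * Illustration added to this write-up (not part of the official sample): you
 * can also upload directly from an InputStream if you supply the content
 * length and any metadata yourself. Fully qualified class names are used to
 * avoid extra imports, and the extra object is deleted immediately so the
 * bucket cleanup at the end of the sample still succeeds.
 */
byte[] helloBytes = "Hello from an InputStream\n".getBytes("UTF-8");
com.amazonaws.services.s3.model.ObjectMetadata streamMetadata =
new com.amazonaws.services.s3.model.ObjectMetadata();
streamMetadata.setContentType("text/plain");
streamMetadata.setContentLength(helloBytes.length);
s3.putObject(new PutObjectRequest(bucketName, key + "-from-stream",
new java.io.ByteArrayInputStream(helloBytes), streamMetadata));
s3.deleteObject(bucketName, key + "-from-stream");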
/*
* Download an object - When you download an object, you get all of
* the object's metadata and a stream from which to read the contents.
* It's important to read the contents of the stream as quickly as
* possible since the data is streamed directly from Amazon S3 and your
* network connection will remain open until you read all the data or
* close the input stream.
*
* GetObjectRequest also supports several other options, including
* conditional downloading of objects based on modification times,
* ETags, and selectively downloading a range of an object.
*/
System.out.println("Downloading an object");
S3Object object = s3.getObject(new GetObjectRequest(bucketName, key));
System.out.println("Content-Type: " + ObjectMetadata().getContentType()); ObjectContent());
/*
* List objects in your bucket by prefix - There are many options for
* listing the objects in your bucket. Keep in mind that buckets with
* many objects might truncate their results when listing their objects,
* so be sure to check if the returned object listing is truncated, and
* use the AmazonS3.listNextBatchOfObjects(...) operation to retrieve
* additional results.
*/
System.out.println("Listing objects");
ObjectListing objectListing = s3.listObjects(new ListObjectsRequest()
.withBucketName(bucketName)
.withPrefix("My"));
for (S3ObjectSummary objectSummary : objectListing.getObjectSummaries()) {
System.out.println(" - " + objectSummary.getKey() + " " +
"(size = " + objectSummary.getSize() + ")");
}
System.out.println();
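/*
 * Illustration added to this write-up (not part of the official sample): if
 * the listing was truncated, page through the remaining results with
 * listNextBatchOfObjects(...) as the comment above suggests.
 */
while (objectListing.isTruncated()) {
objectListing = s3.listNextBatchOfObjects(objectListing);
for (S3ObjectSummary objectSummary : objectListing.getObjectSummaries()) {
System.out.println(" - " + objectSummary.getKey() + " " +
"(size = " + objectSummary.getSize() + ")");
}
}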
/*
* Delete an object - Unless versioning has been turned on for your bucket,
* there is no way to undelete an object, so use caution when deleting objects.
*/
System.out.println("Deleting an object\n");
s3.deleteObject(bucketName, key);
/*
* Delete a bucket - A bucket must be completely empty before it can be
* deleted, so remember to delete any objects from your buckets before
* you try to delete them.
*/
System.out.println("Deleting bucket " + bucketName + "\n");
s3.deleteBucket(bucketName); // A non-empty bucket cannot be deleted.
} catch (AmazonServiceException ase) {
System.out.println("Caught an AmazonServiceException, which means your request made it " + "to Amazon S3, but was rejected with an error response for some reason.");
System.out.println("Error Message: " + Message());
System.out.println("HTTP Status Code: " + StatusCode());
System.out.println("AWS Error Code: " + ErrorCode());
System.out.println("Error Type: " + ErrorType());
System.out.println("Request ID: " + RequestId());
} catch (AmazonClientException ace) {
System.out.println("Caught an AmazonClientException, which means the client encountered " + "a serious internal problem while trying to communicate with S3, "
+ "such as not being able to access the network.");
System.out.println("Error Message: " + Message());
}
}
/**
* Creates a temporary file with text data to demonstrate uploading a file
* to Amazon S3
*
* @return A newly created temporary file with text data.
*
* @throws IOException
*/
private static File createSampleFile() throws IOException {
File file = File.createTempFile("aws-java-sdk-", ".txt");
file.deleteOnExit();
Writer writer = new OutputStreamWriter(new FileOutputStream(file));
writer.write("abcdefghijklmnopqrstuvwxyz\n");
writer.write("01234567890112345678901234\n");
writer.write("!@#$%^&*()-=[]{};':',.<>/?\n");
writer.write("01234567890112345678901234\n");
writer.write("abcdefghijklmnopqrstuvwxyz\n");
writer.close();
return file;
}
/**
* Displays the contents of the specified input stream as text.
*
* @param input
* The input stream to display as text.
*
* @throws IOException
*/
private static void displayTextInputStream(InputStream input) throws IOException {
BufferedReader reader = new BufferedReader(new InputStreamReader(input));
while (true) {
String line = reader.readLine();
if (line == null) break;
System.out.println(" " + line);
}
System.out.println();
}
}