Monday, November 7, 2016

AWS Misc



CDN - CloudFront

S3 - object storage
Glacier - archival storage for infrequently accessed data

Amazon SimpleDB
- smaller datasets, NoSQL, key-value

Amazon RDS
- relational database; supports engines such as MySQL

DynamoDB


http://docs.aws.amazon.com/elasticloadbalancing/latest/userguide/how-elastic-load-balancing-works.html#healthcheck
A load balancer accepts incoming traffic from clients and routes requests to its registered EC2 instances in one or more Availability Zones. The load balancer also monitors the health of its registered instances and ensures that it routes traffic only to healthy instances. When the load balancer detects an unhealthy instance, it stops routing traffic to that instance, and then resumes routing traffic to that instance when it detects that the instance is healthy again.
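The routing behavior described above (send traffic only to healthy instances, resume when an instance recovers) can be sketched as a toy model in plain Java. This is not the actual ELB implementation; the instance IDs and the single-target routing are invented for illustration:

```java
import java.util.*;

public class ToyLoadBalancer {
    private final Map<String, Boolean> healthy = new HashMap<>();

    void register(String instanceId) { healthy.put(instanceId, true); }

    // The real ELB probes each instance periodically; here the result is injected.
    void reportHealthCheck(String instanceId, boolean passed) {
        healthy.put(instanceId, passed); // unhealthy instances stop receiving traffic
    }

    // Route only to instances currently marked healthy.
    String route() {
        List<String> candidates = new ArrayList<>();
        for (Map.Entry<String, Boolean> e : healthy.entrySet())
            if (e.getValue()) candidates.add(e.getKey());
        if (candidates.isEmpty()) throw new IllegalStateException("no healthy instances");
        Collections.sort(candidates); // deterministic pick for the demo
        return candidates.get(0);     // real ELB distributes across all healthy instances
    }

    public static void main(String[] args) {
        ToyLoadBalancer lb = new ToyLoadBalancer();
        lb.register("i-aaa");
        lb.register("i-bbb");
        lb.reportHealthCheck("i-aaa", false); // i-aaa fails its health check
        System.out.println(lb.route());       // traffic goes to i-bbb only
        lb.reportHealthCheck("i-aaa", true);  // i-aaa recovers
        System.out.println(lb.route());       // i-aaa is routable again
    }
}
```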
https://superuser.com/questions/338296/how-to-use-yum-to-reinstall-all-dependencies-of-a-given-package
rpm -qa | xargs yum reinstall
yum reinstall $(yum list installed | awk '{print $1}')
yum list installed
Had to install gcc and mtools as root.
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/compile-software.html
Because software compilation is not a task that every Amazon EC2 instance requires, these tools are not installed by default, but they are available in a package group called "Development Tools" that is easily added to an instance with the yum groupinstall command.
yum groupinstall "Development Tools"

sudo yum clean all
sudo yum update
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/install-software.html
You can also use yum install to install RPM package files that you have downloaded from the Internet.

yum install my-package.rpm
http://cloudacademy.com/blog/aws-cli-a-beginners-guide/
aws s3 ls
~/.aws/config
aws iam list-users --output table
List all your EC2 tags:
aws ec2 describe-tags --output table
Play around with outputs, help, or whatever
aws ec2 describe-spot-price-history help
aws ec2 describe-instances

$ aws help
The following command lists the available subcommands for Amazon EC2.
$ aws ec2 help
The next example lists the detailed help for the EC2 DescribeInstances operation, including descriptions of its input parameters, filters, and output. Check the examples section of the help if you are not sure how to phrase a command.
$ aws ec2 describe-instances help
aws s3 ls s3://mybucket
aws s3 ls s3://mybucket --recursive
aws s3 ls s3://mybucket --recursive --human-readable --summarize
aws s3 ls help
http://docs.aws.amazon.com/cli/latest/userguide/using-s3-commands.html
The sync command synchronizes the contents of a bucket and a directory, or two buckets.

http://stackoverflow.com/questions/31942341/selective-file-download-in-aws-s3-cli
This command will copy all files starting with 2015-08-15:
aws s3 cp s3://BUCKET/ folder --exclude "*" --include "2015-08-15*" --recursive
If your goal is to synchronize a set of files without copying them twice, use the sync command:
aws s3 sync s3://BUCKET/ folder
That will copy all files that have been added or modified since the previous sync.
In fact, this is the equivalent of the above cp command:
aws s3 sync s3://BUCKET/ folder --exclude "*" --include "2015-08-15*"
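The sync rule ("copy only files that were added or modified since the previous sync") can be sketched as a pure comparison of two listings. This is a toy model, not the real `aws s3 sync` (which compares object size and last-modified timestamps, among other things); the file names and timestamps are made up:

```java
import java.util.*;

public class ToySync {
    // fileName -> lastModified (epoch millis)
    static List<String> filesToCopy(Map<String, Long> source, Map<String, Long> dest) {
        List<String> toCopy = new ArrayList<>();
        for (Map.Entry<String, Long> e : source.entrySet()) {
            Long destTime = dest.get(e.getKey());
            // copy if the file is new, or newer than the destination's copy
            if (destTime == null || e.getValue() > destTime) toCopy.add(e.getKey());
        }
        Collections.sort(toCopy);
        return toCopy;
    }

    public static void main(String[] args) {
        Map<String, Long> bucket = new HashMap<>();
        bucket.put("a.txt", 100L);
        bucket.put("b.txt", 200L);
        Map<String, Long> local = new HashMap<>();
        local.put("a.txt", 100L); // unchanged -> skipped
        local.put("b.txt", 150L); // stale -> copied
        System.out.println(filesToCopy(bucket, local)); // [b.txt]
    }
}
```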

https://aws.amazon.com/blogs/developer/amazon-s3-transfermanager/
TransferManager provides asynchronous management for uploads and downloads between your application and Amazon S3. You can easily check on the status of your transfers, add handlers to run code when a transfer completes, cancel transfers, and more.
TransferManager tx = new TransferManager(credentials);

// The upload and download methods return immediately, while
// TransferManager processes the transfer in the background thread pool
Upload upload = tx.upload(bucketName, myFile.getName(), myFile);

Depending on the size and data source for your upload, TransferManager adjusts the algorithm it uses to process your transfer, in order to get the best performance and reliability. Whenever possible, uploads are broken up into multiple pieces, so that several pieces can be sent in parallel to provide better throughput. In addition to higher throughput, this approach also enables more robust transfers, since an I/O error in any individual piece means the SDK only needs to retransmit the one affected piece, and not the entire transfer.
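The per-piece retry idea can be illustrated with a toy splitter in plain Java. This is not the SDK's multipart-upload code; the part size and payload are invented, and the point is only that a failed part can be retransmitted without resending the whole object:

```java
import java.util.*;

public class ToyMultipart {
    // Split a payload into fixed-size parts so each part can be
    // sent in parallel and retried independently on I/O error.
    static List<byte[]> split(byte[] data, int partSize) {
        List<byte[]> parts = new ArrayList<>();
        for (int off = 0; off < data.length; off += partSize) {
            int len = Math.min(partSize, data.length - off);
            parts.add(Arrays.copyOfRange(data, off, off + len));
        }
        return parts;
    }

    public static void main(String[] args) {
        byte[] payload = new byte[10];
        List<byte[]> parts = split(payload, 4); // parts of 4, 4, and 2 bytes
        System.out.println(parts.size());       // 3
        // If sending parts.get(1) fails with an I/O error, only that
        // 4-byte part is retransmitted, not the entire 10-byte payload.
    }
}
```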

TransferManager includes several more advanced features, such as recursively downloading entire sections of S3 buckets, or the ability to clean up pieces of failed multipart uploads. One of the more commonly used options is the ability to attach a progress listener to your uploads and downloads, which can run custom code at different points in the transfer’s lifecycle.

// You can set a progress listener directly on a transfer, or you can pass one into
// the upload object to have it attached to the transfer as soon as it starts
upload.setProgressListener(new ProgressListener() {
    // This method is called periodically as your transfer progresses
    public void progressChanged(ProgressEvent progressEvent) {
        System.out.println(upload.getProgress().getPercentTransferred() + "%");
        if (progressEvent.getEventCode() == ProgressEvent.COMPLETED_EVENT_CODE) {
            System.out.println("Upload complete!!!");
        }
    }
});

upload.waitForCompletion();

http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/transfer/TransferManager.html
TransferManager makes extensive use of Amazon S3 multipart uploads to achieve enhanced throughput, performance, and reliability. When possible, it uses multiple threads to upload several parts of a single upload at once.

TransferManager is responsible for managing resources such as connections and threads; share a single instance of TransferManager whenever possible. TransferManager, like all the client classes in the AWS SDK for Java, is thread safe. Call TransferManager.shutdownNow() to release the resources once the transfer is complete.

Transfers can be paused and resumed at a later time. They can also survive a JVM crash, provided the information required to resume the transfer is given as input to the resume operation.
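Surviving a JVM crash amounts to persisting the transfer state outside the JVM and feeding it back in on restart. A toy version of that idea follows; the `Checkpoint` type and byte-budget mechanics are invented for illustration and are not the SDK's actual pause/resume API:

```java
public class ToyResumableTransfer {
    // Minimal state needed to resume: the object key and bytes already done.
    static class Checkpoint {
        final String key;
        final long bytesDone;
        Checkpoint(String key, long bytesDone) { this.key = key; this.bytesDone = bytesDone; }
    }

    // Transfers up to `budget` bytes per run; returns a checkpoint if unfinished,
    // or null when the transfer is complete.
    static Checkpoint run(String key, long totalBytes, long budget, Checkpoint prev) {
        long done = (prev == null) ? 0 : prev.bytesDone;
        long end = Math.min(totalBytes, done + budget);
        // ... bytes [done, end) would be uploaded here ...
        return (end < totalBytes) ? new Checkpoint(key, end) : null;
    }

    public static void main(String[] args) {
        // First run stops after 60 of 100 bytes (simulating a JVM crash);
        // the caller would persist the checkpoint to disk.
        Checkpoint cp = run("backup.tar", 100, 60, null);
        // After restart, pass the saved checkpoint back in to resume.
        Checkpoint remaining = run("backup.tar", 100, 60, cp);
        System.out.println(remaining == null ? "complete" : "still at " + remaining.bytesDone);
    }
}
```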

Summary of the Amazon S3 Service Disruption in the Northern Virginia (US-EAST-1) Region
https://aws.amazon.com/cn/message/41926/
http://coolshell.cn/articles/17737.html


