Goal

Export data from an RDS (Postgres) instance inside a VPC private network directly to S3.

How

Log in to the target RDS instance and run the following SQL; it exports the data from RDS directly to S3. Handy.

  select * from aws_s3.query_export_to_s3(
    'select * from public.user limit 10',
    aws_commons.create_s3_uri(
      'test',            -- bucket name
      'rds_export_test', -- object name
      'ap-northeast-1'   -- region name
    )
  );

However, just typing the command is not enough. To make it work, the following (mildly tedious) preparation is required: configure the network settings so that RDS can reach S3,
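The same export can be triggered from application code instead of a psql session. A minimal sketch, assuming the aws_s3 extension and IAM role are already set up on the instance; the connection parameters are placeholders, not values from the post.

```python
# Sketch: run the aws_s3 export from Python instead of a psql session.
# Assumptions: the aws_s3 extension is already installed on the RDS
# instance, and the host/database/credentials below are placeholders.

def build_export_sql(query, bucket, key, region):
    """Build the aws_s3.query_export_to_s3 call for a given inner query.

    Uses dollar-quoting so the inner query may contain single quotes.
    """
    return (
        "select * from aws_s3.query_export_to_s3(\n"
        f"  $q${query}$q$,\n"
        f"  aws_commons.create_s3_uri('{bucket}', '{key}', '{region}')\n"
        ");"
    )

if __name__ == "__main__":
    import psycopg2  # third-party driver: pip install psycopg2-binary

    sql = build_export_sql(
        "select * from public.user limit 10",
        "test", "rds_export_test", "ap-northeast-1",
    )
    # Placeholder connection parameters -- replace with your own.
    with psycopg2.connect(host="my-rds-host", dbname="mydb",
                          user="admin", password="...") as conn:
        with conn.cursor() as cur:
            cur.execute(sql)
            # Returns (rows_uploaded, files_uploaded, bytes_uploaded).
            print(cur.fetchall())
```

Dollar-quoting the inner query avoids the doubled-single-quote escaping that nesting SQL strings otherwise requires.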
Last Thursday’s Amazon EC2 outage was the worst in cloud computing’s history. It made the front page of many news pages, including the New York Times, probably because many people were shocked by how many web sites and services rely on EC2. Se
Today I want to dive a little into the Cloud Foundry architecture and highlight how IaaS and PaaS really are complementary. I’m really hoping that more PaaS options will become available so we can offer our users a choice of PaaS software. Clo
I’d like an external private hybrid cloud, dry, with whole milk, please! Enterprises rise to the cloud, terminology takes off – as if we didn’t have enough cloud confusion already. But it’s not all bad news – some of the terms do make sense. While many of the benefits associated with the cloud are independent of cloud type – internal, external, private, public – the type of cloud does determine re
Load balancing is one of the technologies that virtually all our customers are using within EC2, and there is an increasing set of options available for doing it. We’ve been giving advice to our customers for years on what we’ve seen work but we finally decided to spend some time and do a real A-B benchmark comparison of a number of solutions. The test we ran compared the following solutions: HApr
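The excerpt cuts off before the solution list and results, but the shape of such an A-B comparison can be sketched. A minimal throughput harness under stated assumptions: the endpoint URLs are hypothetical placeholders, and the harness only needs a callable that performs one request.

```python
# Minimal sketch of an A-B throughput comparison between load-balancer
# endpoints. The harness measures requests per second for any callable
# that performs a single request.
import time

def measure_rps(do_request, duration_s=1.0):
    """Call do_request() in a loop for duration_s seconds; return req/s."""
    count = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        do_request()
        count += 1
    return count / (time.monotonic() - start)

if __name__ == "__main__":
    from urllib.request import urlopen
    # Hypothetical endpoints fronting the same backend fleet.
    for name, url in [("solution-a", "http://lb-a.example.com/"),
                      ("solution-b", "http://lb-b.example.com/")]:
        rps = measure_rps(lambda: urlopen(url).read(), duration_s=5.0)
        print(f"{name}: {rps:.0f} req/s")
```

A serial loop like this measures latency-bound throughput; a real comparison would also drive concurrent clients, as the post's benchmark presumably did.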
Recently I’ve been working on a project where I’ve got millions of relatively small objects, sized between 5kb and 500kb, and they all have to be uploaded to S3. Naturally, doing a synchronous upload of each object, one by one, just doesn’t cut it. We need to upload the objects in parallel to achieve acceptable performance. But what are the optimal parameters when it comes to the number of simulta
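The parallel-upload idea above can be sketched with a thread pool. boto3 and the bucket name are assumptions, not from the post; the parallel driver is written against any single-object upload callable so the concurrency logic can be exercised without AWS credentials.

```python
# Sketch: upload many small objects in parallel with a thread pool.
# The driver takes any (key, data) -> key upload callable, so the
# boto3 part below is an easily swapped assumption.
from concurrent.futures import ThreadPoolExecutor

def parallel_upload(items, upload_one, workers=32):
    """Upload (key, data) pairs concurrently; return the uploaded keys."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda kv: upload_one(*kv), items))

if __name__ == "__main__":
    import boto3  # third-party: pip install boto3

    s3 = boto3.client("s3")

    def upload_one(key, data):
        # "my-bucket" is a placeholder bucket name.
        s3.put_object(Bucket="my-bucket", Key=key, Body=data)
        return key

    keys = parallel_upload(((f"obj/{i}", b"x" * 5000) for i in range(1000)),
                           upload_one, workers=64)
    print(len(keys))
```

Because each small upload is dominated by per-request latency rather than bandwidth, throughput tends to scale with the worker count until connection or API limits are hit, which is exactly the parameter sweep the post sets out to explore.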
MongoDB is drawing crowds lately. Some even dare to call it the new MySQL. We haven't worked with it yet, although we investigated its use for GeoSpatial systems a while ago. Usabilla, our latest partner and one of Amsterdam's hottest startups, wants one. Apart from being fun, one of the reasons they 'want one' is that it promises to help them fight the monkey that wreaked serious havoc on my
I've done this many times but apparently never wrote it up on the blog, so here it is: how to upload a file to AWS S3 from Python using boto. Tested with Python 2.6 and boto 2.0rc1.

Installing boto

Install it with pip or easy_install:

$ pip install boto

Preparing an S3 bucket

Prepare the destination bucket. You can create one through the boto API, but this time we create it from the AWS console: in the S3 tab, under Buckets, click "Create Bucket" and enter a bucket name. Here I used test-boto.nullpobug.com. (It does not have to be a domain name, but it must be one if you want to access the bucket via a CNAME mapping.)

Checking your access keys

To use the AWS API, you need an access key and a secret key
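The upload step the post is building toward looks roughly like this with the boto 2.x API the post targets. The bucket name matches the post; the access keys and local file path are placeholders.

```python
# Sketch of the upload step with the boto 2.x API (the post uses
# boto 2.0rc1). Access keys and the local path are placeholders.
import os

def s3_key_for(path):
    """Derive an S3 key from a local path (here: just the file name)."""
    return os.path.basename(path)

if __name__ == "__main__":
    from boto.s3.connection import S3Connection  # third-party: pip install boto

    conn = S3Connection("YOUR_ACCESS_KEY", "YOUR_SECRET_KEY")
    bucket = conn.get_bucket("test-boto.nullpobug.com")
    path = "/tmp/example.txt"  # placeholder local file
    key = bucket.new_key(s3_key_for(path))
    key.set_contents_from_filename(path)
```

In boto 2.x a Key object is the unit of upload; set_contents_from_filename streams the file to S3 in one call.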