Buy now, pay later (BNPL) methods, little-known less than a decade ago, now account for more than $300 billion in transactions worldwide. However, they tend to be priced at a premium, so our users wanted to know whether the financial benefits would offset the added costs. With that in mind, we ran an experiment to help Stripe businesses assess when and how to offer BNPLs.
"In the era ahead, the only technology one can choose as a platform is open source. Indeed, being open source is a prerequisite for qualifying as a platform technology." So declared Doug Cutting, chief architect at Cloudera and known as the creator of Hadoop, in his keynote at Hadoop Conference Japan 2014, held July 8 in Shiodome, Tokyo. Far from being an open-source believer's fantasy, this view has become the common understanding of IT professionals around the world. Parallel distributed processing systems were, until recently, the exclusive domain of supercomputers and mainframes tucked away in corporations and universities; by running them on inexpensive commodity hardware and scaling out, so that adding servers increases processing capacity, Hadoop
Walmart is making a change in the way it does online commerce, and there's a big data story behind that shift. Stephen O'Sullivan, senior director of Global e-commerce at WalmartLabs, is prepping the retail giant to move from 10 different web sites to one and from a trial-sized 10-node Hadoop cluster to a 250-node Hadoop cluster. Along the way his team will build several tools to migrate
HIPI is an image processing library designed to be used with the Apache Hadoop MapReduce parallel programming framework. HIPI facilitates efficient and high-throughput image processing with MapReduce style parallel programs typically executed on a cluster. It provides a solution for how to store a large collection of images on the Hadoop Distributed File System (HDFS) and make them available for e
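HIPI itself exposes a Java API on top of Hadoop, but the MapReduce style of image processing it describes can be illustrated with a toy, in-process sketch. The example below is purely hypothetical and does not use HIPI's actual classes: a map phase emits a (key, value) pair per image, a shuffle groups values by key, and a reduce phase aggregates them.

```python
from collections import defaultdict

# Toy illustration of the MapReduce pattern HIPI targets (NOT HIPI's actual
# Java API): map tasks emit (key, value) pairs per image, and reduce tasks
# aggregate all values that share a key.

def map_image(name, pixels):
    """Map phase: bucket one 'image' (a list of grayscale values) by brightness."""
    avg = sum(pixels) / len(pixels)
    bucket = "bright" if avg >= 128 else "dark"
    yield bucket, 1

def reduce_counts(key, values):
    """Reduce phase: sum the counts emitted for one key."""
    return key, sum(values)

def run_job(images):
    """Tiny in-process driver standing in for the Hadoop runtime."""
    shuffled = defaultdict(list)
    for name, pixels in images.items():
        for key, value in map_image(name, pixels):
            shuffled[key].append(value)  # shuffle/sort: group values by key
    return dict(reduce_counts(k, v) for k, v in shuffled.items())

images = {
    "a.png": [200, 220, 180],
    "b.png": [10, 30, 20],
    "c.png": [255, 128, 130],
}
print(run_job(images))  # → {'bright': 2, 'dark': 1}
```

In a real HIPI job the same map/reduce split applies, but the images live in a bundle on HDFS and the mappers run in parallel across the cluster rather than in a single loop.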
At last week's Hadoop Summit, Twitter Analytics Lead Kevin Weil announced that Twitter would open source Crane, its data migration tool used to move MySQL data into Hadoop. Twitter uses Hadoop to examine collected analytics as well as for data crunching behind live tools such as name search. Weil said that Twitter uses Scribe to log data into Hadoop and Crane to manage tabular data. Crane moves MySQL da
The Adobe SaaS team, the same team that shared with us their experience and reasons for using HBase, has open sourced their Puppet[1] recipes for automating Hadoop/HBase deployments. Right now we are open-sourcing on GitHub Puppet recipes for: creating the user under which the entire hstack runs; changing system settings, like the ssh keys, authorizing machines to talk to each other