daniel.haxx.se
I have held back on writing anything about AI or how we (not) use AI for development in the curl factory. Now I can’t hold back anymore. Let me show you the most significant effect of AI on curl as of today – with examples. Bug Bounty Having a bug bounty means that we offer real money in rewards to hackers who report security problems. The chance of money attracts a certain amount of “luck seekers
You know I spend all my days working on curl and related matters. I also spend a lot of time thinking on the project; like how we do things and how we should do things. The security angle of this project is one of the most crucial ones and an area where I spend a lot of time and effort. Dealing with and assessing security reports, handling the verified actual security vulnerabilities and waiving o
In association with the release of curl 8.4.0, we publish a security advisory and all the details for CVE-2023-38545. This problem is the worst security problem found in curl in a long time. We set it to severity HIGH. While the advisory contains all the necessary details, I figured I would use a few additional words and expand the explanations for anyone who cares to understand how this flaw work
We cut the release cycle short and decided to ship this release now rather than later because of the heap overflow issue we found. Release presentation Numbers the 252nd release 3 changes 28 days (total: 9,336) 136 bug-fixes (total: 9,551) 216 commits (total: 31,158) 1 new public libcurl function (total: 93) 0 new curl_easy_setopt() option (total: 303) 1 new curl command line option (total: 258) 4
On August 26 I posted details here on my blog about the bogus curl issue CVE-2020-19909. Luckily, it got a lot of attention and triggered discussions widely. Maybe I helped shed light on the brittleness of this system. This was not a unique instance and it was not the first time it happened. This has been going on for years. For example, the PostgreSQL peeps got a similarly bogus CVE almost at the
This is a story consisting of several little building blocks and they occurred spread out in time and in different places. It is a story that shows with clarity how our current system with CVE Ids and lots of power given to NVD is a completely broken system. CVE-2020-19909 On August 25 2023, we got an email to the curl-library mailing list from Samuel Henrique that informed us that “someone” had r
We are back with the first release since that crazy March day when we did two releases on the same day. First 8.0.0 shipped that bumped the major version for the first time in decades. Then curl 8.0.1 followed just hours after, due to a serious mess-up in the factory lines. Release video presentation Numbers the 217th release 3 changes 58 days (total: 9,189) 185 bug-fixes (total: 9,006) 322 commit
First: performance is tricky and benchmarking even more so. I will talk some numbers in this post but of course they are what I have measured for my specific tests on my machine. Your numbers for your test cases will be different. Over the last six months or so, curl has undergone a number of refactors and architectural cleanups. The primary motivations for this have been to improve the HTTP/3 supp
Let me tell you a story about how Windows users are deleting files from their installation and as a consequence end up in tears. Background The real and actual curl tool has been shipped as part of Windows 10 and Windows 11 for many years already. It is called curl.exe and is located in the System32 directory. Microsoft ships this bundled with its operating system. They get the code from the curl
Exactly one month since the previous release, we are happy to give you curl 8.0.0 released on curl’s official 25th birthday. This is a major version number bump but without any ground-breaking changes or fireworks. We decided it was about time to reset the minor number down to a more manageable level and doing it exactly on curl’s 25th birthday made it extra fun. There is no API nor ABI break in this
Time flies when you are having fun. Today is curl’s 25th birthday. The curl project started out very humbly as a small renamed URL transfer tool that almost nobody knew about for the first few years. It scratched a personal itch of mine. I made that first curl release and I’ve packaged every single release since. The day I did that first curl release I was 27 years old and I worked as
I occasionally do talks about curl. In these talks I often include a few slides that say something about curl’s coverage and presence on different platforms. Mostly to boast of course, but also to help explain to the audience how curl has managed to reach its ten billion installations. This is the current incarnation of those seven slides in November 2022. I am of course also eager to get your feedback
tldr: we stick to C89 for now. The curl project builds on foundations that started in late 1996 with the tool named httpget. ANSI C became known as C89 In 1996 there were not too many good alternatives for making a small and efficient command line tool for doing Internet transfers. I am not saying that C was the only available language, but for me the choice was easy and frankly I did not even thi
http://http://http://@http://http://?http://#http:// The other day I sent out this tweet As it took off and got amazing attention, and I received many different comments and replies, I felt a need to elaborate a little. To add some meat to this. Is this string really a legitimate URL? What is a URL? How is it parsed? http://http://http://@http://http://?http://#http:// curl Let’s start with curl. I
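A practical way to check how curl itself interprets the string is to hand it to the tool with verbose output enabled and watch which host name it actually tries to resolve; the exact outcome may vary between curl versions, so take this as an illustration rather than a verdict:

  curl -v 'http://http://http://@http://http://?http://#http://'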
On Friday January 21, 2022 I received this email. I tweeted about it and it took off like crazy. The email comes from a fortune-500 multi-billion dollar company that apparently might be using a product that contains my code, or maybe they have customers who do. Who knows? My guess is that they do this for some compliance reasons and they “forgot” that their open source components are not automatic
There’s been another 56 day release cycle and here’s another curl release to chew on! Release presentation Numbers the 197th release 6 changes 56 days (total: 8,357) 113 bug fixes (total: 6,682) 268 commits (total: 26,752) 0 new public libcurl function (total: 85) 1 new curl_easy_setopt() option (total: 285) 2 new curl command line option (total: 237) 58 contributors, 30 new (total: 2,322) 31 auth
curl’s official birthday was March 20, 1998. That was the day the first ever tarball was made available that could build a tool named curl. I put it together and I called it curl 4.0 since I kept the version numbering from the previous names I had used for the tool. Or rather, I bumped it up from 3.12 which was the last version I used under the previous name: urlget. Of course curl wasn’t created
I spent a lot of time and effort digging up the numbers and facts for this post! Lots of people keep referring to the awesome summary put together by a friendly pseudonymous “Tim” which says that “53 out of 95” (55.7%) security flaws in curl could’ve been prevented if curl had been written in Rust. This is usually in regards to discussions around how insecure C is and what to do about it. I’ve blo
tldr: work has started to make Hyper work as a backend in curl for HTTP. curl and its data transfer core, libcurl, is all written in C. The language C is known and infamous for not being memory safe and for being easy to mess up and as a result accidentally cause security problems. At the same time, C compilers are very widely used and available and you can compile C programs for virtually every o
This is not a command line option of the week post, but I feel a need to tell you a little about our brand new addition! --write-out [format] This option takes a format string in which there are a number of different “variables” available that lets a user output information from the previous transfer. For example, you can get the HTTP response code from a transfer like this: curl -w 'code: %{resp
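For instance, a complete invocation that prints only the response code of a transfer (https://example.com/ is used here purely as a placeholder URL) could look like this:

  curl -s -o /dev/null -w 'code: %{response_code}\n' https://example.com/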
In the afternoon of August 5 2019, I successfully made curl request a document over HTTP/3, retrieve it and then exit cleanly again. (It got a 404 response code, two HTTP headers and 10 bytes of content so the actual response was certainly less thrilling to me than the fact that it actually delivered that response over HTTP version 3 over QUIC.) The components necessary for this to work, if you wa
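On a curl build that has HTTP/3 support compiled in, a transfer like the one described can be attempted from the command line roughly like this, with https://example.com/ standing in as a placeholder for an HTTP/3-capable server:

  curl --http3 https://example.com/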
Not the entire thing, just “a subset”. It’s not stated very clearly exactly what that subset is but the easy interface is mentioned in the Chrome bug about this project. What? The Chromium bug states that they will create a library of their own (named libcrurl) that will offer (parts of) the libcurl API and be implemented using Cronet. Cronet is the networking stack of Chromium put into a library
HTTP/3 explained is a collaborative effort to document the HTTP/3 and the QUIC protocols. Join in and help! Get the Web or PDF versions on http3-explained.haxx.se. The contents get updated automatically on every commit to this git repository.
It’s been five great years, but now it is time for me to move on and try something else. During these five years I’ve met and interacted with a large number of awesome people at Mozilla, lots of new friends! I got the chance to work from home and yet work with a global team on a widely used product, all done with open source. I have worked on internet protocols during work-hours (in addition to my
The protocol that’s been called HTTP-over-QUIC for quite some time has now changed name and will officially become HTTP/3. This was triggered by this original suggestion by Mark Nottingham. The QUIC Working Group in the IETF works on creating the QUIC transport protocol. QUIC is a TCP replacement done over UDP. Originally, QUIC was started as an effort by Google and then more of a “HTTP/2-encrypte
libcurl has done internet transfers specified as URLs for a long time, but the URLs you’d tell libcurl to use would always just get parsed and used internally. Applications that pass in URLs to libcurl would of course still very often need to parse URLs, create URLs or otherwise handle them, but libcurl has not been helping with that. At the same time, the under-specification of URLs has led to a
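A minimal sketch of what using the URL parsing API in libcurl can look like, assuming a libcurl recent enough to provide the curl_url() family of functions:

  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURLU *h = curl_url();  /* create an empty URL handle */
    char *host = NULL;

    /* parse a full URL into the handle, then extract the host part */
    if(curl_url_set(h, CURLUPART_URL, "https://example.com/path?name=value", 0) == CURLUE_OK &&
       curl_url_get(h, CURLUPART_HOST, &host, 0) == CURLUE_OK) {
      printf("host: %s\n", host);  /* prints "example.com" */
      curl_free(host);             /* strings returned by curl_url_get() are freed with curl_free() */
    }
    curl_url_cleanup(h);           /* free the URL handle */
    return 0;
  }

Build it with something like cc url.c -lcurl and it prints the host name component that libcurl extracted.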
your case is still going through administrative processing and we don’t know when that process will be completed. Last year I was denied entry to the US when I was about to travel to San Francisco. Me and my employer’s legal team never got answers as to why this happened so I’ve personally tried to convince myself it was all because of some human screw-up. Because why would they suddenly block me?
DNS over HTTPS (DOH) is a feature where a client shortcuts the standard native resolver and instead asks a dedicated DOH server to resolve names. Compared to regular unprotected DNS lookups done over UDP or TCP, DOH increases privacy, security and sometimes even performance. It also makes it easy to use a name server of your choice for a particular application instead of the one configured globall
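From the command line, pointing curl at a DOH server of your choice can look like this, where the DOH server URL is just a placeholder to be replaced with one you trust:

  curl --doh-url https://doh.example/dns-query https://example.com/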
I arrived at the Technical Museum in Stockholm together with my two kids just a short while before 17:30. A fresh, cool and clear autumn evening. For this occasion I had purchased myself a brand new suit as I hadn’t gotten one since almost twenty years before this and it had been almost that long since I last wore it. I went for a slightly less conservative purple colored shirt with the dark suit.