Timeline for What is the algorithm to compute the Amazon-S3 Etag for a file larger than 5GB?
Current License: CC BY-SA 4.0
8 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Sep 21, 2023 at 13:09 | comment added | olfek | | Uploading via the browser, my chunk size was exactly 17179870 bytes. |
| Aug 26, 2021 at 18:41 | comment added | Bruce Edge | | Nice!! 200GB confirmed, thanks!! Double upvote if I could. |
| Jan 4, 2021 at 3:14 | comment added | jtbandes | | This worked for me on a ~20GB file with an 8MB chunk size. I uploaded to S3 with the Deep Archive storage class, using AWS CLI 2.1.15. |
| Aug 4, 2020 at 23:01 | comment added | SerialEnabler | | Seems my chunk size was 16MB using the official AWS CLI tool; maybe they updated it? |
| Apr 13, 2020 at 22:59 | history edited | therealmarv | CC BY-SA 4.0 | Add case for zero-length files, which was calculated incorrectly before |
| Sep 10, 2018 at 9:23 | history edited | Pom12 | CC BY-SA 4.0 | Add syntax highlighting to Python code |
| May 6, 2017 at 10:23 | history edited | hyperknot | CC BY-SA 3.0 | Added 44 characters in body; deleted 2 characters in body |
| May 6, 2017 at 10:17 | history answered | hyperknot | CC BY-SA 3.0 | |