S3compare.io
Compare S3 storage pricing across major providers. Our detailed S3 comparison includes pricing, features, and performance ratings to help you choose the best cloud storage solution.
S3 Pricing And Feature Comparison
Provider | Product | $/TB/Mo | Best for | Price Rating | Perf Rating | Fair Use | In/TB | Out/TB | Multiplier¹ | $/1K Write | $/1K Read | GET 100MB (16) | PUT 100MB (16) | GET 4000MB (1) | GET Ops (16) | PUT Ops (16) | LIST Ops (16) | DEL Ops (16) | GET Ops (150) | PUT Ops (150) | LIST Ops (150) | DEL Ops (150) | Regions | Term | SLA | Steps |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
¹ Varies by region. We assume a default US-West region (or the provider's default region).
Additional Information
What is S3 Storage?
S3 (Simple Storage Service) is a type of object storage originally developed by Amazon Web Services (AWS). It's designed to store and retrieve any amount of data from anywhere on the web. S3 storage is widely used for:
- Backup and disaster recovery
- Data archiving
- Web hosting for static websites
- Mobile and gaming applications
- Big data analytics
- Content distribution
- Software delivery
Many cloud providers now offer S3-compatible storage services, allowing for greater flexibility and choice in the market.
What are the benefits of S3 Storage?
S3 storage offers several key benefits:
- Optional encryption: Data can be encrypted at rest, providing an additional layer of security for sensitive information.
- Scalability: S3 can store virtually unlimited amounts of data, scaling seamlessly as your needs grow.
- HTTPS (encrypted transport): Data transfer to and from S3 storage can be encrypted using HTTPS, ensuring secure transmission over the internet.
- Durability: Many S3 services are designed for extremely high durability, with some providers aiming for up to 99.999999999% (11 9's). This minimizes the risk of data loss, though exact figures may vary by provider.
- Availability: Most S3 services offer high availability, typically 99.9% or better.
- Versatility: Can be used for a wide range of applications and integrates well with many services and tools.
- Data management: Offers features like lifecycle policies, versioning, and access controls for better data management.
These benefits make S3 storage an attractive option for businesses of all sizes, from startups to large enterprises, looking for reliable, scalable, and secure cloud storage solutions.
About This S3 Comparison
Our S3 pricing comparison uses a 5-star rating system calculated automatically based on the relative pricing of each provider. The rating considers several factors weighted to provide an objective value assessment.
- 5-star scale (5 is best, 1 is worst)
- Factors considered:
- Monthly storage cost ($/TB/Mo) - 40% weight
- Outbound transfer cost (Out/TB) - 40% weight
- Write operation cost ($/1K Write) - 10% weight
- Read operation cost ($/1K Read) - 10% weight
- Each factor is rated on a scale of 0-5 relative to other providers:
- Lower costs receive higher ratings
- Scores are normalized against the highest cost in each category
- Fair Use penalty:
- Subtract 1 star for providers with strict fair use policies
- Final rating rounded to the nearest half-star
Some providers offer a fixed free egress allowance independent of storage size, ranging from as low as 75 GB up to 1 TB per month. We ignored these fixed allowances in our calculations and only consider egress multipliers.
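The weighting scheme above can be sketched in a few lines of Python. This is an illustrative reimplementation, not the site's actual code; the provider dictionaries, field names, and `strict_fair_use` flag are our own assumptions.

```python
def price_rating(provider, all_providers):
    """5-star price rating: lower costs score higher, normalized
    against the highest cost in each category."""
    weights = {"storage": 0.40, "egress": 0.40, "write": 0.10, "read": 0.10}
    score = 0.0
    for field, weight in weights.items():
        worst = max(p[field] for p in all_providers)
        # 0-5 factor score: the most expensive provider scores 0, free scores 5
        factor = 5 * (1 - provider[field] / worst) if worst > 0 else 5.0
        score += weight * factor
    if provider.get("strict_fair_use"):
        score -= 1  # fair-use penalty: subtract one star
    return max(0.0, round(score * 2) / 2)  # round to the nearest half-star
```

Because scores are normalized against the most expensive provider in each category, a provider's rating shifts whenever the field of compared providers changes.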
The data is provided "as is" and was last verified in March 2025. For the most up-to-date S3 pricing, please check the official websites of the providers listed. You can reach out or create a pull request at our GitHub repo s3compare.io_data to supply updated data or additional providers at any time.
Understanding Storage Types
In our comparison, we've categorized storage offerings into general use cases to help you choose the most appropriate option:
- General Purpose: These storage types are optimized for:
- High request/operations performance
- Frequent read/write operations
- Low latency access times, many regions to choose from
- High throughput rates
- Versatility across a wide range of use cases
- Optimized for demanding use cases requiring consistent, high-performance access, typically at a higher price
- Backup/Archive/Media Serving: These storage types typically offer:
- Lower request rates
- Larger object sizes
- High throughput rates
- Variable latency, depending on the specific service
Automatic Categorization Method
We use an algorithm to automatically categorize providers based on their performance metrics, particularly their write operations capacity:
- Providers with combined write operations (PUT Ops 16 + PUT Ops 150) greater than or equal to 1,500 ops/second are classified as General Purpose
- Providers with combined write operations below 1,500 ops/second are classified as Backup/Archive/Media Serving
This 1,500 ops/second threshold reflects real-world requirements for high-scale production environments. Applications requiring high write throughput, such as Grafana Loki log storage, OLTP database backups, CI/CD artifact repositories, and stream processing pipelines, typically demand General Purpose storage optimized for transaction-heavy workloads. Conversely, backup and archival use cases typically prioritize cost efficiency for less frequent, larger transfers. Note that major cloud providers often offer multiple tiered storage classes within their ecosystems to address various performance and cost profiles. Providers without benchmarked performance data retain their original classification until metrics become available.
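The threshold rule reduces to a one-line comparison. A minimal sketch follows; the function name and the fallback handling for providers without benchmark data are our assumptions.

```python
GENERAL_PURPOSE = "General Purpose"
BACKUP_ARCHIVE = "Backup/Archive/Media Serving"

def categorize(put_ops_16, put_ops_150, fallback=None):
    """Classify a provider by combined write capacity.

    Providers without benchmark data keep their existing
    category, passed in as `fallback`.
    """
    if put_ops_16 is None or put_ops_150 is None:
        return fallback
    combined = put_ops_16 + put_ops_150
    return GENERAL_PURPOSE if combined >= 1500 else BACKUP_ARCHIVE
```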
Fair Use Policies
Some S3 storage providers implement "Fair Use" policies, which can significantly impact the actual cost and usability of their services. These policies often:
- Set limits on data transfer, throughput or API requests that may not be immediately apparent from advertised pricing
- Can change unexpectedly, potentially affecting your service costs and performance
- May result in additional charges or reduced performance if exceeded
- Are sometimes used as marketing tactics to advertise lower initial prices
In our comparison table, providers with fair use policies are marked in red. We also apply a penalty in our rating system for providers with strict fair use policies. When choosing a provider, carefully consider the implications of these policies on your specific use case and expected usage patterns.
Performance Benchmarks Methodology
Our performance benchmarks provide objective measurements of S3 providers' performance capabilities across different operations and workloads. These benchmarks help you determine which provider best matches your specific performance requirements.
Benchmarking Environment
- All benchmarks were conducted using warp, a specialized S3 benchmarking tool, with a Python wrapper for standardization
- Benchmark client specifications:
- 8 VM CPU cores (AMD EPYC 9554)
- 32GB RAM
- 25GigE network connection
- <5ms network latency to each S3 endpoint
- Tests were run with standardized parameters to ensure fair comparison across providers
- CPU usage was monitored to ensure it stayed below 80%
- Each test was repeated 3 times at each of two concurrency levels (16 and 150) to measure scaling capabilities
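The Python wrapper mentioned above is not reproduced here; the sketch below shows how such a wrapper might assemble standardized warp invocations like those listed under Benchmark Commands. The host, keys, bucket, and output directory are placeholders.

```python
def warp_command(op, host, *, concurrent, obj_size, objects=None,
                 duration="5m", bucket="benchmark",
                 access_key="abc", secret_key="def"):
    """Build a standardized warp argv for one benchmark run."""
    run_name = f"{host}_{op}_{obj_size}_{concurrent}"
    cmd = [
        "warp", op,
        "--host", f"{host}:443",
        "--access-key", access_key, "--secret-key", secret_key,
        "--bucket", bucket,
        "--duration", duration,
        "--concurrent", str(concurrent),
        "--benchdata", f"benchmark_results/{run_name}",
        "--obj.size", obj_size,
    ]
    if objects is not None:  # PUT benchmarks omit --objects
        cmd += ["--objects", str(objects)]
    cmd.append("--tls")
    return cmd
```

The resulting argv list can be handed directly to `subprocess.run(cmd, check=True)`, which keeps every run identical except for the parameters under test.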
Performance Metrics Explained
Metric | Operation | Concurrency | Object Size | Result Unit | Description |
---|---|---|---|---|---|
GET Throughput (100MB) | GET | 16 | 100 MiB | Mbps | Measures download bandwidth with multiple concurrent requests and moderate-sized objects |
PUT Throughput (100MB) | PUT | 16 | 100 MiB | Mbps | Measures upload bandwidth with multiple concurrent requests and moderate-sized objects |
GET Throughput (4000MB) | GET | 1 | 4000 MiB | Mbps | Measures maximum single-stream download bandwidth with large objects |
GET Operations (16) | GET | 16 | 1 KiB | ops/s | Measures small object retrieval performance under moderate concurrent load |
PUT Operations (16) | PUT | 16 | 1 KiB | ops/s | Measures small object upload performance under moderate concurrent load |
LIST Operations (16) | LIST | 16 | 1 KiB | ops/s | Measures bucket listing performance with moderate concurrent requests |
DELETE Operations (16) | DELETE | 16 | 1 KiB | ops/s | Measures object deletion performance with moderate concurrent requests |
GET Operations (150) | GET | 150 | 1 KiB | ops/s | Measures small object retrieval performance under high concurrent load |
PUT Operations (150) | PUT | 150 | 1 KiB | ops/s | Measures small object upload performance under high concurrent load |
LIST Operations (150) | LIST | 150 | 1 KiB | ops/s | Measures bucket listing performance with high concurrent requests |
DELETE Operations (150) | DELETE | 150 | 1 KiB | ops/s | Measures object deletion performance with high concurrent requests |
Performance Rating Calculation
We calculate the performance rating using a 5-star system that considers both throughput and operations metrics across different concurrency levels:
- Throughput metrics (50% of the rating):
- GET throughput for 100MB objects (weight: 25%)
- PUT throughput for 100MB objects (weight: 15%)
- GET throughput for 4000MB objects (weight: 10%)
- Operations metrics (50% of the rating):
- GET operations per second (weight: 20%, averaged across 16 and 150 concurrency)
- PUT operations per second (weight: 20%, averaged across 16 and 150 concurrency)
- LIST operations per second (weight: 5%, averaged across 16 and 150 concurrency)
- DELETE operations per second (weight: 5%, averaged across 16 and 150 concurrency)
- Providers are scored relative to the best performer in each category
- Final ratings are rounded to the nearest half-star
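A sketch of this weighting, as Python, is shown below. The metric names are our own, and we assume each ops metric has already been averaged across the 16 and 150 concurrency runs before scoring.

```python
WEIGHTS = {
    "get_tp_100": 0.25,   # GET throughput, 100 MB objects
    "put_tp_100": 0.15,   # PUT throughput, 100 MB objects
    "get_tp_4000": 0.10,  # GET throughput, 4000 MB objects
    "get_ops": 0.20,      # GET ops/s, averaged over both concurrency levels
    "put_ops": 0.20,      # PUT ops/s, averaged over both concurrency levels
    "list_ops": 0.05,     # LIST ops/s, averaged over both concurrency levels
    "del_ops": 0.05,      # DELETE ops/s, averaged over both concurrency levels
}

def perf_rating(provider, all_providers):
    """5-star performance rating, scored relative to the best
    performer in each category (higher values score higher)."""
    score = 0.0
    for metric, weight in WEIGHTS.items():
        best = max(p[metric] for p in all_providers)
        score += weight * (5 * provider[metric] / best if best > 0 else 0)
    return round(score * 2) / 2  # round to the nearest half-star
```

Unlike the price rating, higher raw values are better here, so each factor is the provider's value as a fraction of the category leader's.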
It's important to note that performance requirements vary significantly based on your specific use case. For general-purpose applications, a balanced performance across all metrics is ideal. For backup/archive systems, throughput may be more important than operations performance. For highly transactional applications, operations performance may be critical.
The two different concurrency levels (16 and 150) help evaluate how well providers scale with increased load. While moderate concurrency (16) tests provide a baseline for normal usage patterns, high concurrency (150) tests reveal how systems perform under stress and identify potential bottlenecks or scaling issues.
Benchmark Commands
For transparency, here are the exact commands used for each benchmark:
Benchmark | Command |
---|---|
GET 100MB Throughput | warp get --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 16 --benchdata benchmark_results/s3.example.org_get_100MB_16 --obj.size 100MB --objects 250 --tls |
GET 4000MB Throughput | warp get --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 1 --benchdata benchmark_results/s3.example.org_get_4000MB_1 --obj.size 4000MB --objects 1 --tls |
PUT 100MB Throughput | warp put --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 16 --benchdata benchmark_results/s3.example.org_put_100MB_16 --obj.size 100MB --tls |
GET Operations (16) | warp get --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 16 --benchdata benchmark_results/s3.example.org_get_1KB_16 --obj.size 1KB --objects 100000 --tls |
PUT Operations (16) | warp put --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 16 --benchdata benchmark_results/s3.example.org_put_1KB_16 --obj.size 1KB --tls |
LIST Operations (16) | warp list --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 16 --benchdata benchmark_results/s3.example.org_list_1KB_16 --obj.size 1KB --objects 100000 --tls |
DELETE Operations (16) | warp delete --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 16 --benchdata benchmark_results/s3.example.org_delete_1KB_16 --obj.size 1KB --objects 100000 --tls |
GET Operations (150) | warp get --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 150 --benchdata benchmark_results/s3.example.org_get_1KB_150 --obj.size 1KB --objects 100000 --tls |
PUT Operations (150) | warp put --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 150 --benchdata benchmark_results/s3.example.org_put_1KB_150 --obj.size 1KB --tls |
LIST Operations (150) | warp list --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 150 --benchdata benchmark_results/s3.example.org_list_1KB_150 --obj.size 1KB --objects 100000 --tls |
DELETE Operations (150) | warp delete --host s3.example.org:443 --access-key abc --secret-key def --bucket benchmark --duration 5m --concurrent 150 --benchdata benchmark_results/s3.example.org_delete_1KB_150 --obj.size 1KB --objects 100000 --tls |
How to Choose the Right S3 Provider
Selecting the ideal S3 provider involves considering factors such as pricing, performance, features, security, and region availability. Compare providers, understand their strengths, and align them with your specific storage requirements.