You just tried to pull a dataset into Google Sheets and hit a wall. Maybe =IMPORTDATA() returned an error, or a CSV upload froze your browser tab. Every import method in Google Sheets has its own ceiling, and they are not all the same. This guide breaks down the Google Sheets import limit for every method — IMPORTDATA, IMPORTRANGE, IMPORTHTML, IMPORTXML, file uploads, and the Sheets API — so you know exactly what you can and cannot bring in.
Key Takeaways:

- IMPORTDATA has the tightest cap at 50,000 cells per function call
- Google Sheets' overall limit is 10 million cells per spreadsheet
- File uploads support up to 100 MB, but browsers crash around 50 MB
- SmoothSheet bypasses browser limits with server-side CSV processing
Google Sheets Overall Limits (Quick Recap)
Before diving into individual import functions, here is the big picture. Every Google Sheets spreadsheet has a hard ceiling of 10 million cells. That is rows multiplied by columns across all tabs. The maximum column count is 18,278 (column ZZZ), which means your practical row limit depends on how wide your data is.
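As a quick sanity check, you can compute your practical row ceiling from your column count. This is a back-of-the-envelope sketch using the documented limits above; the `maxRows` helper name is illustrative:

```javascript
// Hard ceilings from Google Sheets documentation
const MAX_CELLS = 10_000_000;
const MAX_COLUMNS = 18278; // column ZZZ

// Practical row limit for a sheet with `columns` columns,
// assuming every column counts toward the cell total.
function maxRows(columns) {
  if (columns < 1 || columns > MAX_COLUMNS) {
    throw new RangeError("column count out of range");
  }
  return Math.floor(MAX_CELLS / columns);
}

console.log(maxRows(30)); // a 30-column dataset tops out at 333,333 rows
```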
For a deep breakdown of these limits and what happens when you hit them, see our full guide on Google Sheets file size limits. You can also use the Google Sheets Limits Calculator to check whether your specific dataset fits before importing.
With the overall ceiling clear, let us look at the limits for each import method.
IMPORT Function Limits
Google Sheets has four built-in IMPORT functions, and each one behaves differently when it comes to data caps. Here is what you need to know about each.
IMPORTDATA — 50,000 Cells
The =IMPORTDATA(url) function pulls CSV or TSV data from a public URL directly into your spreadsheet. It is the simplest way to import external data, but it has the strictest limit: 50,000 cells per function call.
There is also a file size constraint: IMPORTDATA struggles with files larger than roughly 2 MB. Exceed either limit and you will see a #N/A error or the message "Resource at url contents exceeded maximum size."
Other things to keep in mind:
- Each spreadsheet can have a maximum of 50 IMPORTDATA calls
- The source URL must be publicly accessible (no authentication)
- Data refreshes automatically, but Google does not guarantee a specific interval (typically every 1–2 hours)
- IMPORTDATA counts toward your spreadsheet's 10M cell limit
Workaround: If your CSV has more than 50,000 cells, use File > Import or a tool like SmoothSheet to handle the upload server-side without hitting the IMPORTDATA cap.
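Before pasting the formula, you can run a rough pre-flight check against both IMPORTDATA constraints. A minimal sketch using the limits from this section; `fitsImportData` is a hypothetical helper, not a Sheets function:

```javascript
// IMPORTDATA limits discussed above
const IMPORTDATA_CELL_CAP = 50000;
const IMPORTDATA_SIZE_CAP = 2 * 1024 * 1024; // ~2 MB, approximate

// Returns true if a CSV with the given shape and byte size
// should import without hitting either IMPORTDATA limit.
function fitsImportData(rows, cols, fileBytes) {
  return rows * cols <= IMPORTDATA_CELL_CAP && fileBytes <= IMPORTDATA_SIZE_CAP;
}

console.log(fitsImportData(5000, 10, 800_000)); // true: exactly 50,000 cells
console.log(fitsImportData(5001, 10, 800_000)); // false: 50,010 cells
```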
IMPORTRANGE — Up to 10 Million Cells
The =IMPORTRANGE(spreadsheet_url, range_string) function pulls data from another Google Sheets file. Unlike IMPORTDATA, it does not have its own cell cap beyond the spreadsheet limit itself. You can import up to the full 10 million cells if the destination sheet has room.
However, there are practical limits:
- Large IMPORTRANGE calls slow the spreadsheet significantly — anything above 100,000 cells can cause lag
- You need one-time access authorization from the source spreadsheet owner
- The source sheet's data counts toward the destination spreadsheet's 10M cell limit
- There is no official cap on the number of IMPORTRANGE calls per sheet, but Google may throttle excessive use
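Because imported data counts toward the destination's 10 million cells, it helps to check the remaining budget before setting up a large IMPORTRANGE. A minimal sketch under that assumption; the helper and its inputs are illustrative, not a Sheets API call:

```javascript
const SHEET_CELL_LIMIT = 10_000_000;

// Can an IMPORTRANGE of `rows` x `cols` fit alongside the cells
// the destination spreadsheet already uses?
function importRangeFits(existingCells, rows, cols) {
  return existingCells + rows * cols <= SHEET_CELL_LIMIT;
}

console.log(importRangeFits(9_500_000, 100_000, 10)); // false: needs 1M more cells
console.log(importRangeFits(2_000_000, 100_000, 10)); // true: 3M cells total
```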
For a complete walkthrough of syntax, errors, and optimization tips, check out our IMPORTRANGE guide.
IMPORTHTML — Varies by Page Size
The =IMPORTHTML(url, query, index) function scrapes HTML tables or lists from web pages. It does not have a published cell limit, but it is constrained by the size and complexity of the source page.
What affects IMPORTHTML performance:
- Pages larger than roughly 5 MB often cause timeouts or errors
- JavaScript-rendered content is invisible to IMPORTHTML — it only reads static HTML
- Complex nested tables may return incomplete data
- Google caches results, so you may not see live updates immediately
Tip: If IMPORTHTML fails on a large page, try IMPORTXML with a more targeted XPath query to pull only the data you need.
IMPORTXML — Varies by Response Size
The =IMPORTXML(url, xpath_query) function fetches structured data from XML or HTML pages using XPath expressions. Like IMPORTHTML, it does not have a hard cell limit but depends on the source response.
Practical constraints:
- The response payload should stay under roughly 2–5 MB for reliable results
- XPath queries that return large node sets may time out
- The URL must serve content with valid XML or HTML structure
- Maximum of 50 IMPORTXML and IMPORTHTML calls combined per spreadsheet
File Upload Import Limits
When you use File > Import in Google Sheets to upload a CSV, XLSX, or TSV file, the official maximum file size is 100 MB. That sounds generous, but the reality is more nuanced.
Here is what actually happens at different file sizes:
| File Size | What to Expect |
|---|---|
| Under 10 MB | Fast, smooth upload in most browsers |
| 10–30 MB | Slower processing, may take 30–60 seconds |
| 30–50 MB | High chance of lag, occasional browser tab crashes |
| 50–100 MB | Frequent browser crashes, especially on older machines |
The problem is not the server — Google can handle the file. The bottleneck is your browser, which has to parse and render all that data in a single tab. Chrome typically allocates about 4 GB of RAM per tab, and a 50 MB CSV with dozens of columns can eat through that quickly.
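You can estimate roughly where an export lands in the table above before generating it. This back-of-the-envelope sketch assumes an average cell width in bytes; the 15-byte default is an assumption, not a measured figure:

```javascript
// Rough CSV size estimate: rows x cols x average cell width,
// plus one delimiter/newline byte per cell. avgCellBytes is a guess.
function estimateCsvMB(rows, cols, avgCellBytes = 15) {
  const bytes = rows * cols * (avgCellBytes + 1);
  return bytes / (1024 * 1024);
}

// A 200,000-row, 20-column export lands in the crash-prone 50-100 MB band.
console.log(estimateCsvMB(200_000, 20).toFixed(1) + " MB"); // "61.0 MB"
```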
If you have hit this wall before, our guide on fixing the file too large error walks through every solution. The fastest fix is using SmoothSheet, which processes CSV and Excel imports server-side so your browser never touches the raw data. Files up to 100 MB import cleanly without crashes — for $9/month.
You can also split large files before uploading using the CSV Splitter tool.
API Import Limits
If you are importing data programmatically through the Google Sheets API, there is a separate set of quotas to watch.
Key API limits as of 2026:
- Read requests: 300 per minute per project
- Write requests: 300 per minute per project
- Per-user limit: 60 requests per minute per user
- Cells per request: Up to 10 million (limited by the spreadsheet cap)
- Request payload size: 10 MB maximum per API call
For bulk imports, the spreadsheets.values.batchUpdate method is the most efficient because you can write multiple ranges in a single request. Even so, if your dataset has millions of rows, you will need to batch your writes across multiple API calls and respect the per-minute quota.
Pro tip: Use exponential backoff when you hit rate limits. Google returns a 429 Too Many Requests error, and retrying immediately will only make it worse.
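The backoff schedule can be sketched as a plain function that doubles the wait on each retry up to a cap. This shows the schedule only; wiring it into your own retry loop (and adding random jitter) is left to your client code:

```javascript
// Exponential backoff delays: 1s, 2s, 4s, ... capped at maxMs.
function backoffDelays(attempts, baseMs = 1000, maxMs = 32000) {
  const delays = [];
  for (let i = 0; i < attempts; i++) {
    delays.push(Math.min(baseMs * 2 ** i, maxMs));
  }
  return delays;
}

console.log(backoffDelays(6)); // [1000, 2000, 4000, 8000, 16000, 32000]
```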
How to Import More Data Than the Limits Allow
When your dataset exceeds what a single import method can handle, you have several options.
1. Split Large Files Before Importing
If your CSV has 200,000 rows and 30 columns (6 million cells), it fits within the 10M cell limit but may be too large for a browser upload. Use the CSV Splitter to break it into chunks — say, 50,000 rows each — and import them into separate tabs.
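The chunking step above can be sketched as a plain function that slices a parsed row array into fixed-size pieces. This is a generic sketch, not the CSV Splitter tool's actual logic:

```javascript
// Split an array of parsed CSV rows into chunks of `size` rows each.
function chunkRows(rows, size) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += size) {
    chunks.push(rows.slice(i, i + size));
  }
  return chunks;
}

// 200,000 rows in 50,000-row chunks -> 4 chunks, one per destination tab.
const rows = Array.from({ length: 200000 }, (_, i) => [i]);
console.log(chunkRows(rows, 50000).length); // 4
```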
2. Use SmoothSheet for Server-Side Imports
SmoothSheet is a Google Sheets add-on that handles large CSV and Excel imports through server-side processing. Instead of your browser parsing a 70 MB file, SmoothSheet uploads it to a server that streams the data directly into your sheet. No crashes, no memory errors, no file-size gymnastics. It supports files up to 100 MB and costs $9/month.
3. Connect to BigQuery for Massive Datasets
If your data exceeds the 10 million cell limit entirely, Google Sheets is not the right container. Use BigQuery Connected Sheets to query billions of rows directly from Sheets without importing anything. The data stays in BigQuery; Sheets acts as a front end for analysis.
4. Batch with Apps Script
For recurring imports, Google Apps Script lets you write custom import logic. You can fetch a large CSV from a URL, parse it in chunks, and write each chunk to your sheet. This avoids the IMPORTDATA 50,000-cell cap because you control the process programmatically.
Here is a simplified example:

```javascript
function importLargeCSV() {
  var response = UrlFetchApp.fetch("https://example.com/data.csv");
  var csvData = Utilities.parseCsv(response.getContentText());
  var sheet = SpreadsheetApp.getActiveSheet();
  // Write in batches of 10,000 rows
  var batchSize = 10000;
  for (var i = 0; i < csvData.length; i += batchSize) {
    var batch = csvData.slice(i, i + batchSize);
    sheet.getRange(i + 1, 1, batch.length, batch[0].length).setValues(batch);
    SpreadsheetApp.flush();
  }
}
```

This approach works well for datasets under the 10M cell limit that are too large for IMPORTDATA.
Frequently Asked Questions
What is the maximum number of rows IMPORTDATA can handle?
IMPORTDATA does not have a row-specific limit. Its cap is 50,000 cells per function call. So if your CSV has 10 columns, you can import up to 5,000 rows. With 5 columns, that ceiling doubles to 10,000 rows. The file size also needs to stay under approximately 2 MB.
Can I use multiple IMPORTDATA functions to get around the 50,000 cell limit?
Technically yes, but it is not practical. You would need to split your source data into separate URLs, each returning 50,000 cells or fewer. Each spreadsheet allows a maximum of 50 IMPORTDATA calls, giving a theoretical total of 2.5 million cells — but performance degrades rapidly with more than a few active IMPORTDATA functions.
Why does Google Sheets crash when I import a large CSV file?
The crash happens in your browser, not Google Sheets itself. When you use File > Import, your browser has to download, parse, and render the entire file. Files over 30–50 MB can exhaust the browser tab's memory allocation. Server-side import tools like SmoothSheet avoid this by processing the file before it reaches your browser.
What happens when I exceed Google Sheets API rate limits?
Google returns a 429 Too Many Requests HTTP error. Your script should implement exponential backoff — wait 1 second, then 2, then 4, and so on before retrying. The per-user limit resets every 60 seconds. Staying under 60 write requests per minute per user will keep you well within safe territory for most import workflows.
Conclusion
Every import method in Google Sheets has a different ceiling. IMPORTDATA caps at 50,000 cells. IMPORTRANGE can pull up to 10 million. File uploads officially support 100 MB but practically fail around 50 MB due to browser memory. And the Sheets API enforces 300 requests per minute with a 10 MB payload cap.
Knowing these numbers saves you from trial-and-error debugging. If your data fits within the 10M cell limit but your browser cannot handle the upload, SmoothSheet handles it server-side for $9/month. If your data blows past 10M cells entirely, BigQuery Connected Sheets is the way to go.
Pick the right method for your data size, and you will never stare at a frozen browser tab again.