Sizing Limits of Data Tables
Data Table storage is limited only by the disk space available to the database. Although storage itself has no hard limit, query performance depends on the number of rows in the Data Table.
Data Tables should not exceed 100,000 rows if queries are expected to complete within a few seconds
Beyond 100,000 rows (in PostgreSQL), query performance degrades progressively
For datasets larger than 100,000 rows, use an external relational database, connected via a Database Thing, for optimal performance
Users should not expect Data Tables to match the performance of external relational databases, even below 100,000 rows. If your solution requires high data throughput, use an external relational database, connected via a Database Thing, for optimal performance. Although Data Tables resemble database tables, individual Data Tables are not directly backed by individual database tables, and the two should not be confused. Read and write operations on Data Tables are not optimized in the same way as native database queries and will not be as fast as relational database queries.
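As a rough sketch of this recommendation, a server-side service might route small queries to a Data Table and large ones to a Database Thing. The Thing names MyDataTable and MyDatabaseThing and the SQL service QueryRecentReadings are hypothetical; GetDataTableEntryCount and QueryDataTableEntries are assumed to be the standard Data Table services.

```javascript
// Sketch only: Thing names and the QueryRecentReadings SQL service are
// hypothetical. GetDataTableEntryCount and QueryDataTableEntries are
// assumed standard Data Table services.
var dataTable = Things["MyDataTable"]; // a Data Table Thing (assumed name)
var rowCount = dataTable.GetDataTableEntryCount();

var result; // INFOTABLE in either branch
if (rowCount <= 100000) {
    // Within the ~100K-row guideline: querying the Data Table directly is acceptable
    result = dataTable.QueryDataTableEntries({
        maxItems: 500 // NUMBER: cap the result set
    });
} else {
    // Past the guideline: delegate to an external relational database
    // through a Database Thing running a user-defined SQL service
    result = Things["MyDatabaseThing"].QueryRecentReadings({
        maxRows: 500 // NUMBER: parameter defined by the hypothetical service
    });
}
```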