What happens when you reach the size limit of a Push Dataset with Historical Analysis Enabled


Up until recently I thought push datasets followed the 200k-row first-in, first-out (basicFIFO) retention policy. I realise now that this is just for streaming datasets, as the 200k rows are stored in a cache.
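
For reference, a rough sketch of the dataset-creation call, just to show where that retention setting lives; the dataset name, table name and access token below are placeholders:

import requests

# ACCESS_TOKEN is a placeholder for a valid Azure AD token.
ACCESS_TOKEN = "<aad-access-token>"

# defaultRetentionPolicy=basicFIFO is the opt-in for the 200k-row FIFO
# behaviour; "None" (or omitting the parameter) keeps every pushed row.
url = "https://api.powerbi.com/v1.0/myorg/datasets?defaultRetentionPolicy=None"

dataset_definition = {
    "name": "SensorReadings",       # placeholder dataset name
    "defaultMode": "Push",
    "tables": [
        {
            "name": "RealTimeData",  # placeholder table name
            "columns": [
                {"name": "timestamp", "dataType": "DateTime"},
                {"name": "value", "dataType": "Double"},
            ],
        }
    ],
}

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=dataset_definition,
)
resp.raise_for_status()
DATASET_ID = resp.json()["id"]  # needed for the row-level calls later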

 

I am pushing into the dataset using the REST API, but I have historical analysis enabled so that I can avail of all the different chart types for my dashboards.
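
A rough sketch of the kind of row push I mean (dataset ID, table name and token are placeholders; the real token comes from Azure AD):

import requests

DATASET_ID = "<dataset-id>"
TABLE_NAME = "RealTimeData"
ACCESS_TOKEN = "<aad-access-token>"

rows_url = (
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}"
    f"/tables/{TABLE_NAME}/rows"
)

# Push a small batch of rows; the documented limit is 10,000 rows per request.
payload = {"rows": [{"timestamp": "2024-01-01T00:00:00Z", "value": 42.0}]}
resp = requests.post(
    rows_url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
)
resp.raise_for_status()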

 

I see here that datasets are limited to 1 GB. What happens when that limit is reached? Is there any FIFO system here? I know you can't selectively delete rows from a dataset; it's either all or nothing. Would that mean that when I hit the 1 GB limit, I'd have to delete all the data and start again?
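
As far as I know, the only row-level clean-up available is the delete-all-rows call; a sketch with the same placeholders as above:

import requests

DATASET_ID = "<dataset-id>"
TABLE_NAME = "RealTimeData"
ACCESS_TOKEN = "<aad-access-token>"

# Delete Rows is all-or-nothing: it clears every row in the named table.
resp = requests.delete(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}"
    f"/tables/{TABLE_NAME}/rows",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()  # 200 OK once the table has been emptied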

