Until recently I thought push datasets followed the 200k-row first-in-first-out (basicFIFO) retention policy. I realise now that this is just for streaming datasets, since the 200k rows are held in a cache.
I am pushing data into the dataset using the REST API, but I have historical analysis enabled so that I can avail of all the different chart types for my dashboards.
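For context, this is roughly how I'm pushing rows. The dataset ID, table name, and row fields below are placeholders, not my real ones:

```python
import json

def rows_url(dataset_id: str, table: str) -> str:
    # Power BI REST endpoint for adding rows to a push dataset table
    return (f"https://api.powerbi.com/v1.0/myorg/datasets/"
            f"{dataset_id}/tables/{table}/rows")

# Example payload: a batch of rows matching the table schema
payload = json.dumps({"rows": [
    {"timestamp": "2024-01-01T00:00:00Z", "value": 42.0},
]})

# The actual call is an authenticated POST, e.g. with requests:
# requests.post(rows_url(dataset_id, table), data=payload,
#               headers={"Authorization": f"Bearer {token}",
#                        "Content-Type": "application/json"})
```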
I see here that datasets are limited to 1 GB. What happens when that limit is reached? Is there any FIFO behaviour at that level? I know you can't selectively delete rows from a dataset; it's all or nothing. Would that mean that when I hit the 1 GB limit, I'd have to delete all the data and start again?
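If I understand correctly, the closest thing to cleanup would be clearing a whole table with a DELETE on the rows endpoint, something like this sketch (IDs are placeholders):

```python
def delete_rows_url(dataset_id: str, table: str) -> str:
    # A DELETE request to this endpoint removes ALL rows from the
    # table -- as far as I know there is no way to delete a subset.
    return (f"https://api.powerbi.com/v1.0/myorg/datasets/"
            f"{dataset_id}/tables/{table}/rows")

# Invoked as an authenticated DELETE, e.g.:
# requests.delete(delete_rows_url(dataset_id, table),
#                 headers={"Authorization": f"Bearer {token}"})
```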