In my opinion this isn't really good practice, but the following process was put in place by a third party and I'm currently testing it.
For a community events calendar managed by this third-party organization, a daily cron job compiles an XML file and exports it from their database.
To use this resource within Perch, I set up the following system:
Using a second cron job, I automatically import the contents of the XML file into a Perch collection.
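As a rough sketch, the second cron entry might look something like this — the times, paths, and script name are all placeholders, not my actual setup:

```shell
# Hypothetical crontab entry: run the importer shortly after the
# third party's nightly export (exact schedule and paths are assumptions)
30 2 * * * /usr/bin/php /var/www/perch/import_events.php >> /var/log/events_import.log 2>&1
```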
From day to day, some items in the XML file stay the same, while new items appear and others disappear.
Thanks to Perch's import system, I know I can selectively update and delete collection items against each daily import, but adding new items is proving problematic.
For now, the most effective approach I've found is to empty the collection completely and re-import everything each day. That works well enough, but my concern is that the collection item IDs are not reset on deletion, so the ID numbering keeps growing.
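For context, the selective update I'd like to achieve amounts to diffing today's export against yesterday's. Here is a minimal sketch of that idea in Python — the `<event id="…">` structure, element names, and sample data are assumptions for illustration, not the real feed:

```python
import xml.etree.ElementTree as ET

# Hypothetical daily exports; the real feed's structure may differ.
YESTERDAY = """<events>
  <event id="101"><title>Market</title></event>
  <event id="102"><title>Concert</title></event>
</events>"""

TODAY = """<events>
  <event id="102"><title>Concert (rescheduled)</title></event>
  <event id="103"><title>Workshop</title></event>
</events>"""

def index_events(xml_text):
    """Map each event's id attribute to its parsed element."""
    root = ET.fromstring(xml_text)
    return {ev.get("id"): ev for ev in root.iter("event")}

def diff_events(old_xml, new_xml):
    """Return (added, removed, changed) id sets between two daily exports."""
    old, new = index_events(old_xml), index_events(new_xml)
    added = set(new) - set(old)
    removed = set(old) - set(new)
    # Naive change detection by serialized content; a real importer would
    # compare the fields it actually maps into the collection.
    changed = {i for i in set(old) & set(new)
               if ET.tostring(old[i]) != ET.tostring(new[i])}
    return added, removed, changed

added, removed, changed = diff_events(YESTERDAY, TODAY)
print(sorted(added), sorted(removed), sorted(changed))
```

With a diff like this, only the `added` set would need new collection items, `removed` would drive deletions, and `changed` would drive updates — avoiding the full wipe-and-reimport that inflates the IDs.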
The XML file is quite heavy (about 25 MB of daily data on average) and contains hundreds of elements, which quickly produces dizzyingly large ID numbers.
My question: is there a way, within my process, to reset the collection item IDs after deletion?
Any ideas are welcome.