Problem with big .csv of videos for deletion

Posted: Fri Nov 10, 2017 10:31 am
by filthlab
Hi,
Pornhub has just one CSV with all of their deleted URLs in it. At the moment there are over 5M lines in it.
There is no other option to use a smaller feed for deleted videos.
So I've made a set with this big feed, and I run it once a month.
But lately, when this set runs, it takes all the CPU resources and the site effectively goes down.
Any ideas on how to avoid this?

Re: Problem with big .csv of videos for deletion

Posted: Fri Nov 10, 2017 11:31 am
by admin
First of all, check twice - I don't think there's only one file.

Second, make a script like:

wget -O big_file.csv http://big_file.csv   # download the full feed to a known filename
tail -n 500 big_file.csv > your_file.csv   # keep only the newest 500 lines

That will put the last 500 lines into your_file.csv, which you can then add as a regular import set URL.
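
For reference, a minimal sketch of that approach as a standalone script you could schedule - FEED_URL, the filenames, and the 500-line window are placeholders, not the real feed details:

#!/bin/sh
# Sketch only - adjust FEED_URL and N for the actual feed.
FEED_URL="http://big_file.csv"   # replace with the real deleted-videos feed URL
N=500                            # how many of the newest lines to keep

wget -q -O big_file.csv "$FEED_URL" || exit 1   # fetch the full feed
tail -n "$N" big_file.csv > your_file.csv       # keep only the last N lines

Scheduled with a cron entry such as 0 3 1 * * /path/to/fetch_deleted.sh, that would refresh your_file.csv once a month before the import set runs, so the set only ever processes the newest entries instead of the full 5M-line file.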