Problem with big .csv of videos for deletion

Postby filthlab on Fri Nov 10, 2017 10:31 am

Hi,
Pornhub has just one CSV with all of their deleted URLs in it. At the moment there are more than 5M lines in it.
There is no option to use a smaller feed for deleted videos.
So I've made a set with this big feed and I'm running it once per month.
But lately, when this set is running, it takes all the CPU resources and the site actually goes down.
Any ideas how to avoid this?
filthlab
 
Posts: 118
Joined: Tue May 30, 2017 6:49 am

Re: Problem with big .csv of videos for deletion

Postby admin on Fri Nov 10, 2017 11:31 am

First of all, check twice - I don't think there's only one file.

2. Make a script like this:

# download the full deleted-videos feed (placeholder URL)
wget http://big_file.csv
# keep only the newest 500 lines for the import set
tail -n 500 big_file.csv > your_file.csv

That will put the last 500 lines into your_file.csv, which you can then add as a regular import set URL.
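
If even the tail approach turns out too heavy, a variation is to keep last month's copy of the feed and import only the lines that were added since. This is just a rough sketch, not something from the original thread; the feed URL and the paths are placeholder assumptions you'd have to adjust:

#!/bin/bash
# Sketch of a monthly cron job: download the feed, diff it against the
# previous copy, and write only the new lines into the import file.
# FEED_URL and WORK_DIR are assumptions - change them to match your setup.
FEED_URL="http://big_file.csv"
WORK_DIR="/var/feeds/deleted"
NEW="$WORK_DIR/deleted_new.csv"
OLD="$WORK_DIR/deleted_prev.csv"
OUT="$WORK_DIR/your_file.csv"

mkdir -p "$WORK_DIR"
wget -q -O "$NEW" "$FEED_URL" || exit 1

if [ -f "$OLD" ]; then
    # comm -13 prints lines that appear only in the new file;
    # it needs sorted input, so both copies are sorted on the fly
    comm -13 <(sort "$OLD") <(sort "$NEW") > "$OUT"
else
    # first run: no previous copy yet, fall back to the last 500 lines
    tail -n 500 "$NEW" > "$OUT"
fi

mv "$NEW" "$OLD"

Run from cron once a month, this keeps your_file.csv down to only the URLs that are new since the last run, so the import set stays small no matter how big the full feed grows.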
Have you done a script update?
admin
Site Admin
 
Posts: 23901
Joined: Wed Sep 10, 2008 11:43 am

