I'm trying to use suck to download a set of newsgroups from my ISP's server in the group/article format. I've been having a problem with the article numbering:

I run:
%suck *newsserver* -m -dm articles -dd data -dt temp -AL active
%lmove -d articles

the lmove-config has:
BASE=news/
ACTIVE=active

If I run this a second time, suck downloads the exact SAME articles again and puts them into the articles directory; lmove then moves them, but assigns new article numbers. So you end up with a bunch of dupes in news/ every time you run it.
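A quick way to confirm it really is duplicate content (and not just renumbering) is to hash every article file under news/ and list any hash that occurs more than once. This is just a diagnostic sketch; the news/ path matches the BASE setting above:

```shell
# Print every article file whose contents duplicate an earlier one.
# Relies only on md5sum, sort, and awk; assumes BASE=news/ as configured.
find news/ -type f -exec md5sum {} + | sort | awk '
  { if ($1 == prev) print; prev = $1 }'
```

Every line this prints is a second (or later) copy of some article, which is exactly what I'm seeing after the second run.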

Is there any way I can stop suck from downloading articles it's already successfully downloaded?