GeForce FX 5200 or Radeon 9200 - 350,000 Polygons / Scene (at 60fps) Textured & Lit. You can roughly double that by dropping the game to 30fps, although doing this in DBP needs some decent programming to keep the game feeling smooth.
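The usual trick for keeping a game feeling smooth at 30fps (or any fluctuating frame rate) is timer-based movement: scale all motion by the frame's elapsed time instead of moving a fixed amount per frame. A minimal sketch of the idea in Python (the names and numbers are illustrative, not DBP commands):

```python
def update_position(position, speed, dt):
    # Scale movement by the frame's elapsed time (dt, in seconds) so an
    # object covers the same world-space distance per second whether the
    # game runs at 30fps, 60fps, or anything in between.
    return position + speed * dt

def simulate(fps, seconds=1.0, speed=100.0):
    # Simulate a fixed frame rate covering the same real-time interval.
    position = 0.0
    dt = 1.0 / fps
    for _ in range(int(seconds * fps)):
        position = update_position(position, speed, dt)
    return position

# Both frame rates cover (almost exactly) the same distance in one second.
print(simulate(30))
print(simulate(60))
```

In DBP the same idea is normally done with the timer command: measure milliseconds between loops and multiply movement values by that delta.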
Also, if you want to use shaders, expect that count to drop quite dramatically; you'll be lucky to push 200,000 Polygons / Scene.
That said, those cards are quite low-end (basically the most entry-level Shader Model 2 / DirectX 9.0 cards you can get).
GeForce 6200 or Radeon X300 (or newer) - 1.2 million Polygons / Scene (at 60fps) Textured & Lit : 620,000 Polygons / Scene Shaded.
These are currently what are considered budget cards, but remarkably the performance difference even up to the current-gen bottom end is roughly the same; you'll only gain about 50,000-100,000 polygons per scene.
Mind you, these specs are for fairly well-optimised DirectX9 engines written in Visual C++ on Windows XP. On Vista you can expect about a 10-15% performance drop, and DBP itself isn't exactly an optimised engine either. I'd say that most of the time you can roughly halve the figures for DBP (yeah, it's that BAAAD).
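Putting those rules of thumb together, you can sketch a rough budget estimator. The multipliers below are just the approximate figures quoted above (halve for shaders based on the two card examples, 10-15% off for Vista, half again for DBP), not measurements:

```python
def polygon_budget(base_60fps, fps=60, shaders=False, vista=False, dbp=False):
    # base_60fps: the card's textured & lit polys/scene at 60fps.
    budget = base_60fps
    if fps <= 30:
        budget *= 2       # dropping 60fps -> 30fps roughly doubles the budget
    if shaders:
        budget *= 0.5     # shaders roughly halve throughput on these cards
    if vista:
        budget *= 0.875   # midpoint of the quoted 10-15% Vista drop
    if dbp:
        budget *= 0.5     # DBP: roughly half of an optimised engine
    return int(budget)

# GeForce FX 5200 baseline: 350,000 polys/scene at 60fps, textured & lit.
print(polygon_budget(350_000))            # 350000
print(polygon_budget(350_000, fps=30))    # 700000
print(polygon_budget(350_000, dbp=True))  # 175000
```

So an FX 5200 running a DBP game with shaders on Vista at 60fps is looking at well under 100,000 polygons per scene, which is why these multipliers matter.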
For those wondering where those stats came from: I recently did a very extensive test with as many cards as I could get my hands on, running on fairly similar spec systems (between AGP & PCI-E) on my DirectX9 "Core" Engine. I have a full list of what each card can do, from the FX-Series up to the latest DirectX 10 cards.
If you think that's an impressive jump in speed for the budget cards, think about this:
FX 5900 Ultra - 4 million Poly/Scene
8800 Ultra - 26 million Poly/Scene
9800 XT - 4.3 million Poly/Scene
HD 2900 XTX - 31 million Poly/Scene (53 million in Crossfire)
That's in four short years since the FX-Series was released. Certainly something to think about, heh.
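To put a number on that jump, the FX 5900 Ultra to 8800 Ultra figures above work out to a 6.5x increase over four years, or roughly 60% more throughput every year. A quick check of the arithmetic:

```python
# Polys/scene figures from the list above.
fx_5900_ultra = 4_000_000
g8800_ultra = 26_000_000

factor = g8800_ultra / fx_5900_ultra   # overall speed-up
annual = factor ** (1 / 4)             # annualised over the 4-year gap

print(f"{factor:.1f}x overall, ~{(annual - 1) * 100:.0f}% per year")
```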