That depends on how the databases are architected and tiered.
If they're proper replicas of one another (e.g., master-slave replication that propagates deletes), then yes.
If, as is commonly the case especially for marketing data, periodic cuts or dumps of the data are made at various points in time, and there's no mechanism for propagating deletions throughout the chain, then no, you're not assured of deletion. This isn't likely to be the case for a site's primary database, but could very well be the case for derived datasets. I can think of instances with, say, credit bureau reports in which erroneous data must be repeatedly deleted because it keeps getting re-injected into the system.
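A minimal sketch of that failure mode, with hypothetical data: a record is deleted from the primary store, but a derived dataset built from an earlier dump still holds it, and a naive merge re-injects it.

```python
# Hypothetical example: deletion doesn't propagate to a derived dataset.
primary = {"u1": "alice@example.com", "u2": "bob@example.com"}

# A periodic cut taken for marketing *before* the deletion:
marketing_dump = dict(primary)

# User u1 requests deletion; the primary honors it:
del primary["u1"]

# The derived dataset still holds the record...
assert "u1" in marketing_dump

# ...and a naive consolidation step re-injects it upstream:
merged = {**primary, **marketing_dump}
assert "u1" in merged  # the "deleted" record is back
```

Without a tombstone or some other explicit delete signal flowing down the chain, every downstream copy is a potential source of re-injection.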
Facebook's September 2010 outage, in which cached data kept being re-injected into the system, exhibited a similar cache-coherence problem: http://www.facebook.com/note.php?note_id=431441338919
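The same re-injection loop can be sketched in miniature (this is an illustrative toy, not Facebook's actual architecture): the database is repaired, but a stale cache entry keeps winning, and a reconciliation job that trusts the cache pushes the bad value back.

```python
# Hypothetical sketch of a cache re-injecting corrected data.
db = {"config": "good"}
cache = {"config": "bad"}  # stale entry left over from before the fix

def read(key):
    # Read-through cache: serve from cache when present.
    if key in cache:
        return cache[key]
    value = db[key]
    cache[key] = value
    return value

def sync_cache_to_db():
    # A naive reconciliation job that treats the cache as authoritative.
    db.update(cache)

# The database was fixed, but reads still see the stale value...
assert read("config") == "bad"
# ...and reconciliation re-injects the bad value into the database:
sync_cache_to_db()
assert db["config"] == "bad"
```

Until the stale entry is invalidated explicitly, the "fixed" state never sticks.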