Yes, it's pretty horrible to hope for the death of someone just because they believe something different from you.
I don't really hope for her death, but as I see it that's pretty much the only way she'd stop believing what she believes (ignoring the fact that afterlives exist). The understanding of tolerance I've developed from my aggregate life experience is that it
doesn't actually exist. "Tolerant" people aren't actually tolerant; they just don't care that people hold differing worldviews as much as "intolerant" people do. It's not so much that they
accept the other worldview as valid. They still think the other person is 'wrong' in their head, they just don't care strongly enough to make an issue of it.
Additionally, as I see it, once someone develops a worldview it is more-or-less inviolable. Rationality, argument, etc. are all basically useless for changing even
minor beliefs, and there's some good evidence that arguing actually
makes the held beliefs stronger. However, it's not
impossible for someone's mind to change; the human brain isn't adamantite. Value drift can occur, and can even be encouraged/forced via some methods.
Unfortunately, these methods are either extremely slow or considered morally despicable. On the slow side, there are things like immersing someone in an environment that encourages or requires different beliefs, necessitating value drift just to stay comfortable or even survive. On the morally despicable side, there are things like OL's branding or assimilation (various forms of 'mind rape'), or using physical/mental trauma (e.g.
Stockholm syndrome). And
none of these methods are things that actually should
cause changes in belief; this isn't changing someone's mind through rational debate, it's exploiting glitches, design flaws, or limitations in the human brain to edit someone's utility function.
This, unfortunately, means you really don't have a lot of options in this scenario. Generally the best option would be exactly what OL and company did: use a monopoly on force to make the person in question accept a solution that is, to them, the "least bad" option. It doesn't actually change her mind; it just leaves her no choice but to do what you want, and also leaves her vulnerable to the 'slow' methods of forced value drift that most people find morally acceptable. Unfortunately, given how strong her beliefs are, that might not actually
work.
If that's the case, you could just keep using your monopoly on force to compel her to continue accepting a solution
you find acceptable. But if that isn't a maintainable strategy (though it probably
is), then you're left with a choice: kill her, condemn thousands of people to death and starvation, or use a morally despicable method of forced value drift on her, like what OL did to Mammon.