Take a look at Google's Making AJAX Applications Crawlable. You could work on modifying swfAddress to use #! instead of just #. The catch is that, as I understand it, you'll still need to serve HTML snapshot pages to Google for those URLs.
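If you go that route, here's a rough sketch of the snapshot side, assuming a PHP front controller. Under that scheme, a hashbang URL like `example.com/#!page=about` gets requested by Googlebot as `example.com/?_escaped_fragment_=page=about`; `renderSnapshot()` and `renderFlashPage()` are made-up placeholders for whatever builds your markup from the database:

```php
<?php
// Sketch of the HTML-snapshot side of Google's AJAX crawling scheme.
// A hashbang URL like  example.com/#!page=about  is fetched by Googlebot as
// example.com/?_escaped_fragment_=page=about, so we watch for that parameter.
// renderSnapshot() and renderFlashPage() are hypothetical helpers standing in
// for whatever pulls your content out of MySQL and builds the page.

if (isset($_GET['_escaped_fragment_'])) {
    // Googlebot: serve plain HTML built from the same content the SWF shows.
    $state = $_GET['_escaped_fragment_'];   // e.g. "page=about"
    echo renderSnapshot($state);
} else {
    // Normal visitors: serve the page that embeds the Flash movie.
    echo renderFlashPage();
}
```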
Another approach that works if you're using PHP/MySQL or something like it (and works great for making the content searchable and indexable) is to use a FlashVar on a PHP page. That way, if the content shows up in a search result and a visitor clicks through to the PHP page (and they have the Flash plugin), the FlashVar/SWFObject combination triggers the display of the Flash content. If no plugin is available, they see the same content served straight from PHP/MySQL.
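Roughly what I mean, as a sketch rather than drop-in code (I'm assuming SWFObject 2.x, and `getArticle()` is a made-up stand-in for your MySQL lookup):

```php
<?php
// Sketch of the FlashVar / SWFObject fallback approach.
// getArticle() is a hypothetical helper that fetches the record from MySQL.
$id      = isset($_GET['id']) ? (int) $_GET['id'] : 1;
$article = getArticle($id);   // assumed to return array('title' => ..., 'body' => ...)
?>
<!DOCTYPE html>
<html>
<head>
  <script src="swfobject.js"></script>
  <script>
    // Pass the record id into the SWF as a FlashVar. If the visitor has the
    // Flash plugin, SWFObject replaces #content with the movie; if not, the
    // HTML below stays in place, so Googlebot and plugin-less visitors still
    // get the indexable text.
    var flashvars = { articleId: "<?php echo $id; ?>" };
    swfobject.embedSWF("site.swf", "content", "800", "600", "9.0.0",
                       false, flashvars, {}, {});
  </script>
</head>
<body>
  <div id="content">
    <h1><?php echo htmlspecialchars($article['title']); ?></h1>
    <div><?php echo $article['body']; ?></div>
  </div>
</body>
</html>
```

Because SWFObject only swaps out the target div when a suitable plugin is found, the same URL serves indexable HTML to crawlers and the Flash version to everyone else.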
One thing that really helped me understand how Google sees web pages was using Fetch as Googlebot in Webmaster Tools. I imagine that if you fed it one of the hash-fragment swfAddress URLs, it wouldn't see anything specific to that content.
A sitemap is good, but I'm not sure it solves this on its own. What happens when Googlebot hits those URLs? Is there actually any content there to index?