This WILL definitely affect your SEO/search ranking and cause pages to drop from the index, so please use it with care.
You can block requests based on the user-agent string if you have the IIS URL Rewrite module installed (if not, go here).
Then add a rule to your web.config like this:
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Request Blocking Rule" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <add input="{HTTP_USER_AGENT}" pattern="msnbot|BingBot" />
        </conditions>
        <action type="CustomResponse" statusCode="403" statusReason="Forbidden: Access is denied." statusDescription="You do not have permission to view this page." />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
This will return a 403 Forbidden response whenever either bot requests a page.
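To sanity-check which user agents the rule would catch, you can evaluate the same `msnbot|BingBot` pattern outside of IIS. This is just a sketch: it assumes the default IIS behaviour of case-insensitive pattern matching, and the user-agent strings below are only illustrative examples.

```python
import re

# Same pattern as in the rewrite rule's <add> condition.
# IIS matches case-insensitively by default, hence re.IGNORECASE.
BLOCKED = re.compile(r"msnbot|BingBot", re.IGNORECASE)

agents = [
    "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
]

for ua in agents:
    # 403 if the condition matches, otherwise the request passes through.
    status = 403 if BLOCKED.search(ua) else 200
    print(ua, "->", status)
```

Here bingbot gets a 403 while Googlebot is untouched, which is what the rule intends.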
UPDATE
Looking at your robots.txt, I think it should be:
# robots.txt
User-agent: *
Disallow: /*.axd
Disallow: /cgi-bin/
Disallow: /member
User-agent: bingbot
Disallow: /
User-agent: ia_archiver
Disallow: /
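You can verify the group-matching logic of a robots.txt like this with Python's standard-library parser. A minimal sketch, assuming your paths look like the ones below; note that `urllib.robotparser` does not implement the `*` wildcard extension, so the `.axd` line is left out here:

```python
from urllib import robotparser

# The rules from the answer, minus the wildcard line,
# which the stdlib parser does not understand.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /member

User-agent: bingbot
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# bingbot is blocked from everything; other bots only from
# the /cgi-bin/ and /member paths.
print(rp.can_fetch("bingbot", "/index.html"))    # False
print(rp.can_fetch("Googlebot", "/index.html"))  # True
print(rp.can_fetch("Googlebot", "/member/profile"))  # False
```

Keep in mind robots.txt is advisory: well-behaved crawlers honor it, but the rewrite rule above is what actually enforces the block.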