Here are a few things you can do to prevent crawling/scraping:
- You can do some basic HTTP header validations (see the sketch after this list)
- You can use some 3rd-party tools
- You can serve JS-rendered/dynamic content, which adds a layer of difficulty
- You can use things like logins to restrict access to certain areas
- You can use a robots.txt file to control search crawlers (a serving sketch also follows the list)
- You can also decorate your hyperlinks with the rel="nofollow" attribute
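A minimal sketch of the header-validation idea, assuming a Flask app; the `SUSPICIOUS_AGENTS` list and the specific checks are illustrative assumptions you would tune to your own traffic:

```python
from flask import Flask, abort, request

app = Flask(__name__)

# Hypothetical list of User-Agent substrings to reject; tune for your own traffic.
SUSPICIOUS_AGENTS = ("curl", "wget", "python-requests", "scrapy")

@app.before_request
def validate_headers():
    user_agent = request.headers.get("User-Agent", "").lower()
    # Reject requests that send no User-Agent or one matching a known scraper.
    if not user_agent or any(bot in user_agent for bot in SUSPICIOUS_AGENTS):
        abort(403)
    # Real browsers send an Accept header; very naive scripts often omit it.
    if "Accept" not in request.headers:
        abort(403)
```

Keep in mind headers are trivial to spoof, so treat this as a first filter rather than real protection.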
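And a small sketch of serving robots.txt directives, again assuming Flask; the /private/ path is a placeholder:

```python
from flask import Flask, Response

app = Flask(__name__)

@app.route("/robots.txt")
def robots():
    # Standard robots.txt directives: ask well-behaved crawlers to stay out of /private/.
    directives = "User-agent: *\nDisallow: /private/\n"
    return Response(directives, mimetype="text/plain")
```

Note that robots.txt is purely advisory; hostile scrapers can and do ignore it.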
For SQL injection protection:
- You can add levels of abstraction between your web application and the DB (an n-tiered application) so the web application never talks to the database directly
- Properly sanitize, encode, and handle all user input, and prefer parameterized queries (see the sketch after this list)
- Do not rely solely on your own validation and sanitization; use libraries and tools put together by dedicated dev teams
- Use unit testing to make sure your application can handle all types of input, including malicious payloads, and fails safe
- Ensure you are not surfacing verbose error messages directly from the database (see the error-handling sketch below)
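A minimal sketch of a parameterized query plus a fail-safe unit test, using Python's built-in sqlite3 and unittest modules; the users table and the find_user_by_email helper are hypothetical:

```python
import sqlite3
import unittest


def find_user_by_email(conn, email):
    # Parameterized query: the driver treats email strictly as data,
    # so it never gets spliced into the SQL text.
    cur = conn.execute("SELECT id, name FROM users WHERE email = ?", (email,))
    return cur.fetchall()


class InjectionInputTest(unittest.TestCase):
    def setUp(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
        self.conn.execute("INSERT INTO users (name, email) VALUES ('Alice', 'alice@example.com')")

    def test_classic_injection_payload_returns_nothing(self):
        # This payload would dump the whole table if the query were built by string concatenation.
        self.assertEqual(find_user_by_email(self.conn, "' OR '1'='1"), [])


if __name__ == "__main__":
    unittest.main()
```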
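And a sketch of keeping database errors out of responses, assuming a Flask route over the same hypothetical users table: log the full exception server-side and return only a generic message to the client.

```python
import logging
import sqlite3

from flask import Flask, jsonify

app = Flask(__name__)
logger = logging.getLogger(__name__)

@app.route("/users/<int:user_id>")
def get_user(user_id):
    try:
        conn = sqlite3.connect("app.db")
        row = conn.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
        return jsonify({"name": row[0]}) if row else (jsonify({"error": "not found"}), 404)
    except sqlite3.Error:
        # Full details go to the server log; the client only ever sees a generic message.
        logger.exception("database error while fetching user %s", user_id)
        return jsonify({"error": "internal error"}), 500
```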