Robots.txt is pretty straightforward. Bots are supposed to load
http://yoursite.com/robots.txt to see if they are allowed to crawl your site.
Below is the usual setup for robots.txt. The idea is to first allow everything, then disallow specific user agents:
User-agent: *
Allow: /

User-agent: BadBot1
Disallow: /

User-agent: BadBot2
Disallow: /
The good bots follow this convention. Bad bots don’t.
To ban bad bots, you have to hope they present a unique IP address, User-Agent string, request method, or some other identifier that you can block in .htaccess using a RewriteRule.
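For example, here is a minimal .htaccess sketch that returns 403 Forbidden to any request whose User-Agent matches a blocklist. The bot names are placeholders; swap in whatever strings you actually see in your access logs:

RewriteEngine On
# Match offending User-Agent strings, case-insensitive (names are examples)
RewriteCond %{HTTP_USER_AGENT} (BadBot1|BadBot2) [NC]
# Return 403 Forbidden and stop processing further rules
RewriteRule .* - [F,L]

If the bot is easier to identify by address or method, the same pattern works with a RewriteCond against %{REMOTE_ADDR} or %{REQUEST_METHOD} instead.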