The robotstxt plugin dynamically generates the robots.txt file served to robots that request it. It lists paths that robots should not crawl because access to those pages is forbidden anyway. For example, all pages under the "/admin" path are forbidden unless you are logged in as a Snap! Websites administrator.
Other plugins may add forbidden paths or other entries. For example, the sitemapxml plugin adds the path to the sitemap.xml file, which robots can download to quickly find all (or at least most) of the public pages available on the website.
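As an illustration, the generated robots.txt might look something like the following (the exact paths and sitemap URL depend on your website and the plugins installed; the domain shown here is a placeholder):

```
User-agent: *
Disallow: /admin

Sitemap: https://www.example.com/sitemap.xml
```

The `Disallow` lines come from the robotstxt plugin and any plugin that registers forbidden paths, while the `Sitemap` line is contributed by the sitemapxml plugin.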
There are currently no settings available, although we may later add a way for you to add forbidden paths.