By default, the robots.txt file for staging sites created at Pressable is set to prevent indexing by search engines. This is generally desirable, as you would not want clones of your live site appearing in search results. Once a live domain has been added to the site, the robots.txt file changes to allow indexing.
If you need to override this behavior for some reason, you can do so by uploading your own custom robots.txt file to the root of your site via SFTP. When a custom robots.txt file exists, it takes precedence over the system-generated one.
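As an illustration, a custom robots.txt that keeps the entire site hidden from crawlers uses the standard robots.txt directives shown below. This is generic robots.txt syntax, not Pressable-specific; adjust the rules to suit your site.

```
# Block all well-behaved crawlers from the entire site
User-agent: *
Disallow: /
```

Conversely, an empty Disallow line (`Disallow:` with no path) permits crawling of the whole site. After uploading the file via SFTP, you can confirm it is being served by visiting https://yourdomain.com/robots.txt in a browser.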