By default, all staging sites created at Pressable serve a hidden robots.txt file that prevents search engine indexing. This is generally a good thing, as you would not want clones of your live site appearing in search results. Once a live domain has been added to the site, the robots.txt file changes to allow indexing.
If you need to override this behavior, you can do so by uploading your own custom robots.txt file to the root of your site via SFTP. When a custom robots.txt file exists, it takes precedence over the system-generated one.
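For example, if you want to keep a site out of search results even after a live domain has been added, you could upload a custom robots.txt like the following. This is a minimal sketch of a standard robots.txt directive set, not a Pressable-specific format:

```
# Block all well-behaved crawlers from indexing any page on this site
User-agent: *
Disallow: /
```

Conversely, a file containing `Disallow:` with an empty value (or `Allow: /`) would permit indexing. Note that robots.txt is advisory: compliant crawlers honor it, but it does not enforce access control.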