@alexjames212 - you are correct. According to robotstxt.org, that is the correct code to allow Googlebot while excluding every other bot that obeys robots.txt directives. (Note: not all bots pay attention to robots.txt, so it's not a guaranteed way to keep bots off your site.)
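For anyone reading along without the earlier post, the pattern being discussed typically looks like the following. I'm reconstructing it from the standard convention documented at robotstxt.org rather than quoting the original, so treat it as illustrative:

    # Allow Googlebot everywhere (an empty Disallow permits all paths)
    User-agent: Googlebot
    Disallow:

    # Block every other crawler that honors robots.txt
    User-agent: *
    Disallow: /

The order matters in practice: most well-behaved crawlers use the most specific User-agent group that matches them, so Googlebot follows its own section while everything else falls through to the wildcard.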
Which raises the question: why would someone do that? "Bad" bots are extremely unlikely to obey robots.txt in the first place, so they'll go ahead and crawl your site anyway. Generally, it's only the legitimate "good" bots that pay attention to your robots.txt file. You can get valuable traffic from "good" bots like Bingbot and other search engine crawlers, unless you've forbidden them from crawling your site (which is exactly what this code does).
Just seems to me that it's like shooting yourself in the foot...