Somehow, years ago, people got the idea that it's "better" to be spidered more often. The fact is, unless you have time-sensitive content and/or content that changes frequently, it doesn't really matter how often you get spidered.
Frequency of spidering has nothing to do with how well you rank. Googlebot could visit one of your pages every 10 seconds, but the frequency of those visits won't make that page rank any higher (or lower) than the one it visits every two or three months.
The ranking algorithm takes into consideration a lot of factors (anywhere from 100 to over 200, depending on who you ask), but I'm pretty sure "how often the page is accessed by Googlebot" is not one of those criteria.
A sitemap might be useful if there's some technical impediment that's keeping the spiders from identifying links on your pages, or from retrieving the pages to which those links point -- but only if you can't, for whatever reason, fix that technical problem. The best solution is always to fix the problem itself; the sitemap is simply a "last resort" in those cases.
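For reference, the XML kind of sitemap is nothing magical -- it's just a plain list of URLs in the sitemaps.org format that the engines can fetch. A minimal sketch (the domain and page names here are placeholders, not anyone's real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the spiders to know about -->
  <url>
    <loc>http://www.example.com/some-buried-page.html</loc>
    <!-- Optional: date the page last changed -->
    <lastmod>2009-06-15</lastmod>
  </url>
</urlset>
```

Note that nothing in there carries any link weight or "importance" signal -- it just tells the spider the URL exists, which is exactly why it can't do anything for rankings.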
If you have deeply buried interior pages of your site that aren't getting indexed, that's an indication the SEs don't think those pages are "important" enough to spend resources retrieving. Simply getting those pages spidered through a sitemap isn't going to help them rank well, though.
If you have interior pages that are already being spidered but don't rank well, a sitemap isn't going to help much there, either. An XML sitemap won't do anything for a page's inbound link weight. A "regular" sitemap (the kind that real people sometimes use) does count as one more link pointing to those pages, but it's only one link -- and one from a page that typically has a lot of outbound links on it. So the "weight" it adds to your interior pages will likely be slight -- not usually enough to make a noticeable difference in the rankings.
The solution to both problems is to get better links pointing at those "buried" pages. It may mean fixing your site navigation to bring your pages closer to the "top." It may mean spending some time/energy on building a few deep links. If you can do both, even better.
BTW, relevance has nothing to do with spidering or indexing. Relevance is a search-term-specific ranking concept. The spiders are there simply to retrieve the code of the page and return it to the datacenter.
Googlebot makes no judgment or valuation of relevance. That's done at a much later stage of the process (i.e., in the seconds between when you enter your search query and when the SERP displays on your screen) by the ranking algorithm.