From Uncyclopedia

Geography is a concept whereby Americans believe the world begins in the US and ends at their borders. If by some chance they discover a world outside those borders, the next question is whether it can be invaded or exploited by profit-driven corporations.