It's the end of the world…no, really. When you pick up a dystopian book, get ready to read about catastrophic destruction and a world very different from our own.
Dystopian fiction usually paints a stark, dreary picture of a post-apocalyptic landscape where humans struggle to survive. Dystopian books represent our darkest fears and (perhaps) our secret excitement at the fall of humanity and our descent into a more animalistic nature.
Dystopian books, for all their bleakness, can also contain seeds of hope and great heroes who find a way to achieve happiness even in the worst of times.
So, if you've ever wondered what the world would look like after the breakdown of society, pick up one of our recommended dystopian books and enjoy the end of the world.