I receive a lot of review books, but I have never once lied about a book just because I got a free copy of it. However, some authors seem to feel that if they send you a free copy of their book, you owe them a positive review.
Do you think reviewers are obligated to post a positive review of a book even if they don’t like it? Have we reached the point where reviewers *need* to post disclaimers to (hopefully) protect themselves from being harassed by authors who are unhappy about negative reviews?
Because I have just started blogging, everything that I write on this subject is hypothetical.
That being said, I think it is important for book reviewers to be honest. But authors also have to realize that not everyone is going to like their book. That is the risk they take. When I review books directly from the author (or publisher), I will be honest and say so when I do not like a book. I will treat that book no differently than any of the other books that I read.