60% of the internet is duplicate content
At the Google Search Central Live event in Singapore, Google engineer Gary Illyes said that roughly 60 percent of the internet is duplicate content. Because he gave no further detail, the figure is often taken to mean pages that simply copy one another, but the talk itself was more focused on technical optimization.
Looking more closely at the presentation, it becomes clear that mistakes in search engine optimization cause serious indexing problems. Serving pages over plain HTTP instead of HTTPS, offering both www and non-www versions, URLs carrying unnecessary parameters such as sessionID, and variants with and without a trailing slash all register the same content multiple times and put a heavy load on the index. Illyes advised attendees to get rid of these duplicates.
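To make the problem concrete, here is a minimal sketch, not taken from Illyes' talk, of how such duplicate URL variants can be collapsed into one canonical form. The parameter names (sessionid, utm_*) and the choice to strip "www." and trailing slashes are assumptions for illustration; a real site would pick its own rules and enforce them with redirects or canonical tags.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that only track state and do not change the page content
# (assumed examples; adjust to the parameters your own site actually uses).
TRACKING_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "utm_campaign"}

def canonicalize(url: str) -> str:
    """Collapse common duplicate URL variants into a single canonical form."""
    scheme, netloc, path, query, _fragment = urlsplit(url)

    # Force HTTPS instead of plain HTTP.
    scheme = "https"

    # Drop the "www." prefix so www and non-www variants match.
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]

    # Remove session IDs and other tracking parameters.
    params = [(k, v) for k, v in parse_qsl(query) if k.lower() not in TRACKING_PARAMS]
    query = urlencode(params)

    # Normalize the trailing slash (here: always strip it, except for the root).
    if path.endswith("/") and path != "/":
        path = path.rstrip("/")

    return urlunsplit((scheme, netloc, path, query, ""))

print(canonicalize("http://www.example.com/page/?sessionid=abc123"))
# -> https://example.com/page
```

In practice, once a canonical form is chosen, the other variants should permanently redirect to it (HTTP 301) or declare it with a rel="canonical" link so crawlers index only one copy.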
Duplicate content is already one of the web's biggest problems, and when these optimization mistakes are added on top, it becomes harder for users to reach the information they are looking for directly. That is a blow to the open spirit of the internet.