Pages that attract more traffic and links tend to rank better in search results. So if a search engine finds the exact same content at two addresses in response to a query, it may not distinguish between them: it can rank both similarly and split the traffic between the two links, diluting each. The result is fewer views and poorer rankings for your content.
Sometimes, search engines can detect this and group all the URLs pointing to the same content into a single cluster. But it doesn't always happen.
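To make the clustering idea concrete, here is a minimal sketch of one way duplicate URLs could be grouped: hash a normalized version of each page's body and cluster URLs that share a fingerprint. The URLs and page bodies are hypothetical, and real search engines use far more sophisticated similarity signals.

```python
import hashlib
from collections import defaultdict

def content_fingerprint(body: str) -> str:
    """Hash a normalized page body so trivial whitespace or
    case differences don't hide duplicates."""
    normalized = " ".join(body.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical pages: two URLs serve the same article.
pages = {
    "https://example.com/post":        "Duplicate content hurts SEO.",
    "https://example.com/post?ref=tw": "Duplicate  Content hurts SEO.",
    "https://example.com/other":       "A different article entirely.",
}

# Group URLs by content fingerprint.
clusters = defaultdict(list)
for url, body in pages.items():
    clusters[content_fingerprint(body)].append(url)

duplicate_groups = [urls for urls in clusters.values() if len(urls) > 1]
print(duplicate_groups)
```

When the engine fails to form such a cluster, each URL competes on its own, which is exactly the dilution described above.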
Furthermore, you should understand that search engines do not actually search the internet at the moment they receive a query. Instead, they do the work beforehand, using programs called bots to crawl websites and build an index. Your search results come from that index, not from the live web. Duplicate content can slow a search engine's indexing of the website hosting it, which in turn means a slower recrawling rate, a crucial factor for content that gets updated from time to time. Duplicate content therefore makes it less likely that your visitors get the latest version from you.
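The crawl-then-query flow above can be sketched with a toy inverted index. This is only an illustrative model with made-up documents, not how any real search engine is implemented: the point is that queries are answered from a prebuilt index, so an index that lags behind the site serves stale results.

```python
from collections import defaultdict

# Hypothetical crawled documents.
docs = {
    "page1": "fresh news updated daily",
    "page2": "archived news from last year",
}

# "Crawl" phase: build the inverted index before any query arrives.
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.lower().split():
        index[word].add(doc_id)

def search(query: str) -> set:
    """Query phase: answer from the index, not the live pages."""
    terms = query.lower().split()
    results = index[terms[0]].copy()
    for term in terms[1:]:
        results &= index[term]  # keep docs containing every term
    return results

print(sorted(search("news")))   # ['page1', 'page2']
print(sorted(search("fresh news")))  # ['page1']
```

If "page1" changes but the bot has not recrawled it, `search` keeps returning results based on the old text, which is why a slower recrawling rate matters for frequently updated content.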
Finally, search engines may penalize a website if they perceive its duplicate content as SEO spam. Simply put, a search engine might remove from its index a page or an entire website that violates its guidelines. In the specific case of Google, the most widely used search engine in the world, this means you should avoid creating multiple pages hosting duplicate content and avoid overreliance on content scraping.