On Tuesday 15 November 2016 18:52:01 CET, Zygo Blaxell wrote:
> Like I said, millions of extents per week... 64K is an enormous dedup
> block size, especially if it comes with a 64K alignment constraint as
> well. These are the top ten duplicate block sizes from a sample of
> 95251 dedup ops on a medium-sized production server with 4TB of
> filesystem (about one machine-day of data):
Which software do you use to dedupe your data? I tried duperemove, but it gets killed by the OOM killer because of what looks like a memory leak: https://github.com/markfasheh/duperemove/issues/163
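
For reference, dedup ops like the ones counted above are submitted through the kernel's extent-same interface, i.e. the FIDEDUPERANGE ioctl (the successor to BTRFS_IOC_FILE_EXTENT_SAME), which is, as far as I know, also what duperemove uses under the hood. Below is a minimal sketch of a single such op over one 64K range; the file paths, offsets, and length are illustrative assumptions, not values taken from this thread:

    #define _GNU_SOURCE
    #include <stdio.h>
    #include <stdlib.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/fs.h>

    int main(void)
    {
        /* Hypothetical paths; any two files sharing a duplicate 64K range. */
        int src = open("/mnt/data/a", O_RDONLY);
        int dst = open("/mnt/data/b", O_RDWR);
        if (src < 0 || dst < 0) {
            perror("open");
            return 1;
        }

        /* One destination range; the ioctl argument is variable-length. */
        struct file_dedupe_range *arg =
            calloc(1, sizeof(*arg) + sizeof(struct file_dedupe_range_info));
        if (!arg)
            return 1;
        arg->src_offset = 0;
        arg->src_length = 64 * 1024;   /* the 64K block size discussed above */
        arg->dest_count = 1;
        arg->info[0].dest_fd = dst;
        arg->info[0].dest_offset = 0;

        if (ioctl(src, FIDEDUPERANGE, arg) < 0) {
            perror("FIDEDUPERANGE");
            return 1;
        }

        /* The kernel verifies the ranges really match before sharing extents,
           then reports per-destination status and bytes actually deduplicated. */
        printf("status=%d bytes_deduped=%llu\n",
               arg->info[0].status,
               (unsigned long long)arg->info[0].bytes_deduped);

        free(arg);
        close(src);
        close(dst);
        return 0;
    }

A userspace deduper is essentially a loop around this call: hash candidate blocks, then submit matching ranges and let the kernel do the byte-for-byte comparison before it shares the extents.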
Niccolò Belli
