I spotted an interesting comment from John Mueller of Google about URLs and reducing duplication. He said focusing on individual URLs here and there won't help you much; it is more about looking at where you can "reduce duplication by a factor of 10x." The thread is on Reddit, where he added that you should not focus on the "individual posts here and there" but rather look for this at scale.
He said, for example, "if you have 100k products and they all have 50 URLs each, changing that from 5M URLs to 500k URLs (5 URLs each) would be worth the effort." How does one product page end up with 50 URLs? Well, besides tracking parameters, there can be referral parameters, added product filters, and even bugs in your code that generate these URLs. This is where technical SEOs shine, reducing this kind of duplication at scale.
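To make the duplication concrete, here is a minimal Python sketch of the kind of URL normalization a technical SEO might map out before handing it to developers. The parameter names (utm_source, ref, sessionid) and the example.com URLs are purely hypothetical; your own site's noise parameters will differ.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that create duplicate URLs on this imaginary
# site -- tracking tags, referral IDs and session filters.
STRIP_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref", "sessionid"}

def canonicalize(url: str) -> str:
    """Collapse a URL variant to its canonical form by dropping known
    tracking/referral parameters and sorting whatever remains."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = sorted(
        (k, v) for k, v in parse_qsl(query, keep_blank_values=True)
        if k not in STRIP_PARAMS
    )
    return urlunsplit((scheme, netloc.lower(), path.rstrip("/") or "/", urlencode(kept), ""))

# Example: three "different" URLs for the same product collapse to one.
variants = [
    "https://example.com/product/123?utm_source=news&ref=email",
    "https://example.com/product/123/?sessionid=abc",
    "https://example.com/product/123",
]
print({canonicalize(u) for u in variants})
# -> {'https://example.com/product/123'}
```

Running a script like this over your server logs or a crawl export is one way to see how many URL variants actually map back to each product page.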
John added "that's usually also a clear technical thing, not something which depends on handwavy opinions."
Making these types of changes, where you go from 5 million URLs in Google's index to 500,000 URLs in Google's index, can make a huge difference for your site in Google Search. It is not like you are missing out on 4.5 million pages, because all of those pages are just duplicates of the 500,000 URLs. It makes things cleaner and more consistent for Google, and it helps consolidate signals to the primary product or category page URL.
So when you find these URL duplication issues, talk to your development team about how you can serve just the canonical URL to users and Google. There are other ways to track internal referrers, and there are probably glitches in how these URLs are dynamically generated that can be cleaned up.
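One way a development team might serve only the canonical URL is with a redirect at the application layer. Here is a minimal WSGI-style sketch in Python; the parameter names are the same hypothetical ones as above, and the response bodies are placeholders, not a real implementation.

```python
from urllib.parse import parse_qsl, urlencode

# Hypothetical parameters this imaginary site treats as duplication noise.
STRIP_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref", "sessionid"}

def app(environ, start_response):
    """WSGI sketch: 301-redirect any request carrying tracking or referral
    parameters to the canonical URL, so users and Googlebot only ever see
    one URL per page."""
    path = environ.get("PATH_INFO", "/")
    params = parse_qsl(environ.get("QUERY_STRING", ""), keep_blank_values=True)
    kept = [(k, v) for k, v in params if k not in STRIP_PARAMS]

    if kept != params:
        # Noise parameters were present: redirect to the cleaned URL.
        location = path + ("?" + urlencode(kept) if kept else "")
        start_response("301 Moved Permanently", [("Location", location)])
        return [b""]

    # Already canonical: serve the page as usual (placeholder body here).
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8")])
    return [b"<html>product page</html>"]
```

Whether you redirect, rewrite links, or lean on rel=canonical will depend on your stack; the point is to pick one URL per page and serve it consistently.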
Forum discussion at Reddit.