Hello Community
We are running a pilot, pushing around 50 TB of data over to MCSS. We have noticed that as a job progresses, the multiple streams of different sizes present at the beginning dwindle to fewer and fewer streams. Is there a way to ensure that the data being pushed to MCSS is distributed equally among multiple streams, so that we get consistent performance from the job until the end?
Best answer by Jordan