All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps remain unchanged, so all convolutions in a dense block use stride 1. Pooling layers are inserted between dense blocks to downsample the feature maps.
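The structure above can be sketched in PyTorch. This is an illustrative implementation, not the author's exact code: the names `conv_block`, `DenseBlock`, `transition`, and `growth_rate` are my own, and the 3x3 kernel with padding 1 is one standard choice that keeps height and width unchanged so that channel-wise concatenation is valid.

```python
import torch
import torch.nn as nn


def conv_block(in_channels, growth_rate):
    # BN -> ReLU -> 3x3 conv; stride 1 and padding 1 leave height/width unchanged,
    # which is what makes the later channel-wise concatenation possible
    return nn.Sequential(
        nn.BatchNorm2d(in_channels),
        nn.ReLU(),
        nn.Conv2d(in_channels, growth_rate, kernel_size=3, stride=1, padding=1),
    )


class DenseBlock(nn.Module):
    def __init__(self, num_convs, in_channels, growth_rate):
        super().__init__()
        # each conv block sees the channels of the input plus all earlier outputs
        self.blocks = nn.ModuleList(
            conv_block(in_channels + i * growth_rate, growth_rate)
            for i in range(num_convs)
        )

    def forward(self, x):
        for blk in self.blocks:
            y = blk(x)
            # channel-wise concatenation along dim 1 (requires matching H and W)
            x = torch.cat([x, y], dim=1)
        return x


def transition(in_channels, out_channels):
    # inserted between dense blocks: 1x1 conv shrinks the channel count,
    # 2x2 average pooling halves the height and width
    return nn.Sequential(
        nn.BatchNorm2d(in_channels),
        nn.ReLU(),
        nn.Conv2d(in_channels, out_channels, kernel_size=1),
        nn.AvgPool2d(kernel_size=2, stride=2),
    )
```

For example, a `DenseBlock(2, 3, 10)` applied to a `(1, 3, 8, 8)` input yields `(1, 23, 8, 8)` (3 + 2 x 10 channels, spatial size preserved), and a following `transition(23, 10)` produces `(1, 10, 4, 4)`.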