Not necessarily. Remember that field strength (more precisely, power density) falls off as the inverse square of the distance from the transmitter.
Suppose you need a certain minimum field strength for the receivers to work, and suppose it turns out that there is also a maximum safe field strength.
With a system based on a small number of powerful transmitters covering an area, you can usually place those transmitters in somewhat out-of-the-way places that few people get near. You will have some variation in field strength as you move around within the service area, but not a lot of it.
If I'm 1000 m from the transmitter and move 1 m closer, the field gets 0.2% stronger.
With a system based on a large number of weaker transmitters, those transmitters need to be closer to where the receivers are to maintain the same minimum field strength throughout the same area. But with people much closer to the transmitters, there will be a lot more variation in field strength as they move around, so it would be harder to make sure that every place where people commonly go stays under your maximum.
If I'm 100 m from the transmitter and move 1 m closer, the field gets 2% stronger.
If I'm 10 m from the transmitter and move 1 m closer, the field gets 23% stronger.
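A quick way to check those percentages is to run the inverse-square ratio directly. A minimal sketch in Python; the distances are just the ones from the examples above (the 10 m case comes out closer to 23.5% exactly):

    # Relative increase in power density (inverse-square law)
    # when moving 1 m closer to the transmitter.
    for d in (1000.0, 100.0, 10.0):
        # Power density scales as 1/r^2, so moving 1 m closer
        # multiplies it by (d / (d - 1))^2.
        increase = (d / (d - 1.0)) ** 2 - 1.0
        print(f"{d:6.0f} m -> {increase:.1%} stronger")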
There are a few problems with this argument though. It's technically correct, but:
1. All mobile protocols reduce transmit power to the lowest level that still reaches the device. The field strength isn't some constant quantity. 5G is all about massive beamforming and super accurate power management. You're being targeted with an individual beam whose power is reduced to the minimum needed to actually maintain radio contact (see the sketch after this list). So moving closer doesn't imply a stronger 'field'.
2. Even with 4G it's totally common to have lots of base stations in the middle of cities, on rooftops, etc. It's not like they're all in the middle of empty fields today; far from it.
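To see what point 1 means in practice, here's a toy model of ideal closed-loop power control under free-space inverse-square propagation. The target level and distances are invented for illustration; real links also involve antenna gains, fading margins, and so on:

    # Toy model: the base station scales its transmit power so the
    # power density arriving at the handset is constant, no matter
    # how far away the handset is. All numbers are arbitrary units.
    TARGET = 1.0e-6  # desired power density at the handset

    def tx_power_for(distance_m):
        # Density ~ P_tx / d^2 (gain constants ignored), so hitting
        # the target requires P_tx = TARGET * d^2.
        return TARGET * distance_m ** 2

    for d in (1000.0, 100.0, 10.0):
        p_tx = tx_power_for(d)
        print(f"d={d:6.0f} m  tx power={p_tx:.2e}  density at handset={p_tx / d**2:.2e}")

The density at the handset comes out identical in every case: under power control, walking toward the tower doesn't raise your exposure, because the tower turns itself down.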
This totally neglects that in many cases, your smartphone is the closest transmitter. Having many base stations reduces the output power your smartphone needs to produce the required field strength at the base station.
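The same inverse-square reasoning runs in reverse on the uplink. A rough sketch with invented numbers, showing how the required handset power drops as base stations move closer:

    # Rough uplink sketch: the base station needs some minimum power
    # density from the phone, so the phone's required transmit power
    # scales with distance squared. Numbers invented for illustration.
    REQUIRED_AT_TOWER = 1.0e-9  # minimum density at the base station

    for cell_radius_m in (2000.0, 500.0, 100.0):
        # Worst case: the phone sits at the cell edge.
        p_phone = REQUIRED_AT_TOWER * cell_radius_m ** 2
        print(f"cell radius {cell_radius_m:6.0f} m -> phone tx power {p_phone:.2e}")

Shrinking the cell radius from 2000 m to 100 m cuts the worst-case handset power by a factor of 400, and the phone is the transmitter closest to your body.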
So I guess it would then come down to how often your phone transmits. If there is some sort of damage from non-ionizing radiation at power levels below those that cause heating damage (which has not been demonstrated), it is probably going to be a probabilistic thing. The probability of, say, cancer would be something like the integral over time of field strength above some threshold.
With the sparse high-power tower network, that integral is low when your phone is not transmitting, but spikes way up when it is. With the dense low-power tower network, you are trading a higher integral when your phone is not transmitting for a smaller spike when it is.
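To make that trade-off concrete, here's a toy version of that integral with entirely made-up numbers (and the threshold omitted for simplicity). It only illustrates the shape of the comparison; different assumptions flip the conclusion:

    # Toy exposure integral: total dose over a day for two network
    # layouts. Every number here is invented purely to show the
    # structure of the comparison, not to answer it.
    DAY_S = 24 * 3600
    TX_TIME_S = 600  # seconds per day the phone actively transmits (invented)

    # (ambient density from towers, density from own phone while transmitting)
    layouts = {
        "sparse": (1.0e-9, 1.0e-4),  # far towers: low ambient, phone transmits hard
        "dense":  (1.0e-8, 1.0e-6),  # near towers: higher ambient, phone transmits softly
    }

    for name, (ambient, phone) in layouts.items():
        # Integral over time of power density: ambient all day, plus
        # the extra contribution from the phone while it transmits.
        dose = ambient * DAY_S + phone * TX_TIME_S
        print(f"{name:6s} network: daily dose = {dose:.2e}")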
Figuring out which is better would require a lot more information than anyone here likely has (if it even matters at all... it is not proven that there is any danger in either scenario).