I worked in astronomy software for a few years on a different telescope, the LSST. I am not an expert, but I was in this world enough to answer.
The short version - it converges faster (probably like 5-10x faster), but also (as everyone else said) works in different wavelengths.
You can think of a telescope as a "photon bucket." The number of photons it collects is proportional to the area of the aperture. Webb's aperture area is 25.4 square meters, while Hubble's is 4 square meters, so roughly speaking JWST will get photons about 6 times quicker than Hubble.
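As a back-of-the-envelope sketch, the "6 times quicker" figure is just the ratio of the two aperture areas quoted above (ignoring all throughput differences):

```python
# Rough "photon bucket" comparison: photon collection rate scales with aperture area.
JWST_AREA_M2 = 25.4   # JWST collecting area, from the figures above
HST_AREA_M2 = 4.0     # Hubble collecting area, from the figures above

speedup = JWST_AREA_M2 / HST_AREA_M2
print(f"JWST collects photons roughly {speedup:.1f}x faster than Hubble")
```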
But that's only the roughest measure. Once you've got the photons, what do you do with them? You send them to a detector. There's loss in this process - you bounce off of mirrors, with some small loss. You pass through band filters to isolate particular colors, which have more loss. The detector itself has an efficiency; in CCD cameras people speak of "quantum efficiency" - the probability that a photon induces a charge that can be counted when you read out the chip. That quantum efficiency depends on the photon's wavelength.
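To illustrate how these losses compound, here is a toy throughput chain. The reflectivity, filter transmission, and quantum efficiency values below are made-up illustrative numbers, not real instrument values:

```python
# Toy model: detected photons = incident photons x product of all efficiencies.
incident_photons = 100_000

mirror_reflectivity = 0.98   # loss per mirror bounce (hypothetical value)
n_mirrors = 3                # e.g. primary, secondary, tertiary
filter_transmission = 0.80   # band filter to isolate a color (hypothetical value)
quantum_efficiency = 0.85    # chance a photon yields a countable charge (hypothetical value)

throughput = (mirror_reflectivity ** n_mirrors) * filter_transmission * quantum_efficiency
detected = incident_photons * throughput
print(f"throughput = {throughput:.3f}, detected = {detected:.0f} photons")
```

In a real system each of these factors is itself a function of wavelength, which is one reason characterizing an instrument takes so long.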
Furthermore - the longer your exposure, the more cosmic rays you get which corrupt pixels. You can flush the CCD more often and detect the cosmic rays and eliminate them, but you'll eventually brush against the CCD's read-out noise, which is a "tax" of noise you get every time you read out data.
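The read-out "tax" can be sketched with the usual quadrature noise sum. The numbers here are hypothetical, and a real noise budget would also include dark current and sky background:

```python
import math

# Splitting one exposure into N reads helps reject cosmic rays,
# but each read adds its own read noise, summed in quadrature.
signal_electrons = 10_000      # total collected signal in electrons (hypothetical)
read_noise_e = 5.0             # read noise per readout, in electrons (hypothetical)

for n_reads in (1, 4, 16):
    # Shot noise (sqrt of signal) plus read noise from each readout, in quadrature.
    noise = math.sqrt(signal_electrons + n_reads * read_noise_e ** 2)
    print(f"{n_reads:2d} reads: SNR = {signal_electrons / noise:.1f}")
```

More reads means better cosmic-ray rejection but a slightly worse signal-to-noise ratio, which is exactly the tradeoff described above.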
So this all gets complicated! People spend many years characterizing detection capabilities of these instruments, and write many pages on them.
HST's camera is more complicated to characterize, partly because it's older. Radiation has damaged and degraded many of the components so they have a lot of noise. The details of how this works are at the edge of human knowledge, so we don't have a great model for them. From the STIS handbook:
Radiation damage at the altitude of the HST orbit causes the charge transfer efficiency (CTE) of the STIS CCD to degrade with time. The effect of imperfect CTE is the loss of signal when charge is transferred through the CCD chip during the readout process. As the nominal read-out amplifier (Amp D) is situated at the top right corner of the STIS CCD, the CTE problem has two possible observational consequences: (1) making objects at lower row numbers (more pixel-to-pixel charge transfers) appear fainter than they would if they were at high row numbers (since this loss is suffered along the parallel clocking direction, it is referred to as parallel CTE loss); and (2) making objects on the left side of the chip appear fainter than on the right side (referred to as serial CTE loss). In the case of the STIS CCD, the serial CTE loss has been found to be negligible for practical purposes. Hence we will only address parallel CTE loss for the STIS CCD in this Handbook.
The current lack of a comprehensive theoretical understanding of CTE effects introduces an uncertainty for STIS photometry.
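A crude first-order model of parallel CTE loss (this is a hedged sketch, not the actual STIS pipeline correction, which is empirical) treats each row-to-row transfer as keeping a fixed fraction of the charge, so sources at lower row numbers, which need more transfers to reach the amplifier, lose more signal:

```python
# Toy parallel-CTE model: each row-to-row transfer keeps a fraction `cte` of the charge.
# Real STIS corrections are empirical and more complex (see the handbook quoted above).
cte = 0.99999          # charge transfer efficiency per transfer (hypothetical value)
true_signal = 1000.0   # electrons
n_rows = 1024          # the STIS CCD is 1024x1024 pixels

for row in (64, 512, 1000):
    transfers = n_rows - row  # lower rows need more parallel transfers to reach the amp
    measured = true_signal * cte ** transfers
    print(f"row {row:4d}: measured = {measured:.1f} e- "
          f"(loss {100 * (1 - cte ** transfers):.2f}%)")
```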
Now - this was all about how many photons you collect. When humans look at an image, they also care a lot about how fine the details are on it. This has to do with the resolution of the telescope's imaging systems. Resolution is limited by the number of pixels on the detector, and (to a much lesser extent) by the optical train of the telescope - the aberrations and distortions introduced by mirrors that focus light onto the detector's pixels.
Hubble has a high-res camera, and a separate wide-angle camera. Hubble's high-res camera actually outperforms JWST - it can resolve down to 0.04 arcsec, while JWST's can go to around 0.1 arcsec. But JWST's camera has a much wider field of view.
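The resolution figures above can be sanity-checked against the diffraction limit, theta = 1.22 * lambda / D (the standard Rayleigh criterion, not anything instrument-specific), using each telescope's mirror diameter and a typical observing wavelength:

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600  # ~206265 arcseconds per radian

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh criterion: theta = 1.22 * lambda / D, converted to arcseconds."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD

# Hubble at visible light (~550 nm), 2.4 m primary mirror
print(f"HST  @ 550 nm: {diffraction_limit_arcsec(550e-9, 2.4):.3f} arcsec")
# JWST at near-infrared (~2 um), 6.5 m primary mirror
print(f"JWST @ 2 um:   {diffraction_limit_arcsec(2e-6, 6.5):.3f} arcsec")
```

Because JWST observes at longer wavelengths, its much larger mirror still ends up with a coarser diffraction limit than Hubble achieves in visible light, consistent with the 0.04 vs ~0.1 arcsec numbers above.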
JWST's capabilities are described here: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...