I have a decent understanding of the approach and would vote for #1. I’d say almost all applications of ML in physical sciences are #1. In contrast, applying methods of statistical physics to understand how deep learning (as in DNN+SGD) works at all is a good example of #2.
I'm hardly the official judge of these things, but I would say it depends on how novel an approach AlphaFold takes to the problem. If it's a more efficient tool for doing the same things as before, I would put it towards the #1 end of the spectrum, unless it has also improved our basic understanding of folding or approaches to exploring the solution space of folded proteins, which would shift it towards #2.
Personally, I don't know enough about AlphaFold or the problems of protein folding to be remotely confident in my judgment on it.
Neural networks are differentiable regexes that can be trained from examples. The thing about the AlphaFold case, as with a lot of bioinformatics actually, is that you don't need to know a lot about the biological domain to be successful in solving "data" problems in the field.
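To make that first sentence concrete, here's a minimal sketch of the "trained from examples" idea (plain NumPy, nothing to do with AlphaFold's actual architecture): a tiny network learns a regex-like pattern, whether a bit string contains "11", purely from labeled examples, never seeing the rule itself. The dataset, network size, and learning rate are illustrative choices I've made up for the example.

```python
# Toy example: learn a "regex" (does the 6-bit string contain "11"?) from
# input/output pairs only, via gradient descent on a differentiable model.
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Full dataset of all 64 six-bit strings and their labels.
X = np.array(list(itertools.product([0, 1], repeat=6)), dtype=float)
y = np.array([1.0 if "11" in "".join(map(str, row.astype(int))) else 0.0
              for row in X])

# One hidden layer of 16 tanh units, sigmoid output.
W1 = rng.normal(scale=0.5, size=(6, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=16)
b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)                # hidden activations
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))    # output probability in (0, 1)
    return h, p

lr = 0.5
for step in range(2000):
    h, p = forward(X)
    err = p - y                             # dLoss/dp for squared error (up to 1/N)
    grad_logit = err * p * (1 - p)          # chain rule through the sigmoid
    # Backpropagate by hand -- possible only because every step is differentiable.
    gW2 = h.T @ grad_logit / len(X)
    gb2 = grad_logit.mean()
    grad_h = np.outer(grad_logit, W2) * (1 - h**2)
    gW1 = X.T @ grad_h / len(X)
    gb1 = grad_h.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, p = forward(X)
accuracy = ((p > 0.5) == (y > 0.5)).mean()
print(f"accuracy on all 64 strings: {accuracy:.2%}")
```

The point isn't the particular network, it's that nothing in that script knows what the pattern "means"; given enough labeled examples and a differentiable model, the fit comes out of the optimization, which is roughly the sense in which a lot of bioinformatics "data" problems don't demand deep domain knowledge.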