There's quite a lot of experimental evidence in the "purely physical phenomenon" direction. Things like brain surgery, neural implants, and mind-altering drugs probably wouldn't work as well as they do if they were trying to interact with your non-physical spirit.
I definitely see that. It's pretty compelling. You can't deny that certain drugs, for example, are a lever you can pull that makes consciousness go away (or come back). There seems to be a cause-effect relationship there.
But at the same time, I do not understand how it's possible that the consciousness I subjectively experience can arise from physical processes, so I have difficulty completely accepting it. I write software. It processes information. I don't believe that the CPU has this same subjective experience of consciousness (not even a little) while it's running my for loops and if statements. Suppose I were a genius and figured out an algorithm that let the CPU process information in a way equivalent to the human brain. Would it have consciousness then? What changed? Does whether it has consciousness depend on which algorithm it's executing? Quicksort no, but brain-emulator algorithm yes? They're both just algorithms, so why should the answer be different?
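To make the "both just algorithms" point concrete, here's a sketch (my own toy illustration, not anything from neuroscience): a quicksort next to a crude leaky-integrate-and-fire neuron update, the sort of thing a brain emulator's inner loop might resemble. To the CPU they are indistinguishable in kind: arithmetic, comparisons, and branches.

```python
def quicksort(xs):
    """Plain quicksort: nobody suspects this of being conscious."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

def neuron_step(v, inputs, leak=0.9, threshold=1.0):
    """Toy leaky integrate-and-fire update: decay the membrane
    potential, add inputs, fire and reset if over threshold."""
    v = v * leak + sum(inputs)
    if v >= threshold:
        return 0.0, True   # fired
    return v, False

print(quicksort([3, 1, 2]))        # → [1, 2, 3]
v, fired = neuron_step(0.5, [0.7])
print(fired)                       # → True (0.5*0.9 + 0.7 = 1.15 ≥ 1.0)
```

If consciousness tracks which of these two functions the CPU happens to be executing, something beyond "it's an algorithm" has to do the explanatory work.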
One explanation I've heard is that it could be a matter of scale: simple information processing doesn't create consciousness, but sufficiently complex processing does. I can't say that's not true, but it seems hand-wavy and too convenient. Over here we have something that is neither conscious nor complex, and over there we have something that is both conscious and complex, so we'll just say that complexity is the variable that determines consciousness, without any further explanation. I realize that at some point science works that way: we observe that when this thing happens, this other thing follows, according to a pattern we can characterize (with equations); we can't get too deep into the why of it, it's just opaque, so we describe it and call it a "law". Which is fine, but are we saying that this is a law? I'm not necessarily rejecting the idea, but the main argument in its favor seems to be that it needs to be this way to make the other stuff work out.
Another possible way to reconcile things is the idea that everything is conscious. It certainly gets you out of the problem of explaining how certain groups of atoms banging around in one pattern (like in your brain) "attract" consciousness while other groups of atoms banging around in other patterns don't. You just say they all do, and you no longer need to explain a difference because there isn't one. Nice and simple, but it has some jarring (to me) implications that things around me are conscious that I normally assume aren't. It also raises questions about how consciousness is organized, like why it seems to be separated into distinct entities.
Anyway, there are also other ways of looking at it. My main point here is that it's certainly something I don't understand well, and possibly it is something that nobody has a truly satisfying answer for.
If you don't know what you yourself mean by the word "consciousness", then it is futile to ask whether an object has that property.
For example, in anesthesia the goal can be broken down into several sub-components that you need to turn off, without ever invoking the concept of consciousness: wakefulness, memory formation, sensory input (pain).
Similarly, consciousness seems to be a grab-bag of fuzzy properties that we ascribe to humans; then, stretching the rules a bit, we allow a few other species to roughly match (some of) those properties if we squint. And since humans and other, clearly somewhat simpler, species are "clearly" conscious, we go on to declare it a really difficult problem to understand where, as things get ever simpler, consciousness stops. It's just the paradox of the heap.
This doesn't mean consciousness is magical. It's just a very poorly defined and overloaded concept, bordering on useless in the general case. It may feel like magic because we've built a thought-edifice that twists to escape our grasp. But to me that seems more like a philosophical hall of mirrors that distracts from looking at the actual problem.
If you want to ask whether something is conscious, you first need to come up with a rigorous, testable definition, or break the question down into smaller components that you can detach from the overloaded concept of consciousness.