
> Not eating and getting low blood sugar can much more simply answer the question of "Do I have free will, or am I a subject of my internal chemical reactions".

You cannot control what you feel, but you can control your actions.



Can you?

I'm a type 1 diabetic, and I have a continuous monitor that tells me what my glucose level is. It measures interstitial fluid rather than blood sugar, which means there can be a pretty large delay between what's measured just under the skin and what my bloodstream is experiencing. If a rapid drop occurs, there can be a lead time where I'm unaware of it but others aren't, because I'll speak in a 'cranky' or 'sharp' tone. I'm not aware this is occurring when it does. A little sugar and a few minutes and I'm back to regular me again.

You are in far less control of your actions than you believe. Internal self-awareness is a poor mirror of reality.


Well said.

> You are in far less control of your actions than you believe.

No. But I say this in a way that might surprise you?

I view 'free will', as most people understand it, as an illusion.

I recognize that I, at best, have a tiny fraction of control. If I consider "me" to be a locus of control w.r.t. volitional action, then I will grant that "me" has causal impact. But what causally impacted "me" at time t? A combination of "me" at time t-1 and my environment. (I'm a materialist who is agnostic w.r.t. determinism.)

So, what does this really mean? "I" (right now) can have volitional control only if one ignores the precursors. It is a mind-bending realization for many, but it is the best theory I've found that fits reality.

There are lots of us around. We might disagree on how we define 'free will'; we may or may not 'deny free will'. But there is a huge commonality among most rational materialists who accept modern science: we do not conceptualize free will in ways that match 'mainstream' notions of it.


The trouble with this argument is it applies to your thoughts about reality, not just to your will.

Your argument implies your thoughts at time t are solely the result of your state at time t-1 and your environment. But if that's the case, your thoughts aren't about reality; they're just a result of reality, a result of whatever state you happen to be in.

But if your thoughts aren't about reality, why should your argument about free will -- or about anything else -- carry any weight?

Causal processes are not the same as logical reasoning, and if you reduce the processes of the mind to the former, you remove the possibility of rational argument.


If you consider the processes of the mind to be acausal, that also removes the possibility of rational argument.

It's possible that I'm a Boltzmann brain with nonsensical memories about environmental stimuli that mislead me about what reality is, but it's tedious to prepend every statement about reality with "the conception of reality that my state at time t-1 and my environment have led me to believe is the most likely candidate for reality."

That doesn't seem to be a particularly useful objection to the philosophy that GP espoused; instead, you have to assume the same priors: materialist, agnostic w.r.t. determinism, and some basic assumptions about the senses and a shared reality.


I didn't claim the mind's processes were acausal; simply that they could not be reduced to cause, to the exclusion of anything else. If they can be so reduced, rational argument is impossible, because causal processes are not the same as rational argument. When I punch "2 + 2 =" into a calculator and the screen says "4", it's a causal process. When my kid grasps that 2 + 2 = 4, it's different. His mind has reasoned its way to this truth; it is not simply that the brain-state "2 + 2" has produced the further brain-state "4". If that were the case, he (and we) could never know that 2+2 resulted in 4; it could just as easily produce 5, or a green rabbit, or whatever else our mind had been caused to think at any given time.

If reasoning has any validity, it cannot be reduced to causal processes.

Therefore any argument -- any attempt at reasoning -- that claims that the mind's processes are merely causal undermines itself in the very process of being made.


> causal processes are not the same as rational argument

Yes.

...I'm taking this step-by-step...

But causal processes can _generate_ rational argument. Agree or disagree?


> But causal processes can _generate_ rational argument.

There is no reason for thinking this. A causal process can be associated and simultaneous with an argument, but it can't generate it, except by accident. The state of mind '2+2' may be followed by the state of mind '4', but it could just as easily be followed by the state of mind '5'. Causal processes may make us believe one or the other, but they will not make one correct and the other wrong; nor will they be sufficient to explain how one answer is correct and the other wrong.


That's why I said 'can'. I didn't say 'always'.

A deterministic computer program _can_ generate rational argument; e.g. it can use logical deduction a.k.a. forward-chaining.
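To make that concrete -- a minimal sketch with a made-up toy rule set, not any particular system -- a forward-chaining loop just keeps applying modus ponens to whatever facts it has until nothing new can be derived:

  # Minimal forward-chaining sketch (Python). Facts are strings; rules are
  # (set-of-premises, conclusion) pairs. Modus ponens is applied until no
  # new conclusions appear (a fixed point).
  def forward_chain(facts, rules):
      derived = set(facts)
      changed = True
      while changed:
          changed = False
          for premises, conclusion in rules:
              if premises <= derived and conclusion not in derived:
                  derived.add(conclusion)
                  changed = True
      return derived

  # Hypothetical toy knowledge base, purely for illustration.
  rules = [
      ({"Socrates is a man"}, "Socrates is mortal"),
      ({"Socrates is mortal"}, "Socrates will die"),
  ]
  print(forward_chain({"Socrates is a man"}, rules))
  # -> {'Socrates is a man', 'Socrates is mortal', 'Socrates will die'}

Every step there is purely causal and deterministic, yet the output is a chain of valid deductions.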

Your claim seems to be that causality (governing the behavior of human thought and action) does not _necessarily_ result in rationality (in that human)? Of course -- this is obvious.

This seems like a miscommunication. Perhaps an unavoidable one? :)


> Your claim seems to be that causality (governing the behavior of human thought and action) does not _necessarily_ result in rationality (in that human)? Of course -- this is obvious.

That's not my claim. Basic observation of oneself or others makes this, as you say, strikingly obvious :)

My claim is rather that causality can't be a sufficient explanation for rationality. By this I mean that we can't explain rational thought by referring to the cerebral states that are associated with it. Remember your original claim that 'free will' is an illusion, because what impacts me at time T is [limited to] my state at time T-1, and my environment.[0] I take this to mean that my state at T is caused solely by my state at T-1 and envt. IF you're right about this, AND IF I'm right that causality can't be a sufficient explanation for rationality, then it implies that rationality is impossible.

Why can't causality be a sufficient explanation for rationality? We can't explain rational thought by referring to the cerebral states associated with it, because the question of whether a thought is rational is independent of its associated cerebral state. There are cerebral states associated with 2+2=4 (rational), and cerebral states associated with 2+2=5 (irrational). But neither of these cerebral states is itself right or wrong. Only the thoughts associated with them are right or wrong.

I know that 2+2=4. It's not just that I have a series of successive, causal states that make me think this; it's that the content of the thought is correct. My mind has grasped a truth about reality. Similarly, someone who thinks 2+2=5 is wrong. It's not just that he has a series of successive, causal states that make him think thus; it's that the content of his thought is incorrect, and his mind has failed to grasp a truth about reality. When we say thought is correct or incorrect, rational or irrational, we refer to the content of the thought, not the brain state.

Thought must be about something other than the brain state to be right or wrong. But if thought is entirely generated, and sufficiently explained, by causal processes, it can only reflect the brain state, and can't be right or wrong. Since thought can be right or wrong, it follows that rational thought is something more than the product of causal processes, and is (at least partially) independent of them.

[0] I have added the words in square brackets myself, but I think that's your meaning.


To expand the conversation a bit with the hopes of breaking out of what seems to be some kind of language trap ... Let's talk about how a person and a computer can arrive at / prove truth.

A person's brain can generate true concepts in many ways. One way is careful logical thinking based on true premises. Another is some other manner of thinking (pick whatever you like) which ends up being true.

Speaking of combinatorics and algorithms now, there are many ways to validate truth. It depends, of course, on the style of logic in play. Depending on the set of logical primitives available (e.g. modus ponens), there are many different computational pathways to get from a set of premises to a conclusion. In other words, there are many ways to prove the Pythagorean Theorem.
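As a toy illustration of that multiplicity (all rule names and facts below are invented, and the sketch assumes single-premise, acyclic rules), the same conclusion can be reached by more than one distinct chain of rule applications from the same premises:

  # Each rule: name -> (premises, conclusion). Invented for illustration.
  RULES = {
      "r1": ({"P"}, "Q"),
      "r2": ({"Q"}, "GOAL"),
      "r3": ({"P"}, "R"),
      "r4": ({"R"}, "GOAL"),
  }

  def proofs(facts, goal):
      """Yield tuples of rule names: distinct derivations of `goal` from `facts`."""
      if goal in facts:
          yield ()
          return
      for name, (premises, conclusion) in RULES.items():
          if conclusion == goal:
              premise = next(iter(premises))  # single-premise rules only
              for path in proofs(facts, premise):
                  yield path + (name,)

  for path in proofs({"P"}, "GOAL"):
      print(path)
  # ('r1', 'r2')
  # ('r3', 'r4')

Two different derivations arrive at the same GOAL, which is the sense in which there are many ways to prove the same statement.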

I don't think we'll disagree. But maybe? Or maybe we'll get clear on some language barrier?


Thanks for giving it another try.

> My claim is rather that causality can't be a sufficient explanation for rationality. By this I mean that we can't explain rational thought by referring to the cerebral states that are associated with it.

I'm zooming in on "explain" here. And I'm not getting it.

I'm not agreeing or disagreeing -- I'm not even following -- and I have some guesses as to why:

1. Are you offering a critique of the lack of free will argument I made?

2. If so, I'm not able to figure out where it lands: for? against? something else?

3. Either way, I haven't yet figured out what you mean by "explain" here.

4. Most broadly, I'm not seeing why this matters. I want to understand, but I haven't yet figured out how to get there.

If you are laying out a known philosophical position, could you please point me to a resource that explains it?


> We can't explain rational thought by referring to the cerebral states associated with it, because the question of whether a thought is rational is [independent] of its associated cerebral state.

If you had to write this over again to a broader audience, would you choose the word _explain_ here? In the context of this conversation especially, 'explain' is highly overloaded. Which sense do you mean? Perhaps what you mean is closer to _assess_ or _prove_?

Are you essentially saying this: the rationality of a statement is _assessed_ by the degree to which it adheres to the logic of rationality?

Since this is obvious, I'm inclined to think I'm still not following what you are actually trying to get across.


Ok, I think I'm working my way towards the part of the argumentation that has a bearing on free will.

> But if thought is entirely generated, and sufficiently explained, by causal processes, it can only reflect the brain state, and can't be right or wrong.

Not so. The above claim conflates two things: (1) how a thought is generated (causality); (2) what a thought represents (conceptualization).

Whether a conceptualization is true or false is _independent_ of its causal origins.

This is obvious to me. But not to you? Or is more miscommunication afoot?


> When we say thought is correct or incorrect, rational or irrational, we refer to the content of the thought, not the brain state.

Yes, I agree. You've set up an example that emphasizes _conceptual_ understanding of what a person is thinking _about_.

This part is obvious too. But you want to take it further. I'm trying to puzzle that out next.


> Causal processes are not the same as logical reasoning, and if you reduce the processes of the mind to the former, you remove the possibility of rational argument.

(I'm not sure if this applies to what I was saying or not; I'll put that aside.)

I'll rephrase the comment above to help ensure I'm talking about the same thing. The claim, as I understand it, goes like this: if one believes the mind is governed by causal processes, then there is no possibility of the brain doing logical reasoning.

Am I understanding the argument as you intended?

I'll proceed with my understanding of it... I don't buy it. Computers are causal and can do logical reasoning just fine.


> Your argument implies your thoughts at time t are solely the result of your state at time t-1 and your environment.

The laws of physics also matter.

I think we're still on the same page here.

I'm trying to figure out if we disagree, and if so, where.


> But if that's the case, your thoughts aren't about reality; they're just a result of reality, a result of whatever state you happen to be in.

According to my premises (materialism being the key one), thoughts are the result of the combination of (a) what happened before and (b) the laws of the universe.

So thoughts can be both: (1) a causal result (previous paragraph) and (2) _about_ reality, from the point of view of one's consciousness.

Any disagreement here?


I consider free will fundamentally an illusion but at the abstraction level of 'me' (also a faulty abstraction) it does make sense to use these concepts sometimes.

But it's important not to confuse the map with the territory.


Yes, a prisoner may be locked inside and have a laid-out routine, but they do still have a say in the minutiae, and their small actions could actually echo into the future.

Another thought: just because our unconscious mind also plays a part, people fear that this implies mostly determinism. But why are we considering those as not also our thoughts? They're just thoughts that our deeper mind has put forth, and our deeper mind doesn't usually want to be questioned; it has more hard data. Still, you can fight this, or things like suicide, soldiers, and daredevils wouldn't be possible. We just don't get to see how it came to its conclusions, so we assume it's some hard-wired code.


It's simpler than that. Your thoughts -- the internal monologue deciding what actions to take -- are the result of physical processes in the brain. There is no way those non-physical thoughts are changing the physical processes; they happen after the fact. We have the illusion of being in the driver's seat, but really we are just along for the ride.

This is assuming you adhere only to what science can prove, and not religious beliefs like the soul.

But yeah, the fact that we are the result of past decisions and outside inputs going all the way back to before we even had consciousness at all is also a nice proof by induction.


> There is no way those non-physical thoughts are changing the physical processes

I don’t believe this to be correct. There is nothing non-physical about thoughts. Thoughts are matter interacting with matter.

Thoughts are to the brain what waves are to the ocean. They are not less physical than the brain itself.


Insofar as thoughts have a one-to-one coupling with the physical processes in the brain, we are saying the same thing.

But qualia (the more accurate/descriptive term for what we are discussing) are inherently subjective and non-physical by definition. Both of us can observe an ocean wave using many different methods (direct observation with light, their effect on a buoy, their impact with the shore, etc). However we can never both observe your qualia. I have no way to determine that your perception of "red" and mine are equivalent. There is no way you can explain vision to a blind person; yet a blind person can have their own understanding of an ocean wave.


You cannot point to a neurotransmitter or neuron or synapse and say "that's a thought." Thoughts are not material. You might say they are not real. Even if you can point to a group of neurons and processes in the brain and associate it with a particular thought, the thought itself is not real. The neurons and their behaviors are.


First, saying "x is not real" is a proven way to make a mess of a conversation. I suggest we all would be better off avoiding that phrasing, unless we're willing to really dig in and carefully clarify our definitions.

> You cannot point to a neurotransmitter or neuron or synapse and say "that's a thought."

Are you claiming it is (a) categorically impossible or (b) practically impossible? Why not?

> Thoughts are not material.

To be clear, when modern philosophers talk about 'materialism' that tends to include all of the abstractions of physics: matter, energy, waves, etc.

> You might say [thoughts] are not real.

I wouldn't say that. Here is what I would say: there are at least three key aspects of thoughts:

1. the perceptual level; i.e. the _experience_ of having a thought

2. the conceptual level; i.e. a _concept_ that people use

3. the physical level; the physical things that are happening; may be a set of things; not necessarily contiguous

> Even if you can point to a group of neurons and processes in the brain and associate it with a particular thought, the thought itself is not real.

My top-most paragraph explains that "X is not real" statements tend to have a lot of downsides. Am I missing something? Does this add something to the discussion?


FWIW, when I said "thoughts" in the original comment, I really meant qualia / inner monologue / phenomenological experience, i.e. what you call #1 the perceptual level. Qualia are not part of materialism... which IMO makes them the most interesting thing in the universe, since they are categorically unlike everything else and yet so fundamental to our being.


I mean, yea, maybe in your world you don't want to point at an electromagnetic wave that is a wifi signal... but it is.

Signals are just as real as the components that create them.


Qualia are not a signal. You can't transmit and receive them, you can't physically detect them, there is no way to objectively describe them. When you and I both look at 450nm light there is no hope of ever knowing what the other perceives. You can say "the color of the sky on a clear day" but that is just a reference to yet more subjective experience.


> There is no way those non-physical thoughts are changing the physical processes; they happen after the fact. We have the illusion of being in the driver seat but really we are just along for the ride.

How can you conclude this with any certainty?


PBS: Your Brain: Perception Deception

https://www.youtube.com/watch?v=HU6LfXNeQM4

PBS: Your Brain: Who's in Control?

https://www.youtube.com/watch?v=yQ6VOOd73MA

Rationalization is mostly post hoc.


There are other interpretations of some of these experiments (e.g. regarding the readiness potential). In his book on free will, Mark Balaguer elaborates a little bit. I don't remember the details well, but I'll try to sketch what I think he says.

First, in my recollection, Balaguer points out the form of the argument as follows.

Given that: (1) a particular study shows that one's (perception of when they decided to act) _lags_ the (experimental measurement of the person's readiness potential spiking), what can we validly conclude?

Some people then claim that (2) claims of making a decision are merely post-hoc rationalizations. I get the sense that such a view is widely held among those who have heard of the experiments -- or at least the popular characterizations of them.

Balaguer says, in effect, "not so fast". He points out that we need to talk about the logically necessary steps to reason from (1) to (2). He has a section on this; he claims it isn't as watertight as some think.

As I recall, part of the discussion has to do with motor planning.

Another part is this: there could be a volitional choice that precedes and causes both the readiness-potential spike _and_ the perception/recognition. That volitional choice is unavailable to conscious awareness until some time later, presumably. If true, a person could have made the choice, noticed it later, and still be consistent with the experimental findings.

Apologies for the hazy recollection. I recall not being strongly convinced, partly because I wasn't impressed by the book overall, but I also haven't dug into these topics as much as I would like.

Lucky for us, the issue of latency between action and perception is squarely in the wheelhouse of distributed systems engineers!


> There is no way those non-physical thoughts are changing the physical processes; they happen after the fact.

How can the truth of this statement be proven? Non-physical thoughts cannot be observed, so they are out of reach of being tested for being correct.


Eventually we may have sufficiently fine-grained ability to observe brain states, plus a strong enough understanding of neuropsychology, such that an outside observer can accurately predict, for a brain under observation:

1. The subjective internal mental state of the participant (e.g. "you were imagining a red balloon")

2. Actions that will be taken before they physically manifest (e.g. "you will throw rock instead of scissors")

If an outside observer can do these things without access to subjects' thoughts, especially with subjects' deliberate efforts to fool the observers, then IMO it is pretty clear evidence the thoughts are meaningless.

As for the current day, I say the burden of proof is on anyone who claims otherwise. There's lots of evidence suggesting free will is an illusion, and no evidence of telekinesis or the like.


Not even eventually: currently we have AI that can decode 'pictures' from our brain activity.

https://www.smithsonianmag.com/smart-news/this-ai-used-brain...


> Non-physical thoughts cannot be observed

We wouldn't even be able to talk about them, because if we did, there would be a causal chain starting with a non-physical thought and ending with moving a (physical) tongue. So somewhere along that chain, a non-physical thing would need to make a physical thing move. How exactly?


Neural activity physically exists. Qualia -- the experience of having thoughts -- do not.


I agree that free will functionally doesn't exist. However for the purposes of social organization I see no better alternative than to treat each person as an agent and hold them fully accountable for their actions, regardless of upstream "causal" impacts.

If you're starving/high/whatever and you violate someone else's fundamental rights, the reasons/excuses you give as to why your control was compromised seem irrelevant. Actions and outcomes count, not intentions.


> If you're starving/high/whatever and you violate someone else's fundamental rights, the reasons/excuses you give as to why your control was compromised seem irrelevant. Actions and outcomes count, not intentions.

Yes, people _often_ give rationalizations of dubious merit. These are muddled, complicated 'reflections' of the full reality. So it is wise not to give them too much weight. (It can be very interesting to try to parse them, but that's another topic...!)

Next point. From a predictive point of view about public safety, the context and situation matter. For example, consider the case of an addict who regularly steals because he is driven by the addiction. If he can break the habit (hopefully with help of many kinds), the theft problem largely goes away. Understanding the dynamic helps us understand downstream outcomes.


> Actions and outcomes count, not intentions.

There is a whole division in law around mens rea vs actus reus that disagrees with this take.


I'm aware, but the bar for a guilty mind is quite low. You basically have to have known what you were doing, which we assume just about every conscious person does. There are also crimes where mens rea isn't even applied to the perpetrator, but to a reasonable objective analyzer (e.g. manslaughter).


> I'm aware, but the bar for a guilty mind is quite low.

Do you mean this in terms of a cross-system comparison (i.e. variation between legal systems)?

Do you mean this in terms of some philosophy that lays out a more sensible stance?

I've been digging into this somewhat; happy to learn more.


My mother was a T1 diabetic back in the days when insulin injections were basically a best guess based on what you had eaten and planned to eat. It was always fun as a child not knowing whether my little screw-ups would bring an exasperated sigh and a chuckle or the low-glucose demon.

I'm so happy that managing your disease is much more straightforward than it was in the 70s or 80s. It also appears that life expectancies are much higher since it's easier to avoid the extreme roller coaster of too high and too low blood sugar. Best wishes.


My wife is also T1, diagnosed a couple of years back, with a week in the ICU. She pulled through and now has a CGM and pump.

What really surprised me was her blood sugar rockets up when she has video meetings with a certain difficult colleague. While other chilled colleagues have no such effect.

That made me wonder if interacting with difficult people causes more physiological changes than I realised.


Epinephrine causes the body to release sugar into the bloodstream - it's one of the reasons you get shaky when your blood sugar gets low (if you're not T1DM at least) - your body is attempting to increase blood sugar by epinephrine. So in reverse, stressful situations that cause the release of epinephrine increase blood sugar.

There's a similar issue with e.g. running as a T1DM (as I understand it, not being one) - when you're running, your body will pump out sugar, but when you stop running, it doesn't stop instantly, so your blood sugar can spike high post-exercise. Or you can run out of sugar and crash hypoglycemic.

It's amazing that CGMs exist that can, to some degree, compensate for these things, but man the body's autoregulation on 50 different axes is fascinating.


There are observable changes in the structure of the brain when people take up meditation. Not to get too crass, but we are the meat in our heads and bodies.


I'm not into meditation, but I watched something that said the brain scans of meditating experts resembled someone having a seizure or something. But the person was just sitting there quietly.

I'm not drawing any conclusions. But I found it fascinating.


Of course you can control your actions. But that doesn't mean they can't be influenced by external factors like drugs, weather, or in your case low blood sugar.

You don't have absolute total free will in all situations. That's why development of willpower is such a huge discipline (religion, meditation, etc).

There's a range where your blood sugar starts to influence your actions; with more willpower and by cultivating a less cranky personality, you could certainly extend that range.

Some people have similar non-beneficial reactions to high-stress situations; other people develop the skill to abstain from those actions and maintain control in those situations.


Willpower occurs after awareness. They are objectively different things.

If you are not aware of something, you cannot choose to make a reaction for or against it. For example, imagine that your body did not detect heat. If you put your hand on a burner you would instinctively withdraw from it (and yes this happens in people that do not feel pain).

Willpower is in the ballpark of conscious awareness. "I choose to do X or Y"

Training, on the other hand, attempts to remove the conscious decision part from the mind after awareness occurs: "If X then Y". This way you're not wasting time and brainpower trying to figure out what's going on.


I'm not sure what you are trying to say, but free will, I think, requires consciousness. If you want to train to keep your hand on a burner, wouldn't that be free will? Are we implying that because we can sense pain and other things, we don't have free will? Free will doesn't mean you are omnipotent. You can choose to put your hand on a burner and you can choose to practice keeping it there; both are examples of free will. An instinct to avoid pain doesn't negate the free ability to make choices.


> You cannot control what you feel, but you can control your actions.

I'll respond in a way that I hope is clear and relatively objective. This will take the form of an "if-then" claim nested within another "if-then" claim.

IF a person

  (1) accepts philosophical materialism, but
  (2) remains agnostic as to quantum effects at macro scale
THEN the following response follows logically:

It depends what you mean by "you". If you (a) mean that "you" consists of the body, including the brain; and (b) you recognize there is a physiological difference between _non-volitional_ and _volitional_ choices; then (c) yes, loosely speaking, "you" can control your volitional actions.

If you think carefully, you will recognize the above assumptions also mean:

- a person's surrounding context plays an essential role in shaping the kinds of actions that are available.

- there are no causes other than material ones (such as a soul)

You may notice I didn't use the phrase "free will" above. That is very much intentional. Even if you are a philosopher, that phrase is nearly impossible to use with a definition that a group of people can agree on.

See also: "Volition and Action in the Human Brain: Processes, Pathologies, and Reasons" by Itzhak Fried, Patrick Haggard, ... https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5678016/


I think there are two conceptions of free will. One is as you described -- a body being able to act in and of itself without any influence of an external force (beyond sensation, physical constraints, etc).

That sort of punts on the question of physical determinism, where it doesn't matter if it was possible for you to have done anything other than what you did. You could be a pure clock work automaton making deterministic responses to external stimuli and still have that sort of "free will", but I don't think that would be satisfying to a lot of people.

The second kind of free will, I think would be the ability to have a free choice -- that there is some immaterial entity which can act in somehow a non-deterministic way to choose its behavior, that the course of your entire life wasn't set at the big bang -- that not only can you act independently, but that your choices _matter_ and will result in a future that was not predetermined. I think that is what a lot of people would think of as free will, and it's difficult to define or logically support IMO.


I can appreciate a lot of the sentiments above, but I don't think the comment's "two conceptions" really 'decomplects' (a Hickeyism) the key ideas. If anything, it smooshes a lot of them together: different definitions of free will, meaning, origin stories, the divine, flavors of determinism, and more.


Note that non-material causes can be just as deterministic (if not more so) as matter. Logic and math are physical but not material, for instance.


> Logic and math are physical but not material, for instance.

Logic and math are neither physical, nor material, nor causal.


I think some might be slightly troubled by hearing that logic is not causal. But I agree.

One can _test_ a set of statements to see if they are valid logical deductions from another set of statements. Even if true, this does not mean that the deducible statements are _caused_.

In some cases, that statement from above (the deduced one) might be 'realized' (noticed by people) before its premises are!

We might even be able to logically prove it is true without the other set of statements! Why? There are multiple logical paths (not always mapping to reality) that _could_ prove a particular statement.


They are physical causes but they are abstracted from matter. The underlying cause originates from the math/logic, not the material instantiation. Physicalism is often conflated with materialism, but the world is physically constituted by more than matter!

A lot of this world is obviously caused by the inherent formal influence of math and logic. There are ongoing mysteries in this, to be sure, but these forms are beyond matter. I’m speaking from the Pythagorean-Platonic-Tegmark school of thought. I’m trying to address the typical atheist layperson scientist who hasn’t thought through immaterial reality (rejecting it as supernatural, thinking it involves ghosts or spirits or something). I know there are reasonable counter positions to argue why math isn’t real, but I don’t subscribe.


> They are physical causes but they are abstracted from matter.

That sequence of words doesn't even denote a coherent concept. To be physical is exactly to be a state of matter/energy.

> The underlying cause originates from the math/logic, not the material instantiation.

No, logic, including math, describes the relationship between abstract concepts. It doesn't cause anything. Implication is not causation.

> I’m trying to address the typical atheist layperson scientist who hasn’t thought through immaterial reality

I'm Catholic, not atheist, and I have no problem with immaterial reality, or even immaterial causes of material effects. But I do have a problem with the simple error of describing immaterial concepts as being physical causes, or with describing systems of concepts and of their relations as causes of anything, except insofar as belief in them causes behavior in the believer.

> I know there are reasonable counter positions to argue why math isn’t real, but I don’t subscribe.

There are certainly senses in which “math is real” is a valid or defensible statement, but not in the sense of it being a physical thing which can exist as a cause.


Most physicists don’t see the world as “made” of stuff, either matter or energy, but of algorithmic laws. The ontology of modern physics is weird.

Physical refers to natural causes. How could math not be causal? Not like, math in a textbook (though I’d argue that is causal too. Ideas are immaterial but they cause effects in the material world. They are thus natural, physical causes!).

Maybe you’d prefer the word “necessitate” over “cause”? Math and logic necessitate certain effects in material reality? Personally I don’t see a difference, but maybe you do.


When living things or machines do math they are doing a physical thing.

The abstract concept of mathematics doesn’t need to exist for that to happen.


You have clearly never met someone having a hypoglycemic event. It can absolutely change their personality, as significantly as a head injury.


The sound of breaking dishes in the kitchen, a trail of damaged but unopened candy wrappers, sitting on the floor next to the cabinets with their doors ripped off, with the sugar bowl half-spilled, thankfully recovering with some sugar under their tongue and no major bodily injury. Terrifying.


Hyperglycemia does too. (High blood sugar.) It’s known as diabetic rage. A number of things are happening there. Part of it is brain inflammation – experienced as suffering – and brain inflammation is directly behavior-modifying, causing aggression.

And indeed that’s what happens with traumatic brain injuries.

The same mechanism is behind the aggression sometimes displayed by cocaine users. Cocaine tamps down brain inflammation short-term, increases it long-term.

And indeed, people with traumatic brain injuries and central nervous system inflammation tend to have more problems with cocaine abuse.


I assume you have no bad habits?

Can you instantly get back to sleep?

You never snap at anyone?

Diet is 100% what you want it to be?

Exercise on time daily?

Read HN only as much as you know it benefits you?

Maybe you’re in the 0.0001% who has all this down, but most don’t. I’m sure there are many other examples of how most people barely control their minds and actions.


Uncontrolled aggression is one of the common symptoms of low blood sugar. If you are not diabetic, it should not just happen; your body should deal with the situation. But it does happen to diabetics, and they have to watch out for that.


In theory that sounds great. Unfortunately, the idea that humans are highly rational is overrated. We are first and foremost, for better or worse, emotional beings.

Your frame of mind (i.e., feelings) plays a role in decision making, which drives action.

In short, beliefs (to which emotions are connected) drive behavior.


ADHD...



