Sure, whatever - the point is that they're different objects, and results about one are not results about the other.
> What do you mean by “restricted” when you are characterising a function?
I mean whatever the word "function" means in a given setting. A classical function X -> Y is a relation such that if `(x, y1)` and `(x, y2)` hold then `y1 = y2`. A function in intuitionistic type theory is a well-typed lambda term. A function in the categorical semantics of a type theory is an exponential object. And so on.
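If it helps to make the first of those concrete, here's a minimal Python sketch (the names and the finite encoding are mine, purely illustrative) of the classical condition, checked over a finite relation given as a set of pairs:

```python
# Hypothetical helper: decide whether a finite relation is functional,
# i.e. whether (x, y1) and (x, y2) in the relation force y1 == y2.
def is_functional(relation: set) -> bool:
    seen = {}
    for x, y in relation:
        if x in seen and seen[x] != y:
            return False
        seen[x] = y
    return True

print(is_functional({(1, "a"), (2, "b")}))              # True: a function
print(is_functional({(1, "a"), (1, "b"), (2, "b")}))    # False: 1 relates to two values
```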
> Show me the decider… for “classical” and “non-classical” objects.
No such thing: it's a metatheoretical judgement, not a theorem. Same story as type errors: within the language, `stringToUpperCase (5 :: Int) :: String` isn't a false statement, it's just inexpressible nonsense. There is no such object as `stringToUpperCase 5` and so nothing can be said about it internally. When we talk about it, we're talking, from outside the language, about a syntactic construct.
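If you want that in Python terms, here's a rough sketch using type hints, with an external checker such as mypy standing in for the meta level (illustrative only; Python isn't the typed language in question):

```python
def stringToUpperCase(s: str) -> str:
    return s.upper()

print(stringToUpperCase("five"))  # 'FIVE'

# The line below is the analogue of `stringToUpperCase (5 :: Int)`.
# Statically: a checker like mypy rejects the call outright, from outside
# the running program. Dynamically: uncommented, it raises AttributeError.
# stringToUpperCase(5)
```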
> That is the definition of information; is it not? The answer to a yes/no question
No. Self-information of a given outcome with respect to a random variable, which is probably the most common sense of the word, is the negative log of its probability. Shannon entropy, also often called information, is the expected self-information of a random variable. Mutual information is the KL divergence between the joint distribution and the product of the marginals. There are other notions.
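Concretely, a small Python sketch of those three quantities for finite distributions (helper names are mine; base-2 logs throughout):

```python
from math import log2

def self_information(p: float) -> float:
    # Self-information of an outcome with probability p, in bits.
    return -log2(p)

def entropy(dist: dict) -> float:
    # Shannon entropy: expected self-information of the distribution.
    return sum(p * self_information(p) for p in dist.values() if p > 0)

def mutual_information(joint: dict) -> float:
    # I(X;Y) = KL(p(x,y) || p(x)p(y)) for a finite joint distribution
    # given as {(x, y): probability}.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

coin = {"heads": 0.5, "tails": 0.5}
print(self_information(coin["heads"]))   # 1.0 bit for this outcome
print(entropy(coin))                     # 1.0 bit

indep = {(x, y): 0.25 for x in "HT" for y in "HT"}  # two independent fair coins
print(mutual_information(indep))         # 0.0 bits
```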
>Sure, whatever - the point is that they're different objects, and results about one are not results about the other.
Sorry, I don't understand what you mean by "same" and "different".
By "same" do you mean =(x,y).
And by "different" do you mean ¬=(x,y)
Or do you mean something like same(x,y) = { 1 if ??? else 0 }
>A classical function X -> Y is a relation such that if `(x, y1)` and `(x, y2)` hold then `y1 = y2`
You seem to be confusing syntax and semantics here, and the infix notation isn't helping...
What does =(y1, y2) mean?
What does =(x,x) mean?
>No such thing: it's a metatheoretical judgement, not a theorem.
What do you mean? Judgments are first-class citizens in Univalent Mathematics. `a : R` is a judgment that the object a is of type R. This literally means "a proves R", and the particular expression "first-class citizen" has a well-understood meaning in programming language design (https://en.wikipedia.org/wiki/First-class_citizen).
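For instance (Lean here purely as a convenient illustration, names mine; the judgment form `a : R` has the same shape in univalent systems):

```lean
-- The judgment `two_eq : 1 + 1 = 2` says the term `rfl`
-- inhabits, i.e. proves, the proposition `1 + 1 = 2`.
theorem two_eq : 1 + 1 = 2 := rfl
```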
>`stringToUpperCase (5 :: Int) :: String` isn't a false statement, it's just inexpressible nonsense.
So I must be a miracle worker then? Expressing the inexpressible nonsense...
In [1]: def stringToUpperCase(x): return(str(x).upper())
In [2]: stringToUpperCase(int(5))
Out[2]: '5'
>No. Self-information of a given outcome with respect to a random variable, which is probably the most common sense of the word.
Speaking of randomness in a classical setting, this function exists, right?