00:00:22.800
be here too we are going to be talking about the Y combinator this morning how many people here are very
00:00:29.400
familiar with the Y combinator just a very few people
00:00:34.520
very good that means I can say anything I want to about it I've got extensive speaking
00:00:42.320
experience and I've learned a couple of rules particularly about keynote talks I mean keynotes are just a little
00:00:48.920
bit different than your regular everyday session for one thing Keynotes tend to
00:00:54.719
be a little less technical you want to inspire the folks that are
00:01:00.199
attending you want to motivate them so you tend to stay away from the highly technical talks I am glad to say this talk is very technical the second rule is if you're going to be
00:01:12.759
really technical then maybe you should at least make what you're talking about relevant to the day-to-day life of the
00:01:21.280
programmers there so they can take home something that they can use in their work you know use
00:01:27.759
it tomorrow I am glad to say that this talk is extremely pointless and
00:01:33.119
there's very little in this that you can take home and
00:01:38.399
use well if you're not going to be relevant and you're going to be
00:01:45.680
highly technical then at least if you're doing code examples make them examples of good code and good coding practices so
00:01:52.840
you're instilling those good practices in the people so that when they get home at least they've got something out of
00:01:58.479
that I am also happy to say this will be the worst Ruby code you've probably ever
00:02:04.880
seen so what are the redeeming factors of this talk well all I can say is it's going to be a little bit of
00:02:14.480
fun the first thing I'm going to need is a little bit of audience participation so I need someone who has
00:02:22.760
a calculator that can calculate the cosine of an angle in radians do I have
00:02:28.400
any people who might have a calculator to do that anybody any volunteers
00:02:35.080
here oh way in the back there okay yes okay you got a calculator in there
00:02:41.599
go ahead and pull it
00:02:47.159
out oh good he's coming up here excellent
00:02:54.920
excellent okay all right I want you to type in the number zero yep make sure you are in radians mode for cosine
00:03:03.480
okay and then hit the cosine button yep okay you got an answer I do okay hit the
00:03:09.280
cosine button again okay hit it
00:03:14.319
again hit it again keep doing that till I get back to
00:03:22.440
you okay the topic of my talk is called effectively computable that's what we're
00:03:27.959
going to be talking about and this was a topic that was explored by a fellow named Alan Turing you may have heard of
00:03:35.159
Turing he invented something called a Turing machine now a Turing machine is a simple machine that has an infinite tape
00:03:42.760
with cells on it you can write characters into those cells and you can then read back the
00:03:49.920
cell you can erase the cell and write a new character into it and you can move the tape one position at a time and all
00:03:55.599
those functions are controlled by a very simple-minded program that can say if the current cell is
00:04:02.959
a one then shift left and go to this state so very simple-minded programming in fact here is an
00:04:11.120
animation of a Turing machine I so want one of these
00:04:21.639
devices so if you're watching here there is a camera here
00:04:27.000
with a light and you can see it's reading these characters right there it's reading the one it's reading
00:04:32.160
that one there's an eraser head over here so it can erase that one and then
00:04:38.000
write a zero in place of it this is a counter it was initialized with the number
00:04:43.320
11 and now we're up to
00:04:49.400
12 now we're up to 13 and it's a very simple program that
00:04:56.120
just reads the character under the camera and decides whether to change it to a one or a zero and carries the one
00:05:02.160
over and it's doing binary counting so we are up to what looks like 14 now we're up to
00:05:17.160
15 now we're carrying the one all the way up erases the
00:05:24.639
one and now it's running in real time it was actually running at fast speed before this is real time now it sees a zero
00:05:30.840
reads a zero then reads the one erases it writes in the zero it's going to
00:05:37.680
carry the one so it reads over until it finds a blank space and then it carries the one over into the blank space and
00:05:44.319
the program's done that is a Turing
00:05:51.240
machine can somebody build me one of those that's awesome Turing said that anything that
00:05:58.840
is effectively computable can be computed by this really dirt simple
00:06:04.560
machine this Turing machine and what did he mean by anything that can be computed he's generally
00:06:10.120
talking about mathematical functions or logic problems or things like that anything that you can write an algorithm
00:06:15.440
to do you can write that algorithm for a Turing machine and have your Turing machine calculate it out that's kind of
00:06:22.599
mind-boggling and look at the date on this this is in the 1936 to 1937 time
00:06:28.080
frame the very first programmable computer was probably invented in Germany in around 1936 so he's doing all
00:06:35.680
this research in his head and on paper there are no computers that he's using to
00:06:41.400
do any of this research with at all so this is actually pre-computer
00:06:46.440
work that's being done around the same time frame there's this fellow named Alonzo
00:06:53.319
Church and Church designed something called Lambda calculus you might have heard of that Lambda calculus is a way
00:07:00.840
of calculating things it consists of the Lambda character here which introduces a
00:07:06.360
function after the Lambda character comes a variable name here X after
00:07:12.120
the X is a dot that separates the variable name from the body of the function and in this case the body of
00:07:17.639
the function is another X so this is in essence the identity function you pass this function any value and it will
00:07:24.199
return that very same value so this is a very very very simple function in Lambda
00:07:29.280
calculus this is all there is to Lambda calculus
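[For reference, a minimal Ruby sketch of the identity function just described; the name identity is ours, not from the talk:]

identity = ->(x) { x }   # Ruby analogue of the Lambda calculus term λx.x
identity.(42)            # => 42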
00:07:35.520
so there are no strings there are no integers there are no floating
00:07:41.479
Point numbers in Lambda calculus there are only functions and the calling of
00:07:46.639
those functions so how do we represent interesting things in it well here's one particular scheme you might use to
00:07:52.800
represent some numbers let's look at the number one over there and you see it's a Lambda function that takes an argument F
00:07:59.199
and it returns a Lambda function that takes an argument X in essence in my mind I kind
00:08:05.960
of group this together as a single function taking two arguments and it takes the function f
00:08:12.360
and applies it it calls it on the value X so it calls the function
00:08:17.400
f passing in X one time and that represents the number one the number two
00:08:23.879
is represented by a two argument function that takes a function f and a value X and it calls the function
00:08:30.000
f twice on the value X so that represents a two zero calls the function
00:08:36.279
f zero times on the value and this is how we represent numbers we can represent any
00:08:41.560
number you want to just by applying the function f multiple times to that value
00:08:47.880
and this is called a Church numeral this is how you represent numbers in Lambda calculus how might you
00:08:53.880
represent true and false at the bottom of the screen you can see true is a function of two variables and true
00:09:00.680
returns the first argument false is also a function of two variables it returns the second argument so true is
00:09:08.480
kind of like an if then else that returns the then part and false is an if then else that returns the false
00:09:13.839
part so that's how you represent true and false in Lambda calculus
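[A minimal Ruby sketch of the Church numerals and booleans just described; the constant names are ours, chosen only for illustration:]

ZERO  = ->(f) { ->(x) { x } }          # apply f zero times to x
ONE   = ->(f) { ->(x) { f.(x) } }      # apply f once
TWO   = ->(f) { ->(x) { f.(f.(x)) } }  # apply f twice

TRUE  = ->(x) { ->(y) { x } }          # returns its first argument
FALSE = ->(x) { ->(y) { y } }          # returns its second argument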
00:09:21.240
so the keys to Lambda calculus are that functions are the only data type no integers no strings no floating point no arrays nothing else but functions Lambda
00:09:29.480
binding is the only way you can associate a value with a variable name
00:09:35.760
and you do Lambda binding by calling the function on a value and that will bind the value to the variable the
00:09:43.160
argument in the function and then you expand the body of the function using that substitution and all calculation in
00:09:49.440
Lambda calculus happens by a process called beta reduction there's also
00:09:54.680
another process called alpha reduction but that's not the important one the important one is beta reduction
00:10:00.160
let me give you an example of beta reduction here we have the Lambda calculus representation of true and
00:10:08.240
it's being called on the number one and then the result of that is being passed in the number zero and we're passing in
00:10:14.440
the Lambda calculus versions so to do a beta reduction you take the function in question you identify the
00:10:21.240
variable X and you expand the body of the function out and then you replace
00:10:27.120
anywhere in the body where there is an X variable you replace that with the argument so the argument one gets
00:10:33.200
substituted in for the X and this is the final result of that beta expansion let's do one more beta expansion
00:10:39.839
expanding the result of this with the argument zero so we identify the argument of the function is
00:10:46.360
y uh we expand the body of the function out and we brought it down below that's the red circle there and then we
00:10:53.600
substitute in zero for the variable y in the body and there happens to be no y in
00:10:59.480
the body so we're done that's it so if you pass true a one and a zero it's
00:11:04.639
going to return a one that makes sense true is a function that Returns the first
00:11:10.839
argument so that's beta reduction in Lambda calculus and you can see that Lambda calculus is nothing more than
00:11:17.800
macro expansion it's merely text substitution that you take the arguments
00:11:23.399
and you substitute them in and expand them out this is really really simple macro expansion here
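[A small Ruby sketch of the same reduction, reusing the TRUE, ONE and ZERO lambdas sketched above; the to_i helper is ours, added only to inspect the result:]

TRUE.(ONE).(ZERO).equal?(ONE)              # => true: TRUE selects its first argument
to_i = ->(n) { n.(->(i) { i + 1 }).(0) }   # count how many times a numeral applies f
to_i.(TRUE.(ONE).(ZERO))                   # => 1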
00:11:30.000
Church said anything that is effectively computable can be computed via Lambda calculus and I hope you're
00:11:37.040
kind of scratching your head here because I just said Lambda calculus is nothing more than just simple macro
00:11:42.720
expansion but Church actually made his thesis he beat Turing to the punch he said this in around
00:11:49.279
1933 to 1935 he wrote a couple of papers on this topic so he beat Turing by a couple
00:11:54.880
years in fact when Turing's paper came out and Turing said a Turing machine can calculate anything that's effectively
00:12:01.360
computable there was a long debate in the mathematical circles whether the two sets of things that were effectively
00:12:07.600
computable were actually the same thing or not and generally today we agree that
00:12:12.639
Lambda calculus and Turing machines and then there was a third thing called general recursion that was also done
00:12:19.000
all of those things are Turing complete in other words anything that can be calculated can be
00:12:25.720
calculated in any of those three systems Lambda calculus General recursion or
00:12:31.560
Turing machines now we see the influences of Lambda calculus today if you look at our
00:12:37.720
languages Lisp has lambdas all over it Clojure has an fn anonymous function
00:12:42.880
that is essentially a Lambda in Ruby we like lambdas so much we have several ways of doing
00:12:49.920
them CoffeeScript has a very concise way of representing lambdas and
00:12:55.399
JavaScript itself has lambdas as well anonymous functions are nothing more
00:13:01.639
than lambdas or closures if you went to Pat's talk yesterday on blocks you learned what a closure was and
00:13:07.480
all these things are closures or lambdas okay so let's put that
00:13:13.839
aside and remember that even though it seems hard for us to believe Lambda calculus is Turing complete and there's
00:13:20.959
a problem with that that we'll get to in a little bit but first I want to return to the topic where's
00:13:26.320
my volunteer back here oh okay what number did you get after pressing cosine
00:13:31.639
many many times
00:13:37.880
0.7398 very good thank you very much I really appreciate
00:13:48.959
that it turns out if you take the cosine of zero that's a one if you take the cosine of one that is about 0.54 blah blah blah
00:13:56.839
and if you keep doing that over and over again and if you do that about 89 or 90 times it kind of depends on
00:14:04.360
how precise your floating Point math is you'll start zeroing in on a number that
00:14:10.000
is 0.7390851 blah blah blah this is a
00:14:15.680
fixed point of cosine that means if I give that 0.739 number to the
00:14:21.399
cosine function in radians it will return that exact same number and we have converged on this so a fixed point is
00:14:29.800
nothing more than a value such that if you give it to a function the function returns the same value then that is a fixed
00:14:36.000
point it turns out lots of functions have fixed points okay here's the formal
00:14:41.199
definition of a fixed point so you can see if you give X to a function and it returns X then X is a fixed point of that
00:14:48.000
function for cosine cosine is a fun fixed point because you get to zero in on
00:14:53.440
it just by hitting that cosine button a lot but other functions have actually much simpler fixed points for example
00:14:59.279
sine has a fixed point of zero the sine of 0 is zero so very simple a square
00:15:05.320
root turns out it has two fixed points the square root of 0 is zero and the square root of one is one so both zero and one
00:15:12.519
will be fixed points of the square root function if you had an identity function any value you gave to an
00:15:19.000
identity function would be a fixed point for the identity function so there's an infinite number of fixed points for that
00:15:24.759
function and there's certainly some functions that have no fixed points at all
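[A quick Ruby sketch of the cosine experiment; the exact digits depend on floating point precision:]

x = 0.0
100.times { x = Math.cos(x) }
x                        # => roughly 0.7390851332151607
(Math.cos(x) - x).abs    # effectively zero: x is a fixed point of cosine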
00:15:30.959
okay so keep the idea of a fixed point in your head we're going to return to that and we're going to switch now to the
00:15:38.800
live coding part of the demonstration okay very good
00:15:45.680
you guys are familiar with the stabby procs in Ruby 1.9 right and
00:15:52.800
you call them by saying dot parenthesis that's at least one way to call them I'm going to put this into my codebase right
00:15:59.160
here so I can just type in something and the last thing in this list of Expressions when I evaluate the buffer
00:16:04.399
will just return the last thing in that expression right there so that's just going to be a
00:16:10.360
convenient way for me to evaluate a buffer and print things out
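[For readers following along, a sketch of the stabby proc syntax he is referring to; the name double is ours, used only as an example:]

double = ->(n) { n * 2 }   # "stabby" lambda literal, Ruby 1.9+
double.(5)                 # => 10, calling with .( ); double.call(5) also works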
00:16:15.519
so let's talk about the problem with Lambda calculus and I'm going to demonstrate it by writing a factorial
00:16:22.360
function and we'll use the stabby proc notation so factorial is a function of
00:16:27.680
one argument if that argument is zero then the result of factorial on zero should be one if it's
00:16:35.440
not zero the result of factorial should be n * the factorial of n minus one and
00:16:42.160
I can prove that works by calling factorial on five and we know the factorial of five is 120 very good and indeed that's what
00:16:51.199
we get all right but in Lambda calculus I'm not really allowed to assign things
00:16:57.839
like this I have to create a Lambda expression and then call it directly so I'm going to try to call my factorial
00:17:04.520
problem like this and it says oh I don't have a method called fact
00:17:12.079
defined anywhere and that's because I'm using fact in the definition of the function I'm defining
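[A sketch of the two versions being contrasted here; the assignment-based one works, the pure anonymous attempt does not:]

fact = ->(n) { n.zero? ? 1 : n * fact.(n - 1) }
fact.(5)   # => 120, but only because the name fact is assigned outside the lambda

# ->(n) { n.zero? ? 1 : n * fact.(n - 1) }.(5)
#   without the assignment above this raises NameError: undefined local variable `fact`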
00:17:20.039
well in Lambda calculus all functions are anonymous so how in the world can I
00:17:25.600
define that factorial without referring to itself and that's a problem so
00:17:31.280
how can Lambda calculus possibly be Turing complete I'm going to leave that
00:17:37.280
comment up there and just think about that for a while and there's about four
00:17:42.679
topics three or four topics I want to cover first before we go back and solve this problem the first thing we've
00:17:49.120
already talked about the first thing is fixed points so you guys know what a fixed point is it's a value that when given to a
00:17:55.799
function the function returns that same value again the next thing I want to do is talk about higher order
00:18:05.080
functions now how many people here are familiar with higher order functions yeah I suspect most of us are the rest
00:18:11.840
of us who are not probably use them all the time and don't really realize it in Ruby each in its own
00:18:18.000
sense is a higher order function let's talk about regular functions first I'm going to write a function called add
00:18:23.799
one that takes a number and adds one to it and I can demonstrate that add one works by giving a 10 and I should get an
00:18:30.960
11 out of that so add one is a very simple
00:18:36.080
function um let's do another simple function just for demo purposes it's going to be multiply by three we'll take
00:18:42.320
n and multiply it by three and we'll multiply by three uh the result of
00:18:48.080
adding one to 10 and we should get 33 out of that so add one and mul three
00:18:53.320
are both basic simple functions they take a value they return a value let's write a higher order function
00:19:01.640
now
00:19:08.000
oops make Adder is a function that takes an argument X and what it's going to
00:19:14.360
return is another function that takes an argument n that adds X and N
00:19:23.600
together so make Adder is a function that creates a new function given the X
00:19:29.880
it will take that X and create a new function that will add X to anything that you give to the new function so I could
00:19:35.960
actually have Rewritten add one like this make
00:19:41.200
Adder one and that's a decent definition of
00:19:46.600
add one and we can evaluate our buffer and see we get the 33 back so add one obviously must be working so make Adder
00:19:54.760
is a higher order function it's higher order because its return value is a function itself that's all it means to
00:20:01.120
be a higher order function it returns functional values
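[A minimal sketch of the functions built so far, roughly as live-coded; names follow the talk:]

add_one = ->(n) { n + 1 }
mul3    = ->(n) { n * 3 }
mul3.(add_one.(10))                      # => 33

make_adder = ->(x) { ->(n) { n + x } }   # higher order: it returns a new function
add_one = make_adder.(1)                 # same behavior as the literal add_one above
mul3.(add_one.(10))                      # => 33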
00:20:08.400
here's another higher order function we'll call it compose compose takes functions as arguments two functions in this case
00:20:14.000
f and g it's also going to return a function it's going to compose F on top
00:20:21.120
of G called on n and return a new function that calls
00:20:27.360
f and g on top of each other we can now create a function called mul3 add1
00:20:32.400
that is the composition of mul3 and add
00:20:37.720
one then we can use that newly composed method right here and we see we get the
00:20:43.679
same answer 33 back compose is a higher order function it's higher order for two
00:20:49.799
reasons it returns a function but it also takes functions as arguments and that's enough to make it a higher order
00:20:55.240
function as well
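[A sketch of compose as described; the two-argument signature here is an assumption about the exact form used on screen:]

compose = ->(f, g) { ->(n) { f.(g.(n)) } }   # higher order: takes and returns functions
mul3_add1 = compose.(mul3, add_one)
mul3_add1.(10)                               # => 33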
00:21:01.720
we are going to use higher order functions a lot in the rest of this demo so if you've never seen them before just get comfortable with the fact that functions are values and
00:21:07.520
we can manipulate them just like we can manipulate strings and integers and whatnot all right higher order
00:21:14.159
functions next topic will be uh
00:21:19.279
functional
00:21:26.200
refactorings and just like in the object-oriented World there are refactorings that we can do like extract
00:21:33.559
method or collapse hierarchy there are a number of refactorings that you can do in the functional world
00:21:40.400
as well and they are purely functional in flavor the first one I want to talk about is the Tennent correspondence
00:21:48.159
principle
00:21:53.880
okay Tennent's principle says if I have an expression X and I surround it with a
00:21:59.679
Lambda and then immediately call that Lambda I get X back it essentially has
00:22:05.559
no effect it's a true refactoring it does not affect the code so let's demonstrate that let's go down here to
00:22:12.080
mul3 and inside mul3 I'm going to wrap this
00:22:19.880
expression inside a Lambda and then immediately call the Lambda that's the Tennent principle and
00:22:26.400
we see that it has no effect on the outcome we still get 33 it is a true refactoring
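[A sketch of the Tennent correspondence refactoring applied to mul3:]

mul3 = ->(n) { n * 3 }
# wrap the body in a lambda and call it immediately: behavior is unchanged
mul3 = ->(n) { ->() { n * 3 }.() }
mul3.(add_one.(10))   # still 33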
00:22:33.600
let's do it one more time I'm going to wrap this whole functional expression
00:22:39.440
in a Tennent Lambda and this time I'm going to do it with an editor macro just to show that these things truly
00:22:46.360
are mechanical there my editor did it for me
00:22:51.960
and yes we get 33 out of it so the Tennent principle is very useful in functional refactoring our next
00:22:58.960
refactoring will be to introduce a binding so if I have a Lambda here with
00:23:06.919
no arguments I can add a new argument binding to it uh we'll call it uh
00:23:13.559
Z and then if I add an argument here when I call it I have to give it a value and I can give it any value that I feel
00:23:20.520
like because Z is not used anywhere it doesn't matter what value it takes so I
00:23:26.840
can introduce the argument Z and evaluate that and it's still 33 no
00:23:32.400
effect now there's one rule you have to follow and let's uh demonstrate that
00:23:38.960
here you have to make sure that
00:23:45.600
any variable that you introduce is not free in the expression you're wrapping so down in here we've got
00:23:53.480
variable n n is not free because it's bound right here so I could actually reuse n it'd be confusing but I could
00:23:59.760
but f and g are both free variables in this expression they're defined
00:24:05.240
outside so if I were to take F in here and give it an arbitrary value it would
00:24:12.000
die whereas if I called it n that would work generally
00:24:19.159
I'm going to avoid using any used variable at all so xyzzy is a good
00:24:24.799
choice there and that works but as long as you stay away from free variables in the expression you're wrapping it
00:24:30.799
doesn't matter what the variable name is
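[A sketch of introducing an unused binding, plus the free-variable caveat just demonstrated:]

mul3 = ->(n) { ->(z) { n * 3 }.(123456) }   # z is unused, so any value will do
mul3.(add_one.(10))                         # still 33

compose = ->(f, g) {
  ->(n) { ->(xyzzy) { f.(g.(n)) }.(42) }    # fine: xyzzy is not free in the wrapped body
  # ->(n) { ->(f) { f.(g.(n)) }.(42) }      # broken: f is free in the wrapped body
}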
00:24:37.640
number three will be a wrap function if I have an expression that is a function of one argument I can wrap
00:24:43.600
that in a Lambda of one argument and then call the function inside that Lambda and pass the
00:24:50.679
argument down to the call site let's demonstrate here is a function of one
00:24:55.960
argument I can wrap it in a brand new Lambda with an argument of
00:25:01.200
V there it's wrapped then on the inner function that I just wrapped I need to call it immediately and pass V in this
00:25:09.880
essentially what I'm doing is make adder now returns a Lambda that has the same
00:25:15.360
signature as the function it was previously returning but it doesn't actually evaluate the Lambda on the
00:25:21.840
inside until you call it again so it's kind of like a delaying tactic if you don't want to evaluate something right
00:25:27.159
away this is something you can do to do that and you can tell I still get 33 out of the result so function wrapping is a
00:25:34.000
true refactoring
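[A sketch of the function-wrapping refactoring on make_adder; same observable behavior, evaluation merely delayed:]

make_adder = ->(x) {
  ->(v) { ->(n) { n + x }.(v) }   # wraps the one-argument function it used to return
}
make_adder.(1).(10)               # => 11, exactly as before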
00:25:39.120
one final one you'll love this one this is inline definition I can take any definition here in this case the
00:25:46.039
definition of make adder and wherever make Adder is called by name I can
00:25:51.440
replace the name of make adder with the
00:25:56.919
definition and let's delete this line here and when I evaluate that same
00:26:04.200
result again this is totally mechanical I can take compose and run the inline
00:26:10.360
refactoring on that say yes replace that I get the same result back oh let's
00:26:17.919
let's go wild with this one inline add one uh no that's part of
00:26:24.399
the name but here's add one yes let's inline mul3
00:26:31.600
there okay one more time
00:26:46.279
oops uh I did promise you the worst Ruby code you'd ever see
00:26:51.520
right that mess let's evaluate it still returns 33
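[One small instance of the inline-definition refactoring, to suggest what the fully inlined mess looks like; the full on-screen expression is not reproduced here:]

add_one = make_adder.(1)
# inlining make_adder at the call site:
add_one = ->(x) { ->(n) { n + x } }.(1)
# repeating this for every name eventually leaves one big lambda expression,
# with no assignments at all, that still evaluates to 33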
00:27:03.200
now the really interesting thing about this this is a pure Lambda expression there are no assignments anywhere in
00:27:10.360
this the only binding that we do to variable names is through calling these Lambda functions and um we do a lot of
00:27:17.960
calling the Lambda functions and that's really ugly and very unreadable and I don't recommend writing code like this but it proves a very important
00:27:24.320
principle this is the goal of what we want to get to in Lambda calculus which is to
00:27:30.520
have one big Lambda expression that does this calculation okay enough of the
00:27:39.960
preliminaries seems a shame to write all that just delete it doesn't it let's get back to the problem of
00:27:47.080
recursion and let me grab my factorial thing and let's paste it back here and
00:27:53.080
uncomment so this is what we're dealing with I want to write a factorial function that's recursive
00:27:59.320
and I can't because I cannot refer to factorial inside the definition of factorial because there are no
00:28:05.960
names for these functions well actually there are names they're introduced by Lambda binding so I
00:28:12.440
actually could do this I could say let's create a function that has factorial as an
00:28:17.880
argument and now factorial will be bound to this and let's call this
00:28:26.840
make factorial okay and then all I have to do to create the factorial function is to call make
00:28:34.840
factorial and all I need to do is pass in the definition of factorial think about that for just a
00:28:43.480
sec okay I haven't solved the problem yet I've just introduced some indirection
00:28:50.519
there okay so that's not going to work because to make a factorial I need the definition of factorial a little bit of
00:28:56.640
circular logic there I need to do something else well okay maybe I can relax the requirements a little
00:29:03.039
bit maybe I don't need a make factorial maybe I need a function that is a factorial improver and instead of taking
00:29:10.440
the definition of factorial as an argument it will take a partial definition of factorial as an argument
00:29:17.480
and by partial I mean a function that acts like factorial over a subset of the
00:29:23.720
possible inputs in other words if I have a factorial function that works from 0
00:29:29.760
to 10 it can calculate the factorial of 0 1 2 3 on up to the factorial of 10 if
00:29:35.320
I pass that partial definition of factorial into my fact improver I will
00:29:40.399
get out of that a new function that works for factorial from 0 to 11 it
00:29:47.039
improves any factorial definition by one step
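[A sketch of the factorial improver as described; it takes a partial factorial and returns one that works for one more input:]

fact_improver = ->(partial) {
  ->(n) { n.zero? ? 1 : n * partial.(n - 1) }
}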
00:29:54.039
that's kind of cool so I need an error function here so bear with me while I
00:29:59.880
write this okay this is just a function of one
00:30:07.799
argument that throws an error so we can see what's happening now if I take the
00:30:16.200
factorial improver and I call it on error I claim I will get out of
00:30:23.799
this a function I'll call F0 because F0 correctly calculates the factorial of
00:30:29.679
zero
00:30:36.240
oops oh I left this five on there that five should not be in
00:30:41.799
there yes it correctly calculates the factorial of zero all right
00:30:49.840
that's interesting oh but if I can calculate the factorial of zero I can create a
00:30:55.480
function that calculates the factorial of one
00:31:03.440
yeah but F1 does not calculate the factorial of two oh but if I've got one that calculates
00:31:10.559
one I can write an F2 based on
00:31:16.880
F1 that works for two does it work for three no not at all
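[A sketch of the error seed and the chain of improved functions just demonstrated:]

error = ->(n) { raise "SHOULD NEVER BE CALLED" }
f0 = fact_improver.(error)   # correct for 0 only
f1 = fact_improver.(f0)      # correct for 0 and 1
f2 = fact_improver.(f1)      # correct up through 2
f0.(0)   # => 1
f2.(2)   # => 2
f2.(3)   # blows up: it falls through to error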
00:31:24.320
okay you can kind of see where I'm going with this maybe before I go any further let's inline
00:31:29.440
some of these things so I'm going to do the uh inline refactoring there and
00:31:35.519
there okay and let's call this FX right now not tie it to a particular number
00:31:41.240
and then I'm going to take this from here to here I'm going to cut it and
00:31:49.080
paste it one two three four five times 1 2 3 4 five to balance the
00:31:56.519
parentheses and now FX calculates the factorial of five it'll do 10 it'll do
00:32:05.600
14 but it will not do 15 well I'm getting closer to my
00:32:15.919
goal all I have to do is decide the biggest number I will ever want the factorial
00:32:24.399
of and call fact improver on it that many times okay maybe that's not quite so practical
00:32:30.200
that's not going to get us to the real factorial function okay let's get rid of that let's think about this well first
00:32:37.120
before I go any further let's go ahead and do the whole embed-this-in-a-Lambda trick so I'm going to create a
00:32:42.399
Lambda uh function that expects a factorial improver as an argument and
00:32:49.279
here's the body of that function oops it should be open
00:32:56.240
parenthesis and close parenthesis so here is the body of the
00:33:01.679
function here is where I'm passing in the argument I'm passing in the factorial improver let's just call it
00:33:07.799
improver for right now and then in here I can do the same tricks I was
00:33:13.240
doing elsewhere I can say improver dot improver dot
00:33:18.360
error and if I put this into FX and FX should work for one but it won't work
00:33:25.600
for two nope okay so this is exactly the same
00:33:30.799
thing I did before except I've embedded it in a Lambda and binding the improver instead of assigning to improver I'm
00:33:37.639
binding to it with Lambda binding um which is what we want to get to
00:33:42.880
anyway improver is interesting improver says I can take any function and improve it by one step I was using this error
00:33:50.679
thing which obviously has nothing to do with factorial at all but I'm wondering if I replaced
00:33:58.639
error with just improver I should get a function that
00:34:05.399
works for factorial of zero it
00:34:11.679
does but if I call factorial of one it should fail in a weird way and the weird
00:34:16.720
way is that it says that a Proc can't be coerced into a Fixnum and the error happens right here on this line right at
00:34:24.560
this multiply where it's taking n times that and this partial now is the
00:34:31.320
improver function and improvers expect functions not numbers so of course it's not working at all in fact we can make
00:34:38.359
this very much plainer if I replace the word partial with improver here since
00:34:45.359
I'm oops sorry about that back down here since I'm passing in
00:34:50.720
improver to improver and this is improver so let's call it improver as
00:34:57.160
the argument I'm actually doing that and this becomes very plain that improver does not take a numeric argument improver
00:35:03.560
expects a function oh I've got a function laying
00:35:09.320
around let's pass it
00:35:15.920
improver gosh I wonder if this could possibly work well factorial of one is
00:35:22.359
good how about factorial of two oh how about five
00:35:28.680
oh how about 50 how about
00:35:34.839
500 I think I go up to 2,000 before I overflow my stack yeah so there
00:35:40.440
factorial of 2,000
00:35:46.000
yep and here you go let's make this all the way pure Lambda
00:35:51.359
calculus by defining the function and then calling it with the argument five at the end here and we will have a pure
00:35:57.319
Lambda expression that calculates factorial
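[A sketch of roughly where the expression has ended up at this point: one pure lambda expression, no assignments, with the improver swallowing itself:]

->(improver) {
  improver.(improver)
}.(
  ->(improver) {
    ->(n) { n.zero? ? 1 : n * improver.(improver).(n - 1) }
  }
).(5)   # => 120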
00:36:11.000
now this is good enough for our goal of proving that Lambda calculus
00:36:17.200
can actually calculate arbitrary mathematical functions uh recursion is
00:36:22.760
no problem for it by doing this little trick of passing these things to themselves
00:36:29.040
you actually can do recursion with Lambda calculus even
00:36:34.560
though Lambda calculus is nothing but anonymous functions this is mind-blowing to me
00:36:40.440
actually but I'm still not quite happy with this definition here there's
00:36:48.359
two things I don't like about it first of all I'm calling this thing improver and it's no longer improver improver
00:36:55.560
took a partial definition this function is taking something that is not
00:37:01.240
a partial definition so this thing here should not be called improver at all should be called something else so let's
00:37:07.599
replace the name improver with a different name and here I get stuck because I have no common name for the
00:37:15.000
thing that this argument now is it's no longer an improver function I'm
00:37:20.760
going to go with the name gen because it's a generator that
00:37:25.960
somehow when you feed it itself when it swallows itself it generates a
00:37:31.520
recursive function um if we can live with that that's what I'm going to call it for the purposes of this
00:37:37.599
demonstration and I can see that I haven't changed anything we still have a working factorial system there so that's
00:37:43.680
the number one thing I didn't like the second thing is that if I were to sit down and write a recursive function it
00:37:51.040
would never in my entire life occur to me to write it in this manner there's
00:37:56.240
lots of mechanics right here involved in just to
00:38:02.680
do the recursion and if I could get this in a form where I was writing factorial as an improver
00:38:09.599
function that would make a lot of sense to me an improver function seems more or less natural at least in the Lambda calculus
00:38:15.359
world if I could get back to the improver form of this I would be really glad so let's see if we can work
00:38:22.119
on this code base just a little bit I'm going to take this inner Lambda right here and I'm going to do the Tennent
00:38:29.920
correspondence principle refactoring on it there we go let's break that up just
00:38:36.400
a little bit here so I wrapped this in a Lambda and immediately called the Lambda
00:38:43.480
and it has no effect it does the same thing so we still have a working factorial didn't break anything now I'm
00:38:49.040
going to introduce a binding and uh the variable name I'm
00:38:55.240
going to use is code that's the name of the variable and the value I'm going to pass in will be my error function
00:39:02.000
because remember when you introduce a binding the value doesn't matter it can be anything so
00:39:08.920
there I've got a function that takes an argument of code and I'm passing in the error function and we're
00:39:15.040
not using code anywhere so we never trigger the error and it still returns 120 for the factorial of 5 okay well I
00:39:23.640
can pass anything in let's pass in gen hey that still works too oh this you
00:39:29.400
know what this whole gen of gen thing is actually a function let's call gen of
00:39:36.000
gen and uh oops stack too deep this is what has
00:39:43.280
happened I'm calling gen of gen inside of here and so this entire thing let me
00:39:51.440
mark this this entire thing right here is the gen function so when I call gen
00:39:56.839
and pass in gen um it goes and it would call gen of
00:40:02.599
gen inside of here but it only called it if n was not zero so it actually
00:40:10.119
bottomed out it would actually terminate here I'm passing in gen of gen and it always calls gen of gen even when n is
00:40:18.000
zero so it has infinite recursive depth in there and that's no good oh if there were
00:40:23.960
only some way of delaying evaluation let's wrap it in a
00:40:31.359
function here's my wrap function argument name will be V so there I have
00:40:37.359
taken gen of gen and replaced it with a Lambda so when I call it it doesn't
00:40:42.920
infinitely recurse it just passes the Lambda in its place that is functionally equivalent to calling gen
00:40:49.280
of gen and we see this actually works okay next step I got code code is
00:40:58.520
bound to that Lambda that evaluates the gen of gen and we know that here let's go
00:41:05.480
ahead and wrap this as well I see right here from here to here
00:41:13.119
this is the exact same thing that I'm passing in to code so let's just call code
00:41:21.960
instead that works
00:41:27.640
that's interesting let's rename code to partial and all of a sudden out of
00:41:34.280
nowhere that little piece of code right there is our improver function and we've got it
00:41:40.040
back cool but it's still buried in the middle of all this recursive stuff let's see if
00:41:45.839
we can pull it out so out here from here down to here not including the argument
00:41:51.359
call but the entire body of the function I'm going to do a Tennent refactoring again that wraps it in a Lambda
00:41:58.520
Let's uh put some line feeds in and pretty up the indentation a little bit
00:42:04.160
there there we go and I'm going to introduce a new binding here again introduce binding and
00:42:12.680
the binding will be code the value will be error again because that worked so well for us last
00:42:18.599
time and that still works that's still 120 okay now I'm going to point out that
00:42:25.079
here is the improver I'm going to copy the improver from here and call
00:42:36.319
oops pass in the improver as the value of
00:42:44.040
code and that still works because we're not using Code but I point out that code is now the same thing as the improver
00:42:50.800
function embedded in here I can replace this now with code and it still is 120
00:42:58.440
and now let's rename code to be
00:43:03.800
improver oops rename code to be
00:43:10.640
improver I was practicing this this morning I was typing improver so often all of a sudden I started reading
00:43:16.680
it as imp rover and yeah replace those two with
00:43:24.400
improver and we still get 120 out of that and there we have a
00:43:30.720
complete Lambda expression here this part of the code right here deals with the recursive
00:43:37.400
nature of the problem this piece right here is the only piece that knows
00:43:42.520
anything about the definition of factorial so this is really nice this is exactly what I want and now that I
00:43:49.160
got a complete Lambda expression that does this I'm going to pull the pieces out and name them so we can talk about
00:43:55.040
them reasonably so let's pull this out I'm going to call this uh fact improver
00:44:01.520
again improver and let's put it up
00:44:09.839
here fact improver is that nice
00:44:16.040
indentation uh still works no problem let's take out this piece right here I'm
00:44:22.119
going to call this oops I'm going to call this piece y
00:44:30.280
get some nice indentation going on on that oh it's already okay very good and
00:44:35.599
then I'm going to take this out here and I'm going to call this piece fact or
00:44:41.760
factorial and then we'll call factorial right there everything still works and
00:44:47.400
I'm going to clean this code base up just a little bit here streamline it just a wee bit
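[A sketch of the three named pieces at this point in the derivation, roughly as they appear on screen:]

fact_improver = ->(partial) {
  ->(n) { n.zero? ? 1 : n * partial.(n - 1) }
}

y = ->(improver) {
  ->(gen) { gen.(gen) }.(
    ->(gen) { improver.(->(v) { gen.(gen).(v) }) })
}

factorial = y.(fact_improver)
factorial.(5)   # => 120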
00:44:53.200
okay and so now I've got three pieces let's start with the bottom and work our
00:44:59.079
way up so here is
00:45:04.119
factorial this is the real factorial function and I prove it by calling factorial of 5 and it actually
00:45:10.119
does calculate the factorial 5 if I were to take the factorial function and run
00:45:15.440
it through the fact
00:45:21.280
improver what will come out of that
00:45:27.200
well fact improver takes a partial function and improves it by one step but if I've already got the real factorial
00:45:32.920
function all improver can do is return the factorial
00:45:38.440
function this is indeed the factorial function and I can
00:45:44.319
run this I still get 120 so we're using this definition of factorial now not
00:45:50.359
this one and so that proves that fact improver returns its argument factorial
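[A sketch of the check he just ran: feeding the real factorial back into the improver returns something that behaves exactly like factorial, which is what it means to be a fixed point:]

factorial = y.(fact_improver)
improved  = fact_improver.(factorial)
improved.(5)   # => 120, same behavior as factorial itself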
00:45:57.400
let's write this down factorial is the fixed point of fact
00:46:08.359
improver I'm going to say imp rover now in my head every time I do that factorial is the fixed point of fact
00:46:16.400
improver higher order functions have fixed points as well their fixed points happen to be functions but that doesn't
00:46:23.319
change the fact that they're fixed points well that's interesting now I'd
00:46:28.720
like to point out that the function y calculates
00:46:35.520
the fixed point of an improver
00:46:42.040
function in this case we're calculating the fixed point of the fact improver but I could write any recursive function I
00:46:48.359
wanted to and it would calculate the fixed point of that recursive function as long as it's done in an
00:46:55.280
improver style I could write a Fibonacci improver and it
00:47:01.079
would work perfectly well so y calculates that so let's talk about y so y here is actually the Y combinator
00:47:09.280
that we've all heard about and loved well maybe not loved in particular there's
00:47:16.640
actually many many many versions of the Y combinator and if you go to Wikipedia you will not see this version at all
00:47:24.280
um first of all mathematicians don't like intention revealing
00:47:31.480
names so they call improver F oops too far and they call the generator function
00:47:40.520
I've seen it called G I've also seen it called x uh the Wikipedia article uses X
00:47:45.880
so we will use that here and uh so this is probably more likely the form of the
00:47:51.880
Y combinator you'll see with variable names f and x in it but we haven't changed anything we're still
00:47:57.680
calculating factorial uh the other issue here is this in particular is the
00:48:04.640
applicative order Y combinator Ruby
00:48:11.640
is an applicative language in other words it evaluates its arguments before it passes the arguments down into
00:48:17.079
functions Ruby is applicative Python is applicative Lisp and Clojure are
00:48:23.960
all applicative languages examples of languages that are not applicative would be Haskell Haskell is lazy it will
00:48:30.640
evaluate its arguments only at the point it really needs them so there is a version of the Y combinator for normal
00:48:37.440
order languages like Haskell and the difference is that you don't have to introduce this wrap function thing right
00:48:45.119
here that we had to do to prevent our infinite stack recursion so that would be the normal order um Y combinator and
00:48:53.440
when people talk about Y combinators they normally mean the normal order one in the mathematical sense in a language
00:48:59.319
like Ruby we need to use the applicative order version instead the applicative order has another name it's called the Z
00:49:06.359
combinator why these things have so many names I have no idea it's also known as the fixed-point
00:49:13.640
combinator because it is the combinator that calculates fixed points of functions so there you have it
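[A sketch of the distinction: the textbook normal-order Y, which only works in a lazy language, versus the applicative-order wrapper used here; the commented call would overflow the stack in Ruby:]

y_normal = ->(f) { ->(x) { f.(x.(x)) }.(->(x) { f.(x.(x)) }) }
# y_normal.(fact_improver)   # SystemStackError in Ruby: x.(x) is evaluated eagerly
# wrapping the self-application as ->(v) { x.(x).(v) } delays it until a value
# arrives, which is what makes the applicative-order (Z) combinator work here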
00:49:22.440
oh one more thing if you go to the Wikipedia page you will not find it in exactly this
00:49:28.000
format there's a preferred style they like to write it in and I'll show you that through two refactorings here
00:49:35.040
remember f is really the name of the improver function so if I call f on this x of x thing I'm just improving the
00:49:42.760
factorial function and that is a no-op in this case so this still returns 120 no
00:49:48.160
change and then I can also do a function wrap here
00:49:56.960
and we know function wrap doesn't change anything and we can see it doesn't and then if I take off two levels of
00:50:02.799
indentation all of a sudden you see a great symmetry between the body of the function and the argument you're passing
00:50:09.400
in and this is the version you'll probably see if you look it up in Wikipedia or some math or
00:50:15.040
computational science text so there you have it there is the Y combinator derived from first principles
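[A sketch of the symmetric, Wikipedia-style applicative-order form he ends with:]

y = ->(f) {
  ->(x) { f.(->(v) { x.(x).(v) }) }.(
  ->(x) { f.(->(v) { x.(x).(v) }) })
}
y.(fact_improver).(5)   # => 120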
00:50:34.040
wow my brain is dead after that one I started on this journey because I always heard about the Y combinator and
00:50:40.720
I really really really wanted to understand it now people ask oh that's cool how can I use the Y combinator in
00:50:47.160
my own code well if you have named
00:50:52.280
functions you don't need the Y combinator if you want to write all your code as anonymous functions then maybe
00:50:58.040
the Y combinator would be useful but I don't really recommend that what I found interesting though is that by starting
00:51:04.680
with some very basic principles and applying these refactorings I found the functional refactorings to be very
00:51:10.799
fascinating and by applying that you could take something in one shape and transform it and get it into a different
00:51:17.079
shape that's more appropriate for the needs that you have I've really enjoyed playing around with those
00:51:23.200
refactorings in that exercise so now you can go home and you can explain
00:51:29.040
to all your co-workers that you've seen the Y combinator you understand how it's
00:51:36.240
done I think the Y combinator is one of those examples of great beauty in mathematics it's a really interesting
00:51:43.480
function that is theoretical in a lot of respects but comes out of something like
00:51:49.040
Lambda calculus which seems trivial uh a trivial macro replacement type of
00:51:54.400
language but yet Lambda is buried so deep into Ruby we use it all the time
00:52:00.760
without even realizing it it's one of those things that have a great and profound impact that's one of the
00:52:06.599
things I really enjoyed about researching this particular topic and I hope you guys enjoyed it as well if you liked
00:52:13.720
this I'm going to recommend a talk called Programming with Nothing by Tom Stuart Tom uses nothing but lambdas
00:52:22.160
the stabby procs and the calling thereof and writes the FizzBuzz application he uses
00:52:28.240
no integers no strings no logic nothing except calling Lambda functions
00:52:34.599
it's a true example of Lambda calculus done in Ruby and that's a
00:52:40.040
fun one just Google programming with nothing and you'll turn up that video I recommend that one so thank you very
00:52:45.400
much you guys have a good day