How Complex is "Personal Computing"? (2009)
From Viewpoints Intelligent Archive
So I'll just put this up for you guys... whoops, I should get your signature back; I was just going to write the name. I'll put it over here. So, the basics: I apologize in advance for a couple of things. One is the brevity, and the kind of "blivet" quality; a blivet is a Navy term for five pounds of shit in a one-pound bag.
So we have a lot more information that we'd like to talk to you about than we have ever been able to fit into an hour, so we're going to try to give you the gist of some of the things that we're doing. And this website, vpri.org: maybe our talks are kind of a commercial to get you to look at some of the stuff there. So I'm just going to spend a few minutes giving you the context for a project that the NSF reviewers termed "the preposterous proposal" when they turned it down.
Has anybody here ever been turned down by an NSF peer review? Now, it turns out that the program managers at NSF have always had the power to override those reviews, and in this case the program manager did override it, so we got our money anyway. But they only gave us money to cover about half the expenses of this project, so we raised a matching amount of money to do it. It's a project that a number of us have been thinking about for as long as 30 years as something that would be really interesting to do, and it would be very much in the original spirit of what computer science was supposed to be, as articulated in the 60s.
So that kind of starts with the quintessential Maxwell's equations t-shirt. This is one that everybody has seen, and if you're familiar with Maxwell you know that he actually wrote five papers arriving at these conclusions, in five different ways; he did this over a period of years, using different models, and the original form of Maxwell's equations would not fit on a t-shirt. This form was actually arrived at by several people, including the Germans, who were much more interested in it than the British, in no small part because Maxwell's equations violated Newton, and Newton was a god. Maxwell was only their best guy in the 19th century, but Newton was a god. And of course the interesting thing about getting them into this form is that, although Faraday's experiments showed that there should be a symmetric relationship between the electric and magnetic fields, the equations didn't show those relations as symmetric; and this asymmetry, and this way of looking at things, led to Einstein's theory of relativity, and in fact to a much simpler form of writing these guys down.
The reason I'm putting it up here is that I had a similar experience when I was in graduate school trying to understand programming languages. I looked at the bottom of page 13 of the Lisp 1.5 manual, and there was McCarthy's formulation of Lisp using itself as a metalanguage. How many people have had the experience of going through this? It takes a couple of hours on a Sunday afternoon, and you run out of fingers, because you have your fingers down in all the places where it recurses and re-enters and so forth. I felt I understood it when I found that in this code there is one bug. The thing I got out of it is: boy, after those two hours I understood what programming languages were about a factor of a hundred better than I did before I went in, because McCarthy had carved away all the bullshit and got at something that was more powerful than any other programming language of the day, and more powerful expressively than most programming languages today, and he had done it in this interesting form. In fact it had a lot of effect on subsequent things that were done, and because I had a math degree I just loved this stuff.
So we tried these ideas a number of times on various of the stuff we did at Xerox PARC: gee, you can do a programming language in under a thousand lines of code, or you might be able to do it in a page, and there are advantages to the page, for sure. So the thing we were speculating about was: it would be really interesting if you could apply this way of looking at processes to an entire system that's well understood, like personal computing.
So we have this interesting idea: if science is the study and modeling of phenomena in order to understand them better, and we actually build artifacts that exhibit interesting phenomena, then what John McCarthy did was to take the phenomena exhibited by the programming languages of the day and find an abstraction; that abstraction happened to be more powerful than the ones he was looking at, which is kind of an old mathematician's trick: solve the more general problem if you can. And so the idea here was that it would be interesting to take this body of behavior, from the end-user down to the metal of the machine, and try to understand it by making a model that you could actually run, one that would generate the same phenomena.
So how many t-shirts would it take to do all of personal computing? That's the question here, and you can see why they called it preposterous: Microsoft's version of this is a few hundred million lines of code for their operating system, and similarly Microsoft Office is about 220 million lines of code. But think about it with the view of: well, what is it really doing, and how many of the things in it that they decided to do differently are actually the same thing masquerading under a few simple parameterizations? And also, the first time around, when this stuff was done at Xerox PARC, it was only about ten thousand lines of code, and there's not a lot more functionality now than there was back then. So the idea is that this could be a lot smaller, and it would be very interesting to see, for the same reasons that these two guys are interesting. So this is what our project is.
And there's one more "T" involved here, which is this: one of the things that's nice about making things small and simple is that you get something like a Model T Ford, which any 12-year-old could take apart in the barn over the weekend and put back together. It only had about 350 major parts in it, and about a thousand screws and nuts and bolts and stuff. Anybody who grew up on a farm like I did, where there were still Model T trucks in the 40s, knows they would be routinely taken apart; they were still around because you couldn't really kill them, since everything that broke on them was fixable in the blacksmithing shop that every farm had back then. There was nothing to them: they didn't even have a distributor, they didn't even have gears; they had belts and pulleys in them. But they were real automobiles, and they had this interesting property: even though there were automobiles before them, they caused the automobile revolution in America, because they were so easy to manufacture and they lasted well and so forth. So, just one Model T story.
The spark gap on a Model T was optimized to be the width of a dime, and somebody asked Henry Ford once why he did that. He said, well, people always lose gauges, but they usually have a dime on them. So he fully expected that anybody who had one of these things would be able to just fix it and tweak it up, and that's what happened. So it's a nice metaphor: my theory is that before you write the Ferrari code you should write the Model T code, because there's a 12-year-old kid who just isn't going to be able to understand the Ferrari code, and you have a moral obligation to write that Model T code that is understandable.
So the standard model for personal computing is pretty much from the Xerox PARC experience: the bitmapped screen that can display anything, and a way of organizing views on the display. People have been confused by the difference between desktop publishing and a windowed display (Microsoft, for instance, has been); they're actually the same thing. It's just that in one case the views have boundaries around them and in publishing the views don't have boundaries, but aside from that everything else is exactly the same, and it was done with the same code. So this is a modern version of it. I'm disappointed in this display, because Dan Amelang has done a lot of work to do really fantastic graphics, some of which are shown here, and you can't see how crisp and beautiful it is; but this is kind of a modern version of it, done with some of the stuff that we've done. Just to show you a little slant of it here:
I can edit this as I'm going, but I can do this completely differently than the way you might think. For instance, the code behind these paragraphs, the justification and the editor, comes to about 23 lines of code in one of the languages we developed to do this. I'm not going to explain exactly how we did it, but I'll give you a little flavor of what's actually going on here. You can think of this as swarm programming: what we have is thousands and thousands of parallel processes running at the same time, and these are all objects. The slowest, simplest way you can think of doing the problem here is: just follow the guy in front of you; if you don't have anybody in front of you, you know where to go; if you're over the margin, tell the guy in front of you; and if somebody tells you they're over the margin and there's a blank in front of you, then go to the next line. All right, so that's four things that do a text justification. There's no loop or anything; they're just all running, and they're related to each other. So we call this "particles and fields" style programming, and there's a lot of it in this system. And you'll notice that I've embedded a view of the entire document recursively into itself, and that was operating at the same time. So this is basically that kind of stuff, in which each part of the thing has been mathematized and done in a separate programming language; and of course we have to make the programming languages to do it, and we have to make the tools to make the programming languages to do it.
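To give a rough flavor in ordinary JavaScript, a minimal sketch of that "follow the guy in front of you" rule might look like the following; the glyph objects, the field names, and the settle loop are illustrative assumptions, not the actual 23-line program.

    // Each glyph only knows the glyph in front of it (its predecessor).
    // Repeatedly applying the local rules reproduces simple line wrapping.
    function layout(text, marginWidth, charWidth, lineHeight) {
      const glyphs = text.split('').map(ch => ({ ch, prev: null, x: 0, y: 0 }));
      glyphs.forEach((g, i) => { if (i > 0) g.prev = glyphs[i - 1]; });

      // Keep running the little rules until nobody moves; a stand-in for
      // all the independent processes running at once.
      let moved = true;
      while (moved) {
        moved = false;
        for (const g of glyphs) {
          let nx, ny;
          if (!g.prev) {               // rule 1: nobody in front of you, you know where to go
            nx = 0; ny = 0;
          } else {                     // rule 2: otherwise follow the guy in front of you
            nx = g.prev.x + charWidth;
            ny = g.prev.y;
          }
          if (nx + charWidth > marginWidth) {   // rules 3/4: over the margin, start the next line
            nx = 0;                             // (crudely breaking at any character, not at blanks)
            ny += lineHeight;
          }
          if (nx !== g.x || ny !== g.y) { g.x = nx; g.y = ny; moved = true; }
        }
      }
      return glyphs;
    }

    // e.g. layout("the quick brown fox", 60, 8, 12) wraps onto three lines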
What we don't try to do: these are the three main operating systems, and I won't tell you which one the lemon is, but they're all big, and we want to avoid apples-and-oranges comparisons. There's nothing we can do that can be honestly compared to what Microsoft or Apple do, because they have legacy problems that we don't have; there are a million things, so we don't even worry about that. But here's a t-shirt with a mustard seed on it, and that gives you kind of what we're after: more than a factor of a thousand change in the size of the code that it takes to make this kind of phenomena, something more like a factor of 10,000.
So if you have the end-user at one end and the power supply at the other, you have a lot of different kinds of phenomena in between, and the question is how many t-shirts you actually have to do. If you think of the two-and-a-half-D anti-aliased graphics, you'd love to have that in one or two t-shirts; that's regularly a couple hundred thousand lines of code. And if you think of the number of different languages you might want to invent to mathematize the different parts of this, it's probably five or ten different languages you have to do; so you have to have a metalanguage, the metalanguage has to deal with itself, and so forth, and you'd like to improve the semantics of programming along the way. We have a dozen or so powerful principles; here are ten of them. There's nothing more boring than somebody's enumerated list on a slide, so I'll just point out that most of these powerful principles have been around since I started in computing in the early 60s, and most of them are not in general use today, or not in use the way they were originally used.
Some of these are things that have been resurrected. The particles-and-fields idea is something that hasn't been used very much; but the notion of simulating time, not letting the CPU determine what time is all about, goes back to both Simula and John McCarthy's situation calculus. These are old ideas, but they have the great advantage that you don't get deadlocks, monitors, and semaphores: the basic idea is that if you simulate the time that you're running in, you can't have race conditions that you haven't explicitly allowed, and so on.
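One tiny way to picture what "simulating time" can buy you, sketched in plain JavaScript; the two-phase tick below (everyone reads the old snapshot, then everyone commits) is my own illustration of the idea, not the project's actual mechanism.

    // Pseudo-time: objects never see each other's half-finished updates.
    // On each tick every object computes its next state from the current
    // snapshot; only after everyone has computed do the new states commit.
    function tick(objects) {
      const next = objects.map(o => o.step(objects));        // phase 1: read old state only
      objects.forEach((o, i) => Object.assign(o, next[i]));  // phase 2: commit together
    }

    // Example: two counters that each copy the other's value. With shared
    // mutable state this would depend on update order; here it doesn't.
    const a = { value: 1, step: all => ({ value: all[1].value }) };
    const b = { value: 2, step: all => ({ value: all[0].value }) };

    tick([a, b]);
    console.log(a.value, b.value);  // 2 1: both read the pre-tick snapshot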
So the underlying idea here is that math wins; not standard math, but math that you actually invent in order to characterize all the different behaviors that you have.
And with that I'm going to turn it over to Dan Amelang, who will tell you a little bit about how we do the graphics, because this is kind of a classic case study of what we're after. Dan is a graduate student at UC San Diego; he just got his master's for part of this and is on his way to getting his PhD for the rest of it. I don't think anything I said needs questions, except maybe at the end, but what would be great is this: I brought along three of the people who did a lot of the work on this system, because by the time you get to my age you're just a figurehead. It's some thrill to actually bring the people who are doing the stuff and have them tell you a little bit about how they're approaching these problems, and I hope you do ask questions.
Hello. So, like Alan said, my name is Dan Amelang; I'm a grad student like you guys, so go easy on me. I'm getting towards the end of my time as a PhD student, so I'm going to present basically where I am on my dissertation work. I still have about a year, or a little more than that, to go, although I've noticed that that kind of just extends forward; it's a tradition of PhD students, right? So I'm a UCSD PhD student, but I'm also an employee of Viewpoints Research Institute, and my little niche in this project is the 2D rendering system, because we'd like to build basically a whole software stack, and we're not allowed to just delegate to existing code; the point is to reinvent programming at every level. ... You can't just keep having ideas and just, you know, experiment ... We have several OMeta implementations; this particular one uses JavaScript as the target language, as the assembly language underneath.
And so in this web browser I can just type in a bunch of code, really any JavaScript expression like that, and evaluate it on the fly. But that's not very interesting, right? Because, as a language guy, I want control over the language that I'm using, and so the stuff you tell it to evaluate is not necessarily the stuff that gets evaluated. There's a level of indirection here: there's a function called translateCode that gets called with the string that you want to evaluate, and it's supposed to return the thing that you pass to the eval function in JavaScript. It doesn't matter what it says right now, but the cool thing about this is that it lets you replace the programming language that's being used in this interface very easily. And of course it's very self-serving, because OMeta is sort of the ideal language as far as I'm concerned; I'm very modest.
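Roughly, the hook works like the sketch below, in plain JavaScript; the names translateCode and evaluateSelection, and the default behavior, are my assumptions about the demo's workspace, not its actual code.

    // The workspace never evals what you typed directly; everything goes
    // through translateCode first, so redefining that one function swaps
    // out the language the workspace "speaks".
    let translateCode = function (source) {
      return source;                 // default: the input is already JavaScript
    };

    function evaluateSelection(source) {
      return eval(translateCode(source));
    }

    // Later, a loaded grammar can redirect everything through a translator,
    // e.g. a hypothetical Logo-to-JavaScript compiler:
    //   translateCode = source => logoToJavaScript(source);
    // After that, whatever you select is treated as Logo.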
Let's see here... I don't know why I can't see... oh, I know what's going on; hang on one sec, technical difficulties. Okay. So instead of going over the language itself, I'm just going to say a few words about it, then show you a bunch of examples, and end up with Dan's Nile translator. The idea behind the language is that pattern matching is good for most of the things you care about in writing a translator. For instance, if all you care about is lexical analysis, you can pattern match on a stream of characters and produce a stream of tokens; similarly, to write a parser, you can pattern match on the stream of tokens to produce parse trees; and then code generation, at least naive code generation, can be done by pattern matching on a parse tree to output some code; and so on. Even transformations on parse trees and optimizations can be done in the same way.
So let's start off with something fairly standard. A little while ago I sat down with a friend of mine and wrote a library in JavaScript, about 70 lines of code, that implements the semantics of Prolog, including unification, backtracking, all that stuff; that's what's being loaded at the top of the page. The thing that you see right here is a grammar, written in OMeta, that teaches the browser how to understand Prolog syntax. It doesn't include things like Prolog's special list syntax, but that doesn't matter, because that's just syntactic sugar; the important stuff is there, and it's fairly easy to grasp the gist of it. So I'm going to load this stuff... okay.
Now, down here, I can tell my Prolog translator to answer certain Prolog queries, given a few facts and rules, just like you would in a real Prolog interpreter. In this one I'm asking for all the natural numbers, given that zero is a natural number and that the successor of X is a natural number if X is a natural number. So if I select this and tell it to execute, it says that zero is a natural number (it's kind of small), one is a natural number, two is a natural number, and so on, forever, until I tell it I'm done.
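To give a feel for what that query is doing, here is the same enumeration sketched with JavaScript generators rather than with the actual 70-line Prolog library (whose internals aren't shown here); the clause encoding and the term strings are illustrative.

    // nat(z).
    // nat(s(X)) :- nat(X).
    // A Prolog query enumerates solutions lazily: ask for the next answer
    // and get z, then s(z), then s(s(z)), ... until you stop asking.
    function* nat() {
      yield 'z';                       // first clause: zero is a natural number
      for (const x of nat()) {         // second clause: s(X) is, whenever X is
        yield `s(${x})`;
      }
    }

    const answers = nat();
    console.log(answers.next().value); // z
    console.log(answers.next().value); // s(z)
    console.log(answers.next().value); // s(s(z))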
Then this next example should be very familiar: it's the usual thing that everyone does in Prolog in their first couple of sessions, right? It's an all-male, parent-and-child world. I'm asking for all the grandfather-and-grandchild relationships, given that Abe is Homer's father, Homer is Lisa's father, Homer is Bart's father, and Homer is Maggie's father, and then there's the rule for grandfather in there, which, yeah, doesn't include women; I guess I'm not PC. So when I run that, you get the right answers: it enumerates all three relationships that you care about.
Quickly, before I move on to the meaty stuff... where is it... by the way, this is also a wiki, so anyone who's interested in this stuff can just open up this webpage in their browser, without downloading anything, and see these projects, try them out, make their own projects, save them, and so on.
Okay, on this page what I have is roughly a page of code. This is Logo syntax: on the left-hand side you see the BNF-style rules that describe the Logo syntax, using left recursion and so on (it's fairly readable), and on the right-hand side there's some JavaScript code that's being output by the semantic actions. And then this is the neat part: after I've defined the language, and there's a little bit of runtime support above the grammar, I can redefine translateCode to be a function that takes the code you tell it to evaluate and passes it to Smiley, my turtle, to evaluate. It doesn't return any useful value; by the way, I'm returning the string "undefined", because when JavaScript tries to evaluate the string "undefined" it's not going to do much. So if I select all this stuff and tell it to evaluate... quickly, my code has dynamically added a canvas element to this page, and down here it has also switched the language that's understood by the workspace in this browser. So I can teach it to do a spiral, which is code that I actually took from the Wikipedia page on Logo, and then I tell it to draw a spiral... I called it; that was too quick... but I can draw it again. Hopefully (I don't think I'm cheating) it actually draws in real time, and it looks really cute.
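For a sense of what the turtle ends up doing, here is a small stand-alone sketch in plain JavaScript and canvas; the turtle API and the spiral routine are my own illustration, loosely following the classic Logo spiral, not the code generated in the demo.

    // A tiny turtle that draws on an HTML canvas, plus a spiral routine:
    // forward a little, turn, repeat with a slightly longer step.
    const canvas = document.createElement('canvas');
    canvas.width = canvas.height = 400;
    document.body.appendChild(canvas);
    const ctx = canvas.getContext('2d');

    const turtle = {
      x: 200, y: 200, heading: 0,                 // start in the middle, facing right
      forward(dist) {
        const nx = this.x + dist * Math.cos(this.heading);
        const ny = this.y + dist * Math.sin(this.heading);
        ctx.beginPath();
        ctx.moveTo(this.x, this.y);
        ctx.lineTo(nx, ny);
        ctx.stroke();
        this.x = nx; this.y = ny;
      },
      right(degrees) { this.heading += degrees * Math.PI / 180; },
    };

    // Logo-ish shape: to spiral :size  forward :size  right 15  spiral :size * 1.02  end
    function spiral(size, steps = 200) {
      if (steps === 0) return;
      turtle.forward(size);
      turtle.right(15);
      spiral(size * 1.02, steps - 1);
    }

    spiral(2);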
So anyway, now I'm going to show you some more interesting examples. The first thing I want to show you is a little JavaScript compiler; really, "compiler" is a bit of a misnomer for what this is in this case. By the way, I have implemented a very nearly complete subset of JavaScript, maybe 90 or 95% of the language, using OMeta, and I have a translator that takes the JavaScript program and outputs Smalltalk, and it works on top of Squeak. That's pretty neat to play with, but I don't have time to show it today, and I have a couple of other backends for it as well. In this case, what I have is a JavaScript parser written in OMeta; it's about 120 or 140 lines of code, I don't remember exactly. And then after that I have a translator that pattern matches on the parse trees I generate and outputs JavaScript code again. So it's sort of the identity transformation on JavaScript, which on its own is not very interesting; but the neat thing about it is that, because OMeta is object-oriented, you can inherit from these grammars and override certain rules. For instance, if you want to add a new kind of statement, you override the statement rule and add the new construct. I'm going to skip over that example, but you guys get the idea so far, right? Okay.
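A sketch of that "grammar as a class you can subclass" idea, using a hand-written recursive-descent parser in plain JavaScript instead of OMeta; the class names, the tiny statement language, and the added "unless" construct are all invented for illustration.

    // A grammar-as-class: the base parser knows one kind of statement,
    // and a subclass overrides the statement rule to add another.
    class BaseParser {
      constructor(tokens) { this.tokens = tokens; this.pos = 0; }
      peek() { return this.tokens[this.pos]; }
      next() { return this.tokens[this.pos++]; }

      // statement ::= "print" IDENT
      statement() {
        if (this.peek() === 'print') {
          this.next();
          return { type: 'print', name: this.next() };
        }
        throw new Error('expected a statement');
      }
    }

    class ExtendedParser extends BaseParser {
      // statement ::= "unless" IDENT statement  |  <base statement>
      statement() {
        if (this.peek() === 'unless') {
          this.next();
          const cond = this.next();
          return { type: 'unless', cond, body: this.statement() };
        }
        return super.statement();       // fall back to the inherited rule
      }
    }

    // new BaseParser(['print', 'x']).statement()
    //   -> { type: 'print', name: 'x' }
    // new ExtendedParser(['unless', 'done', 'print', 'x']).statement()
    //   -> { type: 'unless', cond: 'done', body: { type: 'print', name: 'x' } }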
And of course OMeta is implemented in itself, in a very similar style to that JavaScript translator I just showed you. This is the parser for OMeta, and underneath is the code generator, which just translates the parse trees to JavaScript; and I have a little library that I wrote, maybe something like 50 lines of code, that supports the semantics of OMeta in JavaScript. That's the way I bootstrap OMeta: you implement a very minimal set of semantics as a library in the host language, and then I do the parsing and code generation in OMeta itself. But one interesting thing here is that you don't have to do this very direct two-step process; you can do more interesting things with it. For instance, the OMeta translator just takes a parse tree and outputs JavaScript, but my parser is not extremely smart. If you have nested "or" expressions while you're doing your pattern matching, as you often do, say I use parentheses and write "x or y or z", that's going to generate two "or" operations; and the "or" operation in OMeta has to remember the position you're at so that it can backtrack properly, and so on, and that's sort of wasteful. So one thing I did that was very straightforward was to implement an optimization framework for OMeta parse trees, in OMeta itself, after I had the stuff working.
What you see on top here is a grammar that pattern matches on a parse tree for OMeta, and by looking at it you can see how little you have to implement in order to have a language like OMeta. It's very similar to Bryan Ford's PEGs: you only have "or", Kleene star, Kleene plus, this assignment operation that's used for binding expressions to names, a "not" operation, lookahead, and so on; it's fairly straightforward. Anyway, this is kind of like a visitor for OMeta parse trees, and on its own it's not very useful. But then, down here, what I'm doing is inheriting from this visitor and overriding the "and" and "or" rules, so that when I have nesting (if I have an "and" that has an "and" as an argument) I can pull the parts of the inner "and" out into the outer "and". The semantics is going to be the same, except it's going to be more efficient, because I only have to perform that one operation. When I implemented this, the OMeta/JS implementation became, I think, 30 percent faster, which was pretty neat for very minimal effort.
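A small sketch of that flattening transformation in plain JavaScript; the array-based node shape (['and', ...children]) mirrors one common way to represent such trees, but it's an assumption, not OMeta's actual format.

    // Flatten nested 'and'/'or' nodes: ['or', x, ['or', y, z]] becomes
    // ['or', x, y, z], which means fewer choice points to manage at runtime.
    function flatten(node) {
      if (!Array.isArray(node)) return node;           // leaves pass through
      const [op, ...children] = node;
      const flatChildren = children.map(flatten);
      if (op !== 'and' && op !== 'or') return [op, ...flatChildren];
      const merged = [];
      for (const child of flatChildren) {
        if (Array.isArray(child) && child[0] === op) {
          merged.push(...child.slice(1));              // splice the nested node's parts in
        } else {
          merged.push(child);
        }
      }
      return [op, ...merged];
    }

    // flatten(['or', 'x', ['or', 'y', 'z']])  ->  ['or', 'x', 'y', 'z']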
Anyway... oh yeah, one more thing I want to show you here before I show the Dan stuff. If you end up taking the time to look at this stuff online, I encourage you to look at this project: it's called MetaToo, which is a pun on META II (the one with the Roman numeral II that I told you about before). What this is, is a language that's very close to OMeta, implemented entirely in this little project. First, this JavaScript object that you see up here is the library-like object that implements all of the semantics you need for doing the parsing and pattern matching; then down here what I have is the parser for this language, written in OMeta; and then I have a couple of examples with arithmetic expressions and so on. But of course that's not sufficient to do bootstrapping: what you really need to do, once you have the OMeta-like language working, is implement that guy in itself, which is what I do down here. You can see it's a little bit uglier than OMeta, particularly because the semantic actions have to be enclosed in these funny characters and so on; but I did that so that I wouldn't have to implement a JavaScript parser as part of this thing: it can blindly consume the JavaScript and output it in the translated code. So anyway, that's the implementation, and you can compile the compiler with the original compiler that I wrote in OMeta, and it works. And then the thing that really tests it, later on, is this: if you take the version of the compiler that was generated from the code up here by the OMeta compiler, compile it with itself, replace the thing you're using with that result, and it still works down here, then you know you're good. And that kind of works; it's a neat little example of bootstrapping, if you want to take a look at it later on.
So I think... oh yeah, one last thing I want to show here. Take a quick look at this: JavaScript doesn't have multi-line string literals, right? But if you look, I'm actually using one right here; it looks very Python-esque. The reason I can do that is this translateCode thing: my JavaScript doesn't have to be JavaScript. I have that intermediate thing, the JavaScript parser and translator that I showed you before, and I can easily inherit from it and add new kinds of things, so that was straightforward. The other neat thing about having these languages implemented in OMeta is that there's a feature called foreign rule invocation, which allows a grammar to lend its input stream to another grammar temporarily; it's an alternative to inheritance. You may have noticed that the language I'm using in the workspace is not just JavaScript and it's not just OMeta; it's kind of a mash-up of the two languages. If you wanted to implement that with single inheritance, you would have to inherit from one of these translators and then reimplement the functionality of the other translator, which would be very annoying to have to do; and multiple inheritance is clearly a bad idea (I'm not going to go into that, because of all the complications and so on). With foreign rule invocation, I can have one grammar that doesn't inherit from either, if I don't want it to, and just says: try parsing my input as an OMeta grammar, or try parsing my input as a JavaScript statement or expression or whatever.
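A rough sketch of that foreign-rule idea in plain JavaScript: two independent parsers sharing one input position, so a combining grammar can lend its stream to either. The parser objects and the tiny token languages here are invented for illustration; OMeta's real mechanism is more general.

    // Two unrelated parsers over the same token stream. The combining
    // grammar doesn't inherit from either; it just hands its current
    // position to one of them and accepts whatever progress it made.
    class Stream {
      constructor(tokens) { this.tokens = tokens; this.pos = 0; }
    }

    const grammarRule = {
      // recognizes: "rule" NAME   (no error checking; this is a sketch)
      match(s) {
        if (s.tokens[s.pos] === 'rule') { s.pos += 2; return { kind: 'grammar-rule' }; }
        return null;
      },
    };

    const jsStatement = {
      // recognizes: "var" NAME
      match(s) {
        if (s.tokens[s.pos] === 'var') { s.pos += 2; return { kind: 'js-statement' }; }
        return null;
      },
    };

    // The mash-up language: try the foreign grammar rule first, then the
    // foreign JavaScript rule, on the *same* shared stream.
    function mashupStatement(s) {
      return grammarRule.match(s) || jsStatement.match(s);
    }

    const s = new Stream(['var', 'x', 'rule', 'digit']);
    console.log(mashupStatement(s));  // { kind: 'js-statement' }, position now 2
    console.log(mashupStatement(s));  // { kind: 'grammar-rule' }, position now 4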
Sorry, the punch line? Yes, okay. So what you're looking at here is part of my translator for the Nile language that Dan was talking about. The stuff on top is a grammar that, on its own, implements the offside rule (indentation as block structure, as you see in Python and Haskell and so on). A neat thing about that is that it's modularized away from whatever language you're implementing, so all the machinery, like keeping track of a stack of indentation levels and so on, can be done in that one grammar.
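A bare-bones version of what such an offside-rule module has to track, in plain JavaScript: a stack of indentation levels that turns leading whitespace into INDENT/DEDENT events. This is the textbook technique, not the grammar from the demo, and it skips error handling for inconsistent indentation.

    // Turn leading spaces into INDENT/DEDENT tokens using a stack of
    // indentation levels, so a language grammar can treat blocks like
    // bracketed structure.
    function offsideTokens(lines) {
      const stack = [0];
      const out = [];
      for (const line of lines) {
        if (line.trim() === '') continue;                // blank lines don't count
        const indent = line.length - line.trimStart().length;
        if (indent > stack[stack.length - 1]) {
          stack.push(indent);
          out.push('INDENT');
        }
        while (indent < stack[stack.length - 1]) {
          stack.pop();
          out.push('DEDENT');
        }
        out.push({ line: line.trim() });
      }
      while (stack.length > 1) { stack.pop(); out.push('DEDENT'); }
      return out;
    }

    // offsideTokens(['f:', '  g:', '    x', '  y'])
    //   -> [{f:}, INDENT, {g:}, INDENT, {x}, DEDENT, {y}, DEDENT]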
And my Nile parser, which is down here, also written in OMeta, just inherits from that guy and can just say, you know, "I want to indent"... I don't know... that's an identifier... anyway, there's a bunch of indentation code... ah, here we go: so I want to tab over here, and then there are these funny Unicode characters that Dan is very fond of, and so on. So this is the parser part of the implementation of Dan's thing, and there's also a type checker and a code generator, which are written by pattern matching on the data structures I'm creating on the right-hand side. One of the cool things about Dan's language is that, in addition to types being used to make sure you don't get things wrong, the types matter a lot for efficient code generation for Nile; and it turned out to be relatively straightforward to keep track of types in these grammars, since a grammar in OMeta is pretty much like a class in an object-oriented language: you can have state associated with running instances of a grammar, so you can keep the symbol table in there and so on, and do proper type checking. So I think I'm going to stop here (sorry if I ran a little long), and Hesam is going to go next.
And just to clarify: PEGs on their own don't support left recursion, but there's a cool trick you can do with the memo table that's used by packrat parsers; I wrote a paper about that.
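Very roughly, the trick is to seed the memo table with a failure for the left-recursive rule, evaluate the rule body, and then keep re-evaluating as long as each pass consumes more input ("growing the seed"). The sketch below, in plain JavaScript, handles only direct left recursion for a toy grammar (expr -> expr "-" num | num); the algorithm in the paper is considerably more involved.

    // expr -> expr "-" num | num, evaluated packrat-style with a memo table.
    // Left recursion is handled by planting a failing "seed" result and
    // growing it until a pass stops making progress.
    function parse(input) {
      const memo = new Map();                       // key: rule@pos -> { node, next }

      function num(pos) {
        return /[0-9]/.test(input[pos]) ? { node: input[pos], next: pos + 1 } : null;
      }

      function exprBody(pos) {
        // alternative 1: expr "-" num
        const left = expr(pos);
        if (left && input[left.next] === '-') {
          const right = num(left.next + 1);
          if (right) return { node: ['-', left.node, right.node], next: right.next };
        }
        return num(pos);                            // alternative 2: num
      }

      function expr(pos) {
        const key = 'expr@' + pos;
        if (memo.has(key)) return memo.get(key);
        memo.set(key, null);                        // seed: pretend the rule failed
        let best = null;
        while (true) {                              // grow the seed
          const result = exprBody(pos);
          if (!result || (best && result.next <= best.next)) break;
          best = result;
          memo.set(key, best);
        }
        return best;
      }

      return expr(0);
    }

    // parse("7-2-1").node  ->  ['-', ['-', '7', '2'], '1']   (left-associative)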
"You have all these..." Right; so right now I don't have a great answer to that: I try it out and hopefully it works. But we're hoping that at some point soon we're going to be able to use the programs that Dan has; I kind of think of them as the meaning of the thing that I care about, and it would be really great to have an automatic way to check the meaning against it. True, okay; we can discuss it offline. You don't need this... please feel free to leave if you need to; it'll be another ten minutes or so.

I'm Hesam; I work at Viewpoints, and I'm also a student at UCLA. I actually called my father, but he said you should be out partying.
So the part of all this that I'm interested in: I'm sure you have seen imperative as well as declarative methodologies for programming. Imperative is how implementations are done; declarative, as we've seen with the Prolog language, gives you meanings that are compact and very understandable, but we need search in order to execute them, so they're often not practical. Our interest here, to save code as well as to add understandability to programs, is: is there a way to combine both at the same time? That is not very common today. This is a famous quote that I made recently: the natural combination of declarative and imperative is missing. There are a couple of methodologies that we have studied in this direction.
One of them I'm just going to go over quickly, but basically: if you add a layer of logic on top of your languages to support search, basically heuristic search, you can do imperative programming with a little bit of overhead, if you're always willing to add optimizations. In the presence of multiple methods, if there's an optimization that always tells you, based on the state, which method is the good one to use, then you always do that check to choose among the different alternatives, and you only pay the overhead of the checking on top of your normal imperative programming. But when you do have the leverage to do a kind of search, you have the option to say: well, explore the possibilities and find the best scenario. That's what this methodology is all about: it allows you to separate the optimizations and heuristics from the actual implementations, the actions that you see here that satisfy the goals.
The implementations of these programs are really small; they're really cute, and slow, but they have this nice property that you can keep adding to them. For example, this chess program was about three hundred lines of code, and that even included some ASCII-art graphics; you can add any number of heuristics that say, based on the states you explore, which one is the better move to make. These heuristics tell you, for instance, that it's better to take a move where you actually capture the opponent's piece, and to try not to get captured; we're adding some numeric values, so you can basically reorder the alternatives.
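A toy sketch of that separation in plain JavaScript: a goal with several candidate actions, and separately defined heuristics that score the candidates, so choosing an action is just picking the best-ranked one. The move representation and the scoring numbers are invented; the real goal/method machinery is richer than this.

    // Candidate actions for one goal, plus separately defined heuristics
    // that rank them. The imperative core just executes the top-ranked
    // candidate; adding a heuristic never touches the actions themselves.
    const heuristics = [
      move => (move.captures ? 10 : 0),        // prefer capturing moves
      move => (move.exposesKing ? -20 : 0),    // avoid moves that get us captured
    ];

    function score(move) {
      return heuristics.reduce((sum, h) => sum + h(move), 0);
    }

    function chooseMove(candidateMoves) {
      return candidateMoves
        .slice()
        .sort((a, b) => score(b) - score(a))[0];   // best-ranked candidate first
    }

    const candidates = [
      { name: 'Nf3',   captures: false, exposesKing: false },
      { name: 'Qxd5',  captures: true,  exposesKing: false },
      { name: 'Qxh7?', captures: true,  exposesKing: true  },
    ];

    console.log(chooseMove(candidates).name);  // "Qxd5"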
Another, more serious one was register allocation in compilers: you have registers and you have variables in the program, and the compiler needs to decide where those variables are going to live in the registers. This happens to be an NP-complete problem, a hard problem, if you want to do it optimally, and this is a nice way to do it, because you just express a high-level goal. This is using the methodology in a Smalltalk program: you express the goal that all variables want to be assigned; then you tell it what actions can be taken while doing this task, which is usually assigning a particular register to a variable, or deciding to spill or split; and then you define the optimizations on top of that, which say: based on whatever state you are in, if you have room for the next variable then take that method, otherwise take the spill method. So, with just a layer on top of your normal logic, you can express this in a more declarative way.
But the problem here, as you know, is that if you're implementing your own search, it's intractable. So the more recent studies that I'm going to show now are based on Boolean satisfiability: SAT solvers and constraint-solving technologies, which have recently been shown to scale much better. Our more recent stuff is not based on search that we come up with ourselves but on plugging in external tools that are really good at solving problems using SAT-solving technology. As we will see, those have their own problems: they're not very friendly to adding user-specific heuristics, so eventually we want some combination of the two approaches, heuristic search and SAT-solving technology, which we think might answer our problem.

So this project is about adding syntactic sugar to programming languages. In this case I'm showing you examples from an extension of the Java language where we allow first-order logic expressions like this, or set comprehensions, stuff like this; this is from the Alloy language, actually, which is an object modeling language. We allow these kinds of expressions in the language, and they serve two purposes. One is that we can easily desugar them into the low-level syntax of the program, so these are actually runnable: this is part of the code of a binary tree, and if I instantiate an object and run it, because it gets compiled to low-level code, it's going to run even though it's compact. But saving lines of code isn't the only thing we're interested in; we're also interested in having runnable, executable meanings, meanings that are actually executable. When we say "meanings" we mean high-level expressions like this one, which says: I want a graph that is acyclic, and this is the meaning of acyclic, that there's no node that points back to itself. It would be nice to be able, without any implementation, to execute this meaning: not just check it while running the program, but use constraint-solving technology to actually find a satisfying state for that particular constraint. That's what I mean by executable: we can use an external tool to get to a state that we're happy with, without actually writing any implementation. Now, we're going to use that for two things.
One is software reliability. I'm showing you an implementation of bubble sort in this extension; it's a pretty routine bubble sort. But you see that this is in the spirit of the program annotations you've seen in the Java Modeling Language and others, where you just annotate the program with meanings, which say: this particular method accomplishes this expression. It's followed by that "ensures" clause, which says that the job of the sort method is to find some value that is a permutation of the old elements and is sorted. This is something we can automatically instrument into the compiled version, so that when methods finish, these constraint expressions get evaluated. If they happen to be false, that means the method was faulty, the method was buggy; or if the method even happens to hit some kind of runtime failure, like a division by zero that's not easily found, we can catch that exception and use these specifications as a kind of fallback, as an alternative: the method didn't work, but we have an alternative way to try to accomplish what we want. In this case we translate the expression into the logic syntax, where we can call an external tool to, basically, sort this list.
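A stripped-down illustration of that check-then-fall-back pattern in plain JavaScript: run the possibly buggy imperative sort, check the "ensures" condition (sorted, and a permutation of the input), and if it fails or throws, fall back to something that satisfies the specification. Here the fallback is just a trusted library sort standing in for the constraint solver that the real system calls.

    // ensures: the result is sorted AND is a permutation of the input.
    function sortEnsures(input, result) {
      const isSorted = result.every((v, i) => i === 0 || result[i - 1] <= v);
      const isPermutation =
        [...input].sort((a, b) => a - b).join(',') ===
        [...result].sort((a, b) => a - b).join(',');
      return isSorted && isPermutation;
    }

    // A deliberately broken "implementation" standing in for faulty code.
    function buggySort(xs) {
      return xs.slice(0, -1);          // oops: drops the last element
    }

    // Run the implementation; if the ensures clause fails or it throws,
    // fall back to something that satisfies the specification (a stand-in
    // here for handing the specification to a constraint solver).
    function reliableSort(xs) {
      try {
        const result = buggySort(xs);
        if (sortEnsures(xs, result)) return result;
      } catch (e) { /* runtime failure: fall through to the fallback */ }
      return [...xs].sort((a, b) => a - b);
    }

    console.log(reliableSort([3, 1, 2]));  // [1, 2, 3], produced by the fallback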
A bigger live example is this implementation of a red-black tree, which also has annotations for the insert and delete methods, basically saying that on an insert you have one more node added to the set of nodes, and on a delete you have one less. Assuming you have some implementation that's not reliable, you can just use the annotations to accomplish what the methods are supposed to do. I found that implementations are typically about five or six hundred lines of code, or three hundred lines of Java, because of all the corner cases that these balanced trees need to satisfy. You can think of this as a kind of minimalistic runnable implementation, if you will; it's like an interface for whoever wants to implement it, but the neat thing it has that interfaces don't have is that it's runnable. You don't have to actually write what these two methods are going to be yet, but because you have annotated them, we can test them and run them, because we have that underlying reasoning system; so even without the implementation you can test your program against this interface. Let me move on to another one.
The last thing I'm showing is that you can always intentionally use this fallback system: use the fallback when you want to do something declaratively, when you don't even want to implement something, but you can use the constraint-solving technology that's embedded in your language to accomplish something that would otherwise be cumbersome. In this case, for example, this is a GUI example, a very simple one, where a user just puts dots on the screen. You see these four dots are put on the screen without a problem, so the job of this particular method, the add method's body, is very trivial: it just draws the dot. And this box shows that there's a restriction on this application which says a user should not put two dots too close to each other; there's a minimum distance they have to keep. You can imagine the normal invocation of this method is very trivial: it just draws, and as long as those specifications are not violated, it's okay. But the corner case is when somebody actually puts a red dot here that violates that property. You don't want to implement that; that's a hard thing to do, since potentially a lot of dots need to be moved around, right? So in that case you just don't handle it in your program, and it can automatically fall back to constraint solving, to SAT solving, to basically shift some dots around to satisfy the properties, and that's what's happening here. And you see that this box here is saying there's a restriction on how much those dots are allowed to be shifted by the method; so you see these two are shifted a little bit, and the separation is satisfied. So basically I'm showing you the code that does this; this is the entire code. When you're adding a dot, you see that it has this check it needs to do, written here, also in a high-level syntax, the Alloy first-order logic, which says two things: no two dots may be too close to each other, and when you do move them, you're not allowed to move them by more than a certain value. And you see that there's no implementation here; so when I put that red dot down, it automatically figured it out.
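A naive stand-in, in plain JavaScript, for what that fallback has to achieve in the dots example: keep every pair of dots at least minDist apart while nudging existing dots, each nudge bounded by maxShift. A real constraint or SAT solver searches for a satisfying state; the little relaxation loop below is only an illustrative approximation and can give up where a solver would succeed.

    // Try to place a new dot; if the minimum-distance constraint is violated,
    // nudge existing dots away from the new one (each nudge bounded by
    // maxShift) and check the constraint again. This crudely approximates
    // "fall back to the solver"; it may fail where a real solver would not.
    function dist(a, b) { return Math.hypot(a.x - b.x, a.y - b.y); }

    function satisfies(dots, minDist) {
      return dots.every((a, i) => dots.every((b, j) => i >= j || dist(a, b) >= minDist));
    }

    function addDot(dots, newDot, minDist, maxShift) {
      const candidate = dots.map(d => ({ ...d })).concat([{ ...newDot }]);
      for (let iter = 0; iter < 50 && !satisfies(candidate, minDist); iter++) {
        for (let i = 0; i < candidate.length - 1; i++) {      // only pre-existing dots move
          const a = candidate[i], b = candidate[candidate.length - 1];
          const d = dist(a, b);
          if (d < minDist && d > 0) {
            const push = Math.min(maxShift, minDist - d);
            a.x += ((a.x - b.x) / d) * push;                   // move a directly away from b
            a.y += ((a.y - b.y) / d) * push;
          }
        }
      }
      if (!satisfies(candidate, minDist)) throw new Error('could not satisfy constraints');
      return candidate;
    }

    // addDot([{x: 0, y: 0}, {x: 30, y: 0}], {x: 2, y: 0}, 10, 15)
    //   -> the dot at (0,0) is shifted left so all pairs end up >= 10 apart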
Those are the things I just wanted to show you; so, if you have any questions...