
Edward Kmett
Research Engineer
Machine Intelligence Research Institute
United States
Member for 3 years
Edward spent most of his adult life trying to build reusable code in imperative languages before realizing he was building castles in sand. He converted to Haskell in 2006 while searching for better building materials. He now chairs the Haskell core libraries committee, collaborates with hundreds of other developers on over 250 projects on GitHub, works on ways to better scale functional programming, logic programming, and formal methods at the Machine Intelligence Research Institute, and is obsessed with finding better tools so that seven years from now he won’t be stuck solving the same problems with the same tools he was stuck using seven years ago.
Cadenza: Building Fast Functional Languages Fast
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 9 months ago
Sold Out! · 45 Mins
Invited Talk
Intermediate
In this talk Ed will give a live-coding introduction to normalization by evaluation. He will then show how Graal and Truffle, on the JVM, can be (ab)used to JIT functional languages, and discuss why this seems like a promising direction for evaluating dependently typed languages in particular.
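The first half of that introduction condenses well into a generic Haskell sketch: evaluate terms into host-language values, then quote those values back into normal-form terms. This is textbook normalization by evaluation for the untyped lambda calculus, with names of my own choosing; it is not Cadenza's implementation.

```haskell
-- Terms use de Bruijn indices; values live in the host language.
data Term = Var Int | Lam Term | App Term Term deriving (Eq, Show)

data Value = VLam (Value -> Value) | VNeutral Neutral
data Neutral = NVar Int | NApp Neutral Value

-- Evaluate a term in an environment of values.
eval :: [Value] -> Term -> Value
eval env (Var i)   = env !! i
eval env (Lam b)   = VLam (\v -> eval (v : env) b)
eval env (App f a) = apply (eval env f) (eval env a)

-- Beta-reduce using host-language application; stuck terms stay neutral.
apply :: Value -> Value -> Value
apply (VLam f)     v = f v
apply (VNeutral n) v = VNeutral (NApp n v)

-- Read a semantic value back into a normal-form term. The Int counts
-- binders passed so far, used to convert de Bruijn levels to indices.
quote :: Int -> Value -> Term
quote d (VLam f)     = Lam (quote (d + 1) (f (VNeutral (NVar d))))
quote d (VNeutral n) = quoteNeutral d n

quoteNeutral :: Int -> Neutral -> Term
quoteNeutral d (NVar k)   = Var (d - k - 1)
quoteNeutral d (NApp n v) = App (quoteNeutral d n) (quote d v)

normalize :: Term -> Term
normalize = quote 0 . eval []
```

The appeal for JIT compilation is that `eval` reuses the host's closures and application, which is exactly the machinery Graal and Truffle specialise aggressively.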
Q & A Session With Functional Conf Speakers
Naresh Jain, Founder, Xnsio; Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 45 Mins
Keynote
Beginner
During the conference you might have had questions that did not get answered. This is your opportunity to get them answered by our expert panel.
Propagators
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 60 Mins
Keynote
Intermediate
There are a lot of algorithms that revolve around iterating a form of information propagation until it attains a deterministic fixed point. CRDTs, Datalog, SAT solving, functional reactive programming, and constraint programming all fit into this mold.
One framework for these sorts of algorithms is the notion of a “propagator” due to Sussman and Radul, but until now little rigor has been applied to understanding when such algorithms terminate with consistent results.
Another framework is Lindsey Kuper’s work on the notion of “lattice variables” (LVars), which addresses termination, parallelism and eventual consistency well, but not iteration.
By blending these frameworks, I’ll build up a series of sufficient conditions for propagators to terminate with consistent results and proceed to show how we can use this common framework to steal insights and quirks from each individual domain to try to optimize the rest.
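As a concrete illustration of the core loop, here is a deliberately naive, single-threaded sketch of propagation to a fixed point over a join-semilattice. The `Info` lattice and the `run` loop are my own illustration, not code from the talk:

```haskell
import Data.List (foldl')

-- What we know about a value: nothing, an exact answer, or conflicting
-- answers. Ordered Unknown <= Exactly x <= Contradiction.
data Info a = Unknown | Exactly a | Contradiction deriving (Eq, Show)

-- Least upper bound: merging information only moves up the lattice.
join :: Eq a => Info a -> Info a -> Info a
join Unknown y = y
join x Unknown = x
join Contradiction _ = Contradiction
join _ Contradiction = Contradiction
join (Exactly x) (Exactly y)
  | x == y    = Exactly x
  | otherwise = Contradiction

-- A propagator reads all cells and proposes new info for each cell.
type Cells a = [Info a]
type Propagator a = Cells a -> Cells a

-- Merge every propagator's proposals until nothing changes. Because
-- join is monotone and this lattice has finite height, it terminates.
run :: Eq a => [Propagator a] -> Cells a -> Cells a
run ps cells
  | next == cells = cells
  | otherwise     = run ps next
  where
    next = foldl' (\acc p -> zipWith join acc (p acc)) cells ps
```

Running a hypothetical adder propagator over cells `[Exactly 1, Exactly 2, Unknown]` fills the third cell with `Exactly 3`, and two conflicting exact values meeting in one cell yield `Contradiction` rather than a loop.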
Functionally Oblivious and Succinct
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 30 Mins
Talk
Advanced
This talk provides a whirlwind tour of some new types of functional data structures and their applications.
Cache-oblivious algorithms let us perform optimally for all cache levels in your system at the same time by optimizing for one cache for which we don’t know the parameters. While Okasaki’s “Purely Functional Data Structures” taught us how to reason about asymptotic performance in a lazy language like Haskell, reasoning about cache-oblivious algorithms requires some new techniques.
Succinct data structures let us work directly on near-optimally compressed data representations without decompressing.
How can we derive new functional data structures from these techniques? Applications include such diverse areas as speeding up Haskell’s venerable Data.Map, handling “big data” on disk without tuning for hardware, and parsing JSON faster and in less memory.
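To make "answering queries without a pointer-based index" concrete, here is a toy rank dictionary in the spirit of succinct structures: keep the bit vector plus small per-block counts, and answer rank queries from those. Everything below is my own illustration, not material from the talk; real succinct structures pack bits and counts far more tightly than this list-based sketch.

```haskell
-- A bit vector with per-block running 1-counts for rank queries.
data RankDict = RankDict [Bool] Int [Int]

-- Build a dictionary with blocks of k bits each.
mkRank :: Int -> [Bool] -> RankDict
mkRank k bs = RankDict bs k (scanl (+) 0 (map ones (chunks k bs)))
  where
    ones = length . filter id
    chunks _ [] = []
    chunks n xs = take n xs : chunks n (drop n xs)

-- rank1 d i = number of 1 bits among the first i bits:
-- a precomputed block sum plus a scan of at most one block.
rank1 :: RankDict -> Int -> Int
rank1 (RankDict bs k sums) i =
  let (q, r) = i `divMod` k
      inBlock = length (filter id (take r (drop (q * k) bs)))
  in sums !! q + inBlock
```

Rank (together with its companion, select) is the workhorse query that succinct trees, bit-packed maps, and semi-indexed JSON parsers are built on.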
Discrimination is Wrong: Improving Productivity
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 60 Mins
Keynote
Advanced
This talk is a case study in library design in Haskell.
Fritz Henglein has shown through a number of excellent papers how to use “discrimination” to do lots of things in O(n): Sorting many more data types than you’d expect, table joins, etc.
In the process of optimizing this approach and wrapping it up in a form that can be easily consumed, we’ll take a lot of detours through the different ways you can think about code when optimizing Haskell.
- We’ll need some category theory, from a deeper understanding of monoids to Day convolution.
- We’ll need to consider final and initial encodings.
- We’ll need to drift down to low level system concerns from building a custom foreign prim to nesting unsafePerformIO within unsafePerformIO.
- We’ll need properties of laziness from productivity to IVars.
Along the way we’ll find and fix a small problem with the initial discrimination paper, which opens the door to streaming results, rather than having to wait until all the input is ready.
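A toy version of the starting point conveys the shape of the approach. This is my own simplification, not the real `discrimination` library: Henglein's technique uses mutable buckets to reach O(n), while the `IntMap` here costs O(n log n) but keeps the sketch short.

```haskell
import qualified Data.IntMap.Strict as IM

-- Group values by an Int key; groups come out in ascending key order,
-- and values within a group keep their input order.
discInt :: [(Int, v)] -> [[v]]
discInt kvs =
  IM.elems (IM.fromListWith (flip (++)) [ (k, [v]) | (k, v) <- kvs ])

-- Discriminators compose: to discriminate by a pair, discriminate by
-- the first component, then refine each group by the second.
discPair :: [((Int, Int), v)] -> [[v]]
discPair kvs =
  concatMap discInt (discInt [ (a, (b, v)) | ((a, b), v) <- kvs ])

-- Sorting falls out of discrimination by using each key as its own value.
sortInts :: [Int] -> [Int]
sortInts xs = concat (discInt [ (x, x) | x <- xs ])
```

The compositionality is the point: discriminators for sums, products, and recursive types assemble from pieces like these, which is how so many unexpected types end up sortable in linear time.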
Propagators
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 45 Mins
Demonstration
Intermediate
There are a lot of algorithms that revolve around iterating a form of information propagation until it attains a deterministic fixed point. CRDTs, Datalog, SAT solving, functional reactive programming, and constraint programming all fit into this mold.
One framework for these sorts of algorithms is the notion of a “propagator” due to Sussman and Radul, but until now little rigor has been applied to understanding when such algorithms terminate with consistent results. Another framework is Lindsey Kuper’s work on the notion of “lattice variables” (LVars), which addresses termination, parallelism and eventual consistency well, but not iteration.
By blending these frameworks, I’ll build up a series of sufficient conditions for propagators to terminate with consistent results and proceed to show how we can use this common framework to steal insights and quirks from each individual domain to try to optimize the rest.
Logic Programming à la Carte
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 45 Mins
Keynote
Intermediate
I've been working on a logic programming framework in Haskell, called guanxi (關係) with an eye towards scalability. To build it I leaned heavily on my previous work on propagators and a bunch of other little bits and pieces of algebra and category theory in the design process. A number of patterns have arisen repeatedly throughout the process of building this library. I'll give a tour through the current state of guanxi and try to extract some of the more reusable bits of its design for your inspection.
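As generic background rather than guanxi's actual API: Haskell's list monad already captures the basic flavour of logic programming, namely nondeterministic choice with backtracking by exhaustive search.

```haskell
import Control.Monad (guard)

-- All Pythagorean triples with components up to n.
triples :: Int -> [(Int, Int, Int)]
triples n = do
  a <- [1 .. n]                   -- choice point
  b <- [a .. n]                   -- avoid mirror duplicates
  c <- [b .. n]
  guard (a * a + b * b == c * c)  -- fail (and backtrack) unless it holds
  pure (a, b, c)
```

Scaling this idea up is where the interesting work lives: fair interleaving, pruning via propagation, and sharing work across branches, which is the territory guanxi explores.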
Propagators
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 120 Mins
Combo
Intermediate
There are a lot of algorithms that revolve around iterating a form of information propagation until it attains a deterministic fixed point. CRDTs, Datalog, SAT solving, functional reactive programming, and constraint programming all fit into this mold.
One framework for these sorts of algorithms is the notion of a “propagator” due to Sussman and Radul, but until now little rigor has been applied to understanding when such algorithms terminate with consistent results. Another framework is Lindsey Kuper’s work on the notion of “lattice variables” (LVars), which addresses termination, parallelism and eventual consistency well, but not iteration.
By blending these frameworks, I’ll build up a series of sufficient conditions for propagators to terminate with consistent results and proceed to show how we can use this common framework to steal insights and quirks from each individual domain to try to optimize the rest.
Transients
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 45 Mins
Invited Talk
Beginner
Haskell often lends other programming languages ideas, but what can Haskell learn from other functional programming languages?
In this talk I’ll adapt Clojure’s transients to Haskell, and fix them up along the way with the help of Haskell’s type system features.
Transients are immutable data structures that can be made mutable in O(1) and frozen again in O(1). Unlike Haskell’s arrays, which support superficially similar freeze and thaw operations, this doesn’t ‘destroy’ the frozen structure.
I’ll show the naive translation, a not-so-dirty trick to make it fast, how to layer a more Haskell-like mutable API on top, then borrow some more improvements from recent work on similar structures in Scala and Clojure.
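For contrast, the Haskell status quo the talk starts from looks roughly like this: build or thaw a mutable array, batch the updates in `ST`, and freeze the result. `runSTUArray` freezes without copying precisely because the mutable original can never escape the `ST` computation; transients instead let the frozen value and further mutation coexist safely. The `bump` function below is my own illustrative example, not code from the talk.

```haskell
import Data.Array.ST (runSTUArray, newListArray, writeArray)
import Data.Array.Unboxed (UArray, elems)

-- Increment every element, doing all mutation inside ST and freezing
-- the finished array in place at the end.
bump :: [Int] -> UArray Int Int
bump xs = runSTUArray (do
  arr <- newListArray (0, length xs - 1) xs
  mapM_ (\i -> writeArray arr i (xs !! i + 1)) [0 .. length xs - 1]
  pure arr)
```

The limitation the talk addresses is visible here: once frozen, going back to mutation means either an O(n) copy or an unsafe thaw that invalidates the frozen view.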
Let's Lens
Edward Kmett, Research Engineer, Machine Intelligence Research Institute; Tony Morris, Software Engineer, Simple Machines · 1 year ago
Sold Out! · 480 Mins
Workshop
Intermediate
Let's Lens presents a series of exercises, in a similar format to the Data61 functional programming course material. The exercises centre on the concept of lenses, initially proposed by Foster et al. to solve the view-update problem for relational databases.
The theories around lenses have been advanced significantly in recent years, resulting in a library, implemented in Haskell, called lens.
This workshop will take you through the basic definition of the lens data structure and its related structures such as traversals and prisms. Following this we implement some of the low-level lens library, then go on to discuss and solve a practical problem that uses all of these structures.
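The definition those first exercises build toward is the van Laarhoven encoding used by the lens library itself: a lens is just a function polymorphic in a functor, and `view` and `over` fall out by instantiating that functor to `Const` and `Identity`. A minimal sketch:

```haskell
{-# LANGUAGE RankNTypes #-}
import Data.Functor.Const (Const (..))
import Data.Functor.Identity (Identity (..))

-- A lens is a functor-polymorphic function; no dedicated record type.
type Lens s t a b = forall f. Functor f => (a -> f b) -> s -> f t

-- Read through a lens by collecting the focus in Const.
view :: Lens s t a b -> s -> a
view l = getConst . l Const

-- Modify through a lens by running the update in Identity.
over :: Lens s t a b -> (a -> b) -> s -> t
over l f = runIdentity . l (Identity . f)

-- A lens onto the first component of a pair.
_1 :: Lens (a, c) (b, c) a b
_1 f (a, c) = fmap (\b -> (b, c)) (f a)
```

Traversals and prisms, covered later in the workshop, arise from the same shape by strengthening `Functor` to `Applicative` or restricting the profunctor involved.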
Functionally Oblivious and Succinct
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 1 year ago
Sold Out! · 60 Mins
Talk
Advanced
This talk provides a whirlwind tour of some new types of functional data structures and their applications.
Cache-oblivious algorithms let us perform optimally for all cache levels in your system at the same time by optimizing for one cache for which we don’t know the parameters. While Okasaki’s “Purely Functional Data Structures” taught us how to reason about asymptotic performance in a lazy language like Haskell, reasoning about cache-oblivious algorithms requires some new techniques.
Succinct data structures let us work directly on near-optimally compressed data representations without decompressing.
How can we derive new functional data structures from these techniques? Applications include such diverse areas as speeding up Haskell’s venerable Data.Map, handling “big data” on disk without tuning for hardware, and parsing JSON faster and in less memory.
Let's Lens
Edward Kmett, Research Engineer, Machine Intelligence Research Institute; Tony Morris, Software Engineer, Simple Machines · 2 years ago
Sold Out! · 480 Mins
Introductory Workshop
Beginner
Let's Lens presents a series of exercises, in a similar format to the Data61 functional programming course material. The exercises centre on the concept of lenses, initially proposed by Foster et al. to solve the view-update problem for relational databases.
The theories around lenses have been advanced significantly in recent years, resulting in a library, implemented in Haskell, called lens.
This workshop will take you through the basic definition of the lens data structure and its related structures such as traversals and prisms. Following this we implement some of the low-level lens library, then go on to discuss and solve a practical problem that uses all of these structures.
Logic Programming à la Carte
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 2 years ago
Sold Out! · 30 Mins
Invited Talk
Beginner
I've been working on a logic programming framework in Haskell, called guanxi (關係) with an eye towards scalability. To build it I leaned heavily on my previous work on propagators and a bunch of other little bits and pieces of algebra and category theory in the design process. A number of patterns have arisen repeatedly throughout the process of building this library. I'll give a tour through the current state of guanxi and try to extract some of the more reusable bits of its design for your inspection.
Let's Lens
Tony Morris, Software Engineer, Simple Machines; Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 3 years ago
Sold Out! · 480 Mins
Introductory Workshop
Intermediate
Let's Lens presents a series of exercises, in a similar format to the Data61 functional programming course material. The exercises centre on the concept of lenses, initially proposed by Foster et al. to solve the view-update problem for relational databases.
The theories around lenses have been advanced significantly in recent years, resulting in a library, implemented in Haskell, called lens.
This workshop will take you through the basic definition of the lens data structure and its related structures such as traversals and prisms. Following this we implement some of the low-level lens library, then go on to discuss and solve a practical problem that uses all of these structures.
Combinators Revisited
Edward Kmett, Research Engineer, Machine Intelligence Research Institute · 3 years ago
Sold Out! · 45 Mins
Invited Talk
Intermediate
Back in the 80's, one approach to compiling functional programming languages was to compile down to a fixed set of combinators such as SKI. John Hughes' initial work on supercombinators changed the way folks thought about compiling functional languages, turning them away from this approach in favour of customising the combinator set to each particular program. Then Lennart Augustsson's work on implementing supercombinators more efficiently sealed the deal. GHC's compilation technique is a descendant of this school of thought.
But what did we give up to get to where we are? Let's explore a bit of alternate history.
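For reference, the approach being revisited: compiling to a fixed combinator basis turns programs into trees of S, K, and I, reduced by a tiny rewriter like the sketch below (my own minimal illustration, not material from the talk).

```haskell
-- Combinator terms: S, K, I, and application.
data SKI = S | K | I | App SKI SKI deriving (Eq, Show)

-- One step of weak head reduction, if a redex exists at the head.
-- The rules: I x = x;  K x y = x;  S f g x = (f x) (g x).
step :: SKI -> Maybe SKI
step (App I x)                 = Just x
step (App (App K x) _)         = Just x
step (App (App (App S f) g) x) = Just (App (App f x) (App g x))
step (App f x)                 = fmap (`App` x) (step f)
step _                         = Nothing

-- Reduce until no head redex remains.
normalize :: SKI -> SKI
normalize t = maybe t normalize (step t)
```

The appeal of this old approach is exactly what the rewriter shows: the whole evaluator is a handful of rules with no environments or closures, which made it attractive for graph reduction hardware and keeps it attractive for formal reasoning.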