Igalia's Brian Kardell and Eric Meyer chat with Oliver Medhurst (CanadaHonk) about their novel AOT JavaScript engine, Porffor.

Transcription

  • Brian Kardell: Okay. Hi, I'm Brian Kardell. I'm a developer advocate at Igalia.
  • Eric Meyer: And I'm Eric Meyer, also a developer advocate at Igalia. And this week, we have a special guest. Oliver Medhurst, please introduce yourself.
  • Oliver Medhurst: Yeah. I'm Oliver Medhurst. I used to work on Firefox and I'm now doing my own thing full time.
  • Eric Meyer: Nice. Yeah.
  • Brian Kardell: I got to know you while writing blog posts that examined the commit histories. I was fascinated because a few individuals show up among the companies. Do you know what I mean? So it's like, oh look, there's Red Hat, there's Igalia, there's Sony, and then there are just a couple of individuals, and you were among those. I went and looked it up last night: you had 4.28% of all contributions to Mozilla Central in 2023. It was kind of astounding.
  • Oliver Medhurst: Yeah, I remember seeing your post and I was also surprised by that number. I guess it doesn't feel like a lot just doing one at a time.
  • Brian Kardell: But it clearly was. It's really cool. Clearly you do kind of a Herculean amount of commits and work, and we invited you on not just because of that, but because of all that Herculean work energy that you have. We also have been doing this series on web ecosystem health, and we have a subseries in that called Novel Browsers and Engines. So far that's meant web browsers, but now we're going to start adding JavaScript engines to that list as well. So maybe you can explain the state of JavaScript engines and how you fit into that?
  • Oliver Medhurst: Sure. I guess from an industry perspective, you have the main three: V8, JavaScriptCore, and SpiderMonkey. Well, I think V8 is probably the most notorious, being used in Chromium and Node and embedded in many things we wouldn't expect.
  • Brian Kardell: JSC is now in Bun, right?
  • Oliver Medhurst: Yeah. It's interesting. I guess it's an upcoming alternative, kind of, to Node, focusing I guess on startup performance and trying to push forward its own values. It's nice to see something new.
  • Eric Meyer: So, sorry. So we have Node adopting V8 from Chromium, and then Bun is adopting JavaScriptCore from WebKit? Is that-
  • Oliver Medhurst: Yeah.
  • Eric Meyer: Cool. Is SpiderMonkey being adopted by anybody?
  • Oliver Medhurst: I don't think so, not in a project quite that major.
  • Eric Meyer: But it does show up in other places?
  • Oliver Medhurst: Yeah, I think it's used a lot in WebAssembly, which I guess we'll talk about later.
  • Eric Meyer: I don't know anything about this stuff. I know JavaScript, I can write a little bit of JavaScript. Sometimes I can't even write it in a not terrible way. But you are writing more than JavaScript. What are you writing and what is it called? I can't pronounce it.
  • Oliver Medhurst: Yeah, so I'm writing, I guess, a JS engine called Porffor, which is unique because it compiles JS ahead-of-time to WebAssembly, rather than other engines, which do just-in-time compiling or interpreting.
  • Brian Kardell: So it's a little bit like Java and Bytecode, I guess, right? You're just compiling down to WebAssembly and delivering the WebAssembly because it's smaller, I guess, and faster?
  • Oliver Medhurst: Yeah, I guess it's kind of like that where I guess I hope WebAssembly is more of a universal (in air quotes) binary than JVM or something.
  • Eric Meyer: Okay. So before we go any further, Porffor, did I get that right?
  • Oliver Medhurst: Yeah.
  • Eric Meyer: P-O-R-F-F-O-R, apparently is a Welsh word. What does it mean and why'd you pick that?
  • Oliver Medhurst: It means purple.
  • Eric Meyer: Okay.
  • Oliver Medhurst: And I picked it because no other JavaScript engine [logo] is purple colored.
  • Eric Meyer: That's a perfectly good reason. I actually like that reason better than if it had some convoluted backstory about action figures or something. It's like: yep, none of the others are purple, I like purple, we'll call it that, but in Welsh. Cool. And you said that it's an ahead-of-time compiler instead of a just-in-time compiler. Why did you make that choice?
  • Oliver Medhurst: I guess when I first started it was mostly just for fun, as a research project, because I think some people have tried before but not really succeeded, at least as of when I started making it about a year ago. So it's quite easy to dismiss the idea as infeasible, but I considered it, I thought it was possible, and I thought I might as well spend my free time trying it out.
  • Brian Kardell: There have been for a really long time, transpilers, right?
  • Oliver Medhurst: Yeah.
  • Brian Kardell: Popular ones. And some of them can create Wasm at least, but none of them start with JavaScript, right?
  • Oliver Medhurst: Yeah.
  • Brian Kardell: So that's really interesting. I'm curious, because honestly, I know that Wasm originated back in Mozilla Labs. I think it was Dave Herman and his team who originally came up with asm.js, right? It was a limited subset of JavaScript that you could, I believe, annotate, more or less?
  • Oliver Medhurst: I think it was Emscripten, which was trying to compile C++ to JS and stuff, where that asm.js format was first conceived. I'm not quite sure of the full history, but I think originally it was just transpiled to JavaScript, and the performance was not amazing.
  • Brian Kardell: I don't really know. I remember way back in 2014 or whatever, there were a lot of demos, and I think those weren't even asm.js yet, but there were some games and things. Yeah, I don't really know, but I know that the asm.js stuff came from figuring out how you do the optimizations, and then I think Wasm came from that. We should really know this, but I know that Lin Clark was also involved in that. I don't know if you know Lin, but she lived here in Pittsburgh where I live, and I've seen her give a couple of talks here in my home city, which is cool. So Porffor is different. It is itself written in JavaScript, so it's like JavaScript-ception, sort of?
  • Oliver Medhurst: Yeah, it's written mostly in JavaScript with a bit of TypeScript for some interesting stuff, which I won't get into. So it could theoretically self-host, like some compilers do, where it could run itself with itself, which would be very interesting, but it's not conformant enough for that yet.
  • Brian Kardell: Yeah. I actually was just going to mention the same thing about self-hosting. We probably have a diverse audience, I would imagine, so can you give a sketch of self-hosting for people who might not know?
  • Oliver Medhurst: Yeah, so I guess it comes up more in the compiled languages space, where once you've written the compiler for a new language you're making, you can write that compiler in its own language and then compile it with itself. That's kind of good for dogfooding, in a way, as you are trying out your own language and compiler. And it also helps, I guess, inspire the language.
  • Eric Meyer: Although it reminds me of a number of classic myths about worms eating their own tails and that sort of thing.
  • Oliver Medhurst: For sure. It's a big rabbit hole.
  • Eric Meyer: And it always kind of blows my mind. It's like, okay, you wrote the thing, and the thing you wrote can make itself. Wow, okay, wait, mind blown. I don't understand, how does that work? But this is how we get compilers. This is how we get all the tooling that we have: people who can figure this stuff out, like you. But why did you write a JavaScript compiler in JavaScript? What was the advantage there?
  • Oliver Medhurst: I guess there are two reasons. One was that potential self-hosting benefit long term, and I guess the other was just for fun, because I can. And in a way, JavaScript is probably the language I know best, and I think once you know it well, you can just kind of bend it to your will.
  • Brian Kardell: So there's a little bit of why did you climb Mount Everest? Because it was there kind of?
  • Oliver Medhurst: Yeah.
  • Brian Kardell: Yeah?
  • Eric Meyer: That sort of thing just really fascinates me. I would understand writing, for example, Processing: there's a Processing interpreter that was written in JavaScript, or at least there are versions of Processing written in JavaScript, so that you can use JavaScript to interpret a different language. But you're using JavaScript to interpret JavaScript, and that always fascinates me when people are able to do that kind of thing. The ability to do it completely floors me, and then the wanting to do it is also really intriguing to me.
  • Oliver Medhurst: Yeah, I guess to mention some prior work, there is, I don't know if you know it, engine262, which is a JS engine written in JS, which is-
  • Eric Meyer: No, I know about test262, but I didn't know about engine262. Where does that come from?
  • Oliver Medhurst: Yeah, that's from... oh, I forget the history of it, but it's mostly designed, I think, to help with just exploring the language, with a nice-to-use playground where you can easily modify the language itself. But yeah, I guess it's kind of similar in the regard that both are JS running JS, but that interprets whereas I compile, so it is quite different internally, I guess.
  • Eric Meyer: Okay. I'll have to take your word for it. But I want to go back to, I don't know, maybe you feel like you answered this, but I don't know if I've grasped it yet. You wanted to do the ahead-of-time compile. Are there advantages to doing ahead-of-time versus just-in-time? Are there any drawbacks?
  • Oliver Medhurst: Yeah, I think they are definitely two different things for different uses. I think ahead-of-time is at least potentially very good if you have some JavaScript you already know well in advance, if you're running it on a server or something. Just-in-time compilers are good because they have a very small startup or compile cost, whereas most ahead-of-time compilers, if you've tried compiling another project before or something, can sometimes take seconds, and you wouldn't really want your browser spending seconds just compiling JavaScript when you open a website.
  • Brian Kardell: Once it's compiled... this leads to my next question, though. On the website it says, 'and then to native.'
  • Oliver Medhurst: Yeah, I guess I also have a feature in the engine where once it's generated the WebAssembly, it can then compile that to C, and then I can pass that to Clang or something to make real native binaries.
  • Brian Kardell: Wow, that's incredible. It's basically the opposite of Emscripten and stuff, right? Where they're taking C and outputting Wasm?
  • Oliver Medhurst: Yeah.
  • Brian Kardell: Can you do the whole snake eat its tail where you have one feed the other and have the other one feed back and see how long it goes?
  • Oliver Medhurst: Yeah, I haven't-
  • Brian Kardell: And it's probably-
  • Oliver Medhurst: ... tried that, but it's definitely an interesting idea. Yeah, you could compile it to C and then compile that back to WebAssembly, and just do that forever.
  • Brian Kardell: Just to see what happens. Like if it's lossless?
  • Oliver Medhurst: Yeah.
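
To make the pipeline concrete, here is a minimal sketch of the kind of small program you might push through it, with the stages described in comments. The stage names are illustrative only, not Porffor's actual CLI.

```js
// add.js: a tiny program to feed through the pipeline Oliver describes.
function add(a, b) {
  return a + b;
}

console.log(add(2, 3)); // 5

// Conceptually, the stages are (names are illustrative, not real commands):
//   1. add.js   -> Porffor        -> add.wasm (ahead-of-time compiled WebAssembly)
//   2. add.wasm -> Wasm-to-C pass -> add.c    (C translation of the Wasm)
//   3. add.c    -> Clang          -> add      (a small native binary)
```
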
  • Eric Meyer: Either that or that's how we get Skynet. So with ahead-of-time, the advantage there, if I understand you correctly, is for something you don't need to compile in real time: something where you know, okay, I want to compile this to a binary of some sort or a WebAssembly package, and then I'm going to put it on a server, or for that matter, if it were small enough, you could send it to a browser, because you've taken all of the compile time up front, right? You've done it before anybody actually runs the code, and then it's done; nothing ever has to be compiled again for that particular chunk of code. Whereas the just-in-time compilers in browsers, because they're getting chunks of JavaScript and there's interaction and everything needs to be done really, really quickly, like you said, have very little startup time, but they have to compile as they go in order to not have everything grind to a halt. So what do you think of as good use cases for the ahead-of-time compile strategy? What would you be coding, or what could you be coding, that would benefit from ahead-of-time?
  • Oliver Medhurst: Yeah, I think especially with server-side JavaScript, it has good potential.
  • Brian Kardell: To draw a sort of silly analogy, almost: we have static site generators, where you can generate these static sites. Alternatively, you could build those pages on every request, and then you're doing the processing every time. The time those threads are blocked and that memory is used is longer, so it affects overall performance and scalability. So imagine the same thing, but with JavaScript code, especially code that would be in that same kind of situation: processing things on the server, or even during a build, maybe. You could probably speed up builds as well. Do you think that's a poor analogy, or silly?
  • Oliver Medhurst: No, I think it's a good analogy. Yeah, it's like that kind of server-side generation. Yeah, I do think this is kind of similar to that.
  • Brian Kardell: The fastest work is the stuff that you already did, right?
  • Oliver Medhurst: For sure. Yeah. I guess related to that, I have noticed that when compiling natively, at least for smaller, simpler apps, a nice benefit is the memory usage. If you run something with Node.js or Bun, since that's using a JIT compiler, it can easily use at least 40 megabytes at once, which isn't terrible, but when natively compiled, I think my tool uses one.
  • Brian Kardell: Yeah, yeah, yeah.
  • Eric Meyer: Okay.
  • Oliver Medhurst: So for resource-constrained environments.
  • Eric Meyer: Yeah.
  • Brian Kardell: So maybe it would be good for embedded systems too.
  • Oliver Medhurst: Yeah, that's definitely a good possibility.
  • Brian Kardell: Yeah. So you run test262 on this engine?
  • Oliver Medhurst: Yep.
  • Brian Kardell: Which surprised me that you got that all working already and that you have nice graphs and everything. Yeah. How are you doing on that?
  • Oliver Medhurst: Yeah, I'm currently passing 35% of it, which is a nice round number.
  • Brian Kardell: I always wonder with these things: do you think that some of your performance is due to the fact that you've only passed... Do you think that as you get more and more conformant, your gains will slow down?
  • Oliver Medhurst: Yeah, that's definitely a concern I have. I'm making sure to... I guess my main focus is conformance, but I'm trying not to just rush carelessly. Well, I added classes recently, which is a relatively isolated feature, but I'm ensuring, whilst doing it, that I'm not slowing anything existing down.
  • Brian Kardell: I was looking at the charts on your website, and some of the, well, let's say the website is Porffor, which is tricky to spell if you're not Welsh, I suppose: P-O-R-F-F-O-R dot com. And on there you say the Wasm size is 32 times smaller than Javy?
  • Oliver Medhurst: Javy is a Bytecode Alliance project which compiles JS to WebAssembly, but it doesn't compile in the same way that Porffor does. Instead of ahead-of-time compiling, it bundles, I think, QuickJS, a JavaScript interpreter, into the Wasm binary, and just bundles the source code with that.
  • Brian Kardell: Okay, so you are way faster than that.
  • Oliver Medhurst: Yeah.
  • Brian Kardell: So far.
  • Oliver Medhurst: Yeah, I guess that's because instead of interpreting, I'm fully compiling, so that will probably always be much better, especially in WebAssembly where you're already in a kind of resource constrained sandbox environment.
  • Eric Meyer: For me, this is just a level of work that is hard for me to even comprehend. How big is test262 anyway? I don't know what passing 35% actually means.
  • Oliver Medhurst: I think test262 has, I want to say around 50,000 tests individually.
  • Eric Meyer: Okay.
  • Oliver Medhurst: But I wouldn't say they're very evenly distributed, because I want to say at least probably 5,000 of those are Temporal, which is interesting.
  • Eric Meyer: Oh yeah, I hear some of those are coming out soon anyway.
  • Brian Kardell: Web Platform Tests is like that as well. There are, I don't know, a few hundred thousand tests, sub-tests, that are just about encoding or something like that. So yeah, the tests aren't evenly distributed. I think anybody who does testing knows that's probably the case when you're talking about individual tests and sub-tests. Well, I don't know if you still can, but in the past you could run test262 in your browser, and it took a long time.
  • Eric Meyer: When you say a long time?
  • Brian Kardell: I never had the patience to wait for it to finish.
  • Eric Meyer: Okay. What's the longest you ever waited?
  • Brian Kardell: I don't know. I don't want to go on the record and say how long, because I'm probably wrong. When you have that many tests, some of them time out, and then I'm like, well shoot, that's the one I was interested in.
  • Eric Meyer: Okay. So how long does it take to run test262 in Porffor?
  • Oliver Medhurst: Well, I feel that's interesting because I want to say around a month ago, it only took probably five minutes, but since conformance has improved, it takes around 20 minutes now.
  • Eric Meyer: Okay.
  • Oliver Medhurst: Because everything stops failing so fast.
  • Eric Meyer: Okay. Right, fair enough. But then, at a certain point, I guess you'll have to start it at night and then see in the morning how long it took or if it failed out or whatever, that's what browser people do.
  • Oliver Medhurst: Yeah, I do. I guess, to mention a related project, I run test262.fyi, which is kind of like wpt.fyi: it shows test262 results for many engines. I run that daily at night, and it takes, let's say, six hours for every single engine to run in parallel.
  • Eric Meyer: Wow.
  • Brian Kardell: Wow. Today I learned about test262.fyi. That's pretty cool.
  • Eric Meyer: So what that means is the longest run, whichever JavaScript engine takes the longest to get through all of test262, since they're running in parallel, takes about six hours?
  • Oliver Medhurst: Yeah, that's actually mine, embarrassingly, but that's just because of the way it runs: it has to spin up Node.js every single time, which is probably half of that time, since it's 50,000 tests, let's say.
  • Eric Meyer: Right.
  • Brian Kardell: But also would you not expect it to because it's doing the work upfront, right?
  • Oliver Medhurst: Yeah.
  • Brian Kardell: So you would expect it to take longer anyway, because it's analyzing harder, right? It's looking for efficiencies, right? Yeah, I say 'looking for efficiencies' like it's an intelligent thing, when it's not even that. Now that we have LLMs, we need to be careful about our language. But it's interesting that Rhino and Nashorn are on here. Rhino is the original one that was packaged with Java; through Java you could execute JavaScript. A long, long time ago I did a project that was basically a server that used JavaScript but was hosted in Java, and the performance was very, very, very terrible. And it takes a long time on here, but not as bad as Nashorn, which I think is the replacement that everybody said was going to be so much better. Is it really slower? Is Nashorn slower than Rhino?
  • Oliver Medhurst: I guess I wouldn't take how long it takes to run test262 as a benchmark, but I wouldn't be that surprised.
  • Brian Kardell: That's interesting.
  • Oliver Medhurst: I think both are not excellently maintained nowadays.
  • Brian Kardell: Okay. A thing that's interesting to me is that you compile to Wasm, right? But Wasm itself, it's not like bytecode, right? You could write Wasm, right?
  • Oliver Medhurst: Yeah. I have, for better or for worse, but yeah, you could write it by hand, though I would not suggest anyone does. Wasm does have a binary format.
  • Brian Kardell: Oh, it does have binary format?
  • Oliver Medhurst: Yeah, which most things use. Yeah, I think you mean W-A-T, WAT, which is the text-based format, which some people write in and which is used a lot for debugging and some ecosystem tools. But for actually shipping to browsers and stuff, people use the binary format, which compilers generate.
  • Brian Kardell: It is actually a lot like bytecode.
  • Oliver Medhurst: Yeah.
  • Brian Kardell: It's just a different VM basically.
  • Oliver Medhurst: Yeah.
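
As a small aside on the binary format Oliver mentions: when Wasm is shipped to a browser, it's the binary module that gets fetched and instantiated, while the WAT text format stays a tooling and debugging convenience. A minimal sketch of loading a compiled module follows; the file name and the exported function are hypothetical.

```js
// Fetch and instantiate a compiled Wasm binary in the browser.
// "/module.wasm" and the exported "main" function are hypothetical names.
const { instance } = await WebAssembly.instantiateStreaming(
  fetch("/module.wasm")
);

// Call an exported function, if the module provides one.
console.log(instance.exports.main?.());
```
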
  • Brian Kardell: So why did we invent a new VM? What was wrong with the JVM?
  • Eric Meyer: Sort of like the XKCD: we have all these VMs, we need a better VM; now there are 15 VMs. Probably more than that, but anyway. So, Brian was telling me that you recently got some funding.
  • Oliver Medhurst: Yes. Essentially from August, I will be working on this full time, which is very exciting.
  • Eric Meyer: So was that a crowdfunding campaign or where did that money come from?
  • Oliver Medhurst: Yeah, I'm being funded by Chris Wanstrath, who is a former GitHub CEO and co-founder, for an unannounced project.
  • Eric Meyer: Okay.
  • Oliver Medhurst: So fully funded through that.
  • Eric Meyer: Nice.
  • Brian Kardell: I believe you would recognize him from the very short list of guys who funded Ladybird?
  • Oliver Medhurst: Yeah. That's him.
  • Eric Meyer: Yep. Yep. And how long is the funding for, or is it sort of open-ended?
  • Oliver Medhurst: It's definitely a long-term thing.
  • Eric Meyer: Wow, that's really cool. Yeah, this is your full-time job now like you said. It's like 40 hours a week or more as is too often the case for us in this industry, just concentrating on doing this. That's really interesting. How did you get connected with Chris? Was there a submission process or somebody nominated you or how does that work?
  • Oliver Medhurst: He reached out to me, actually. I guess since I've been kind of developing it in public, in a way; it's been open source since day one, and I've been tweeting about progress as conformance increases and I finish new features.
  • Eric Meyer: Wow, that's really cool that he's sort of got his finger on the pulse as it were, looking around and seeing what people are working on and what could use funding. Yeah, that's really neat. So what are your plans? Obviously your long-term plan is to keep working on this, but sort of at a more detailed level, what are you thinking of doing next and what's your roadmap as it were?
  • Oliver Medhurst: Yeah, I think definitely the main plan for now is focusing on conformance, or trying to get at least the most commonly used language features working. I only implemented classes recently, this month, and there are some other wider, big things which are unsupported but could potentially be done in the future. Temporal is very cool, but I probably wouldn't focus on it for now at least, since-
  • Eric Meyer: Yeah, no, that's fair.
  • Oliver Medhurst: ... no one really uses it.
  • Eric Meyer: Right. Nobody else has implemented it. And also, the latest word is that whole sections of the specification are being pulled out. The API will be much smaller, so by the time you get around to Temporal, it'll probably be a lot easier to implement. But yeah, so classes, does that include private classes, out of curiosity?
  • Oliver Medhurst: Yeah, I do have private field support.
  • Eric Meyer: Private field, sorry. Yeah, cool. What would you say is the biggest thing that's sort of missing at the moment? That someone would say, wait, you don't have that? How can I use this?
  • Oliver Medhurst: Oh, I think probably generators; I don't have them right now, and they're probably the next big thing I'll work on, because that sort of control flow can be tricky to deal with.
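
To give a sense of why generators are awkward for an ahead-of-time compiler, here is a hedged sketch of the kind of state-machine lowering a compiler might perform; it is purely illustrative and not Porffor's actual strategy.

```js
// Original generator: execution can pause at each yield.
function* counter(limit) {
  for (let i = 0; i < limit; i++) {
    yield i;
  }
}

// One possible lowering: an explicit iterator object that keeps the
// "paused" state (here, just `i`) in a closure. Real lowerings must also
// handle return(), throw(), and yields nested in arbitrary control flow.
function counterLowered(limit) {
  let i = 0;
  return {
    next() {
      if (i < limit) {
        return { value: i++, done: false };
      }
      return { value: undefined, done: true };
    },
    [Symbol.iterator]() {
      return this;
    },
  };
}

for (const n of counterLowered(3)) console.log(n); // 0, 1, 2
```
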
  • Eric Meyer: And then the other thing I'm really curious about is what was the thing you implemented that turned out to be either surprisingly easy or surprisingly harder than you thought it would be before you started implementing?
  • Oliver Medhurst: Good question. I think probably the prototype chain. Because I never supported it starting off, I kind of presumed all built-ins are sealed, because I hope at least most modern JS nowadays doesn't mess with prototypes. But I did that recently for class support, and it was surprisingly nice to implement.
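
For readers unfamiliar with what "messing with prototypes" looks like, here is a small example of the kind of code that forces an engine to model the prototype chain dynamically rather than treating built-ins as sealed:

```js
// Extending a built-in prototype at runtime. An engine that assumed
// Array.prototype was sealed could resolve (or reject) method calls at
// compile time; code like this makes the lookup dynamic.
Array.prototype.last = function () {
  return this[this.length - 1];
};

console.log([1, 2, 3].last()); // 3
```
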
  • Brian Kardell: I like to mess with prototypes. So there's an interesting thing that I'm curious about. I just mentioned you in the same sort of breath as Ladybird, and I see similarities in the projects, in that they're one person's passion project that aspires, ultimately, to be a competitor. And I don't think, in either case, it started out that way. In fact, when we had him on the podcast, I should look up exactly what he said, but I think he said basically, 'Yeah, Ladybird is never going to be your daily driver. That's not the goal, it's not the aim. Maybe we'll get it complete enough, but that's not what it's trying to be.' And probably it's the same with you; you were just like, 'I don't know, I want to see if I can do it and how far I can get, and it's just fun.' Am I right? Did you start off with ambitions to really write something that would be in competition with V8 and-
  • Oliver Medhurst: No, I think you're right. It started as, and will be for at least probably a few years, a research project first and foremost. I'm not aiming to compete with V8 or something, but I think it would be nice to have, I guess, an alternative which focuses on different things, that it does better and worse at.
  • Brian Kardell: But this is what's really very, very interesting to me: for all of these engines, the obvious question is, okay, what's the niche it can fill that helps it survive and thrive, and maybe eventually escape that boundary because it gets attention and funding and can grow beyond what it otherwise would? So for a lot of them, it's about looking for some niche. And a question that's come up a bunch of times, people ask me on Twitter even: what's the subset... You could say how many tests it passes, but do I need Temporal? For a whole bunch of stuff, maybe not. So what percentage do you need, and which things? That's a tricky question, right? Because it kind of depends which things. Do you think there is a sweet spot of the JavaScript language where, for whatever reason, it would be really, really nice to use what you have, and you don't need anywhere close to 100% conformance? Where do you think that sweet spot is? Is it 50%, 40%, 80%?
  • Oliver Medhurst: Yeah. I think my aim right now is at least around 50%, definitely. Because there's some stuff like Hermes, which is, I guess, Meta's JS engine for React Native and stuff like that, and they're at around 50%. Once you get to that point, you can just use the ability to transpile, which kind of takes the weight off.
  • Brian Kardell: Oh, interesting. So if you get to, say, ES3.1 compliance, then there are polyfills and transpilers pretty much from modern all the way down. I mean, it wouldn't be fun, but you could do it.
  • Oliver Medhurst: In theory. I wouldn't want to rely on it or use it long term, but if someone really wanted to use Porffor and was blocked by it not having a proposal or an ES2024 feature, I think it's definitely interesting to offer as an option.
  • Brian Kardell: So you're going to have a Porffor plugin or something?
  • Oliver Medhurst: Yeah, I guess potentially. We'll see how it goes.
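
To illustrate what leaning on a transpiler means in practice, here is a rough before-and-after for optional chaining and nullish coalescing. The "transpiled" output is illustrative of what tools like Babel emit, not copied from any specific tool, and it glosses over edge cases such as getters being evaluated more than once.

```js
const config = { server: { port: 3000 } };

// Modern (ES2020) source:
const port = config?.server?.port ?? 8080;

// Roughly what a transpiler could emit for an engine without those features
// (illustrative; real tools also guard against re-evaluating getters):
const portTranspiled =
  config != null && config.server != null && config.server.port != null
    ? config.server.port
    : 8080;

console.log(port, portTranspiled); // 3000 3000
```
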
  • Brian Kardell: I had to go look this up while you were talking, because we were talking about, oh, so it is a binary. So in May 1992, someone sent an email to www-talk, which is the mailing list set up to discuss the World Wide Web, which was in its very infancy. And he said, 'I would like to know whether anybody has extended the World Wide Web such that it is possible to start arbitrary programs by hitting a button in a World Wide Web browser.' And Tim Berners-Lee replied to this, back in 1992, and said, 'Very good question. The problem is that of a programming language. You need something really powerful, but at the same time, ubiquitous. Remember, a facet of the web is universal readership. There is no universal interpreted programming language, but there are some close tries: Lisp, shell scripts. You also need something which can run in a very safe mode to prevent virus attacks. It should be public domain. A pre-compiled standard binary form would be really cool too. Sadly, it isn't there yet.' Yeah, apparently it's almost here now, I guess.
  • Oliver Medhurst: Yeah, I think Wasm is definitely very interesting. I don't think it will replace JavaScript or anything, but I think in some use cases it's definitely very appealing.
  • Brian Kardell: Yeah, it's interesting. It can speed up things in the language itself, but it can't necessarily help with the things that aren't in the language. So let's say you have your Angular program or whatever, your Angular website: Porffor is not going to help so much with all the stuff that's in the browser. It's not going to make your fetch go faster. It's not going to make your DOM manipulations go faster, right?
  • Oliver Medhurst: Yeah. At least for now, I don't really envision people shipping Porffor binaries for websites. Unless it suddenly becomes very appealing, it's not really a goal.
  • Brian Kardell: But because of the way some things like Preact or React work, where they do all the work not in the DOM tree and then build a patch set, more or less, do you think it could improve the performance of that, because they're just pure functions?
  • Oliver Medhurst: Yeah, potentially.
  • Brian Kardell: Have you tried anything like that? I'm curious.
  • Oliver Medhurst: I haven't tried anything touching the DOM, but I have done some experiments just comparing the performance of, say, a random Fibonacci function, a simple benchmark, running the plain JS version versus the Porffor-compiled version. And it is interesting to see the different overheads.
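
A toy benchmark of the sort Oliver describes might look like the following: the same file can be run directly under Node or Bun and then through an ahead-of-time compiler, comparing wall-clock time and memory. This is just a sketch of the shape of such a test, not one of Porffor's actual benchmarks.

```js
// Naive recursive Fibonacci: lots of pure computation, no I/O or DOM.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

const start = performance.now();
const result = fib(30);
const elapsed = performance.now() - start;

console.log(`fib(30) = ${result} in ${elapsed.toFixed(1)} ms`);
```
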
  • Brian Kardell: Yeah. Yeah. I know you have this playground on your website. Maybe we can mouth-blog what we're looking at here. There's a dropdown that has a number of small sample programs: sum of digits, factorial, prime numbers, Fibonacci. And then there's a second dropdown that has parsers. And then a third with a target, and the targets are Wasm and C. Then there's an output window on the right, at least on desktop, I'm not sure about mobile, but it's on the right. And what's on the right is the Wasm, right?
  • Oliver Medhurst: Yeah. It's like the text format decompiled.
  • Brian Kardell: Right. Yeah. So yeah, what is the deal with the parser?
  • Oliver Medhurst: Yeah, so I guess I should mention that the project is primarily from scratch, not really for any particular reason other than, I guess, to have full control of everything. But the one part which isn't made by me is the parser, the JavaScript parser, and I intentionally made it so you can choose from different ones available, because they all output a standard abstract syntax tree format. Each one has its own pros and cons. Babel's parser supports TypeScript, for example, and some of the more niche upcoming TC39 proposals, but it's also quite slow, so there are other ones you can use if you're not using that stuff, or if parsing is being slow for some reason.
  • Eric Meyer: Cool. I have really just one more question from my side. What's your dream of where you would see Porffor used, or in what ways it would be used when you're done with-
  • Oliver Medhurst: Yeah, I guess-
  • Eric Meyer: ... implementing it?
  • Oliver Medhurst: ... being super dreamful: existing CLI apps, for example something like Webpack, being able to theoretically just dump that into Porffor and have it make a relatively small, fast binary, as if it were a native CLI or something, would be very cool, but also very far out.
  • Eric Meyer: No, that's fair. But if you could wave the magic wand and it was just all done, which wouldn't be fun, obviously, you wouldn't learn anything from that. But if you came back from 10 years in the future after it's all done, or 20 years, or however long it's going to take you, what would you want people to say? You mentioned things like, well, Webpack is now tiny and really fast because it gets run through Porffor ahead of time. Anything else?
  • Oliver Medhurst: I don't know if you've seen that Amazon has been making a JS runtime called LLRT, which is focused on the AWS Lambda stuff? It doesn't have a just-in-time compiler, it only interprets, to try and make it start really fast. Because if it's just a script which fetches a text file and then returns it, you don't really need a just-in-time compiler, since there's barely any actual JavaScript. I think stuff like that is very interesting, since you have one known JavaScript file which will be deployed to the edge or to the 'cloud,' in air quotes. So I think if you deployed by compiling it ahead of time, it should hopefully be very fast, but with basically no startup cost; you're not spinning up a JIT compiler or an entire runtime every time.
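
The kind of function Oliver has in mind is something like the tiny handler below, where startup time dominates and there is barely any JavaScript to optimize. The handler shape follows the common Lambda convention, and the URL is just a placeholder.

```js
// A minimal fetch-and-return handler: almost no JavaScript to JIT-compile,
// so startup cost dominates. The URL is a placeholder.
export const handler = async () => {
  const response = await fetch("https://example.com/config.txt");
  const body = await response.text();

  return { statusCode: 200, body };
};
```
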
  • Brian Kardell: Yeah, I can imagine that being a really interesting niche, actually. You could perhaps get sponsorship from Cloudflare or one of those to put it in their workers at the edge.
  • Oliver Medhurst: Yeah, it's a very interesting opportunity, I think.
  • Brian Kardell: Yeah, definitely. And I can see it: even the ones that are out there have limits already, and they have some custom APIs. Which is actually another thing I think is kind of interesting about yours, and I want to ask you: do you belong to WinterCG, or will you join WinterCG with this? Because I see that you also support things that aren't just JavaScript, right? You have TypeScript support.
  • Oliver Medhurst: Yeah, I've actually done a bit of WinterCG participation in my free time already, and I'm definitely interested in doing more with this.
  • Brian Kardell: I see you also have sort of some proposals that you liked that are very early stage, but you still implemented them because you think they're cool and fun, right?
  • Oliver Medhurst: Yeah. I started what I call a CLI API proposal there, which was adopted a while ago, because basically every existing server-side runtime just implements Node's APIs, and that's kind of a proposal to try and make an actual standard. Node's API is nice, but it's not a standard; there's zero specification.
  • Brian Kardell: Yeah. So I have just one last question. In your GitHub you have some to-dos and lists of things you support, and I think you support some things that are only proposals, like Math.clamp and a bunch of math extensions.
  • Oliver Medhurst: Yeah.
  • Brian Kardell: Are they behind a flag or something, or are they just like-
  • Oliver Medhurst: They aren't, but I could do that. My thinking right now is that I don't have much behind flags because it's an unstable project; I guess there isn't much use for that, but I probably will if it becomes stable in the future. Yeah, I think it is nice. Owing to the fact that it's written in JavaScript, it's pretty nice to just tinker around; some proposals might just be five lines of JS.
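
As an example of the sort of small proposal being discussed: the TC39 math extensions include a clamp operation, and an implementation really can be just a few lines. This is a hedged, polyfill-style sketch, not Porffor's code, and the argument order is an assumption; check the current proposal text.

```js
// Illustrative polyfill-style sketch of a clamp(value, min, max) helper.
// Not Porffor's implementation; argument order is assumed for illustration.
if (typeof Math.clamp !== "function") {
  Math.clamp = function clamp(value, min, max) {
    if (min > max) throw new RangeError("min must be less than or equal to max");
    return Math.min(Math.max(value, min), max);
  };
}

console.log(Math.clamp(15, 0, 10)); // 10
console.log(Math.clamp(-3, 0, 10)); // 0
```
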
  • Brian Kardell: Right. Okay. Eric, do you have anything else?
  • Eric Meyer: No, just thank you so much for coming to talk to us, Oliver, this was really enlightening.
  • Brian Kardell: Yeah. Thanks so much for joining us.
  • Oliver Medhurst: Thanks.