

Why? MATLAB is pretty dense normally, and most MATLAB code is hacky scripts that wouldn’t bother with “boilerplate” anyway.


Oh that reminds me. I wouldn’t recommend PIC in the 21st century, but there’s a really cool project called BIO that is an open source alternative to the Raspberry Pi’s PIO (programmable IO). It’s RV32E with custom x16-x31 registers that control the pins directly. Very neat idea.
It’s by Bunnie Huang, and he talks about it in his talk about Xous.
The hardware is (or will be) here: https://www.crowdsupply.com/baochip
May be a bit hardcore for a beginner though.
Yes, it has definitely changed. Before AI, writing code strongly indicated that the author had thought about the problem and put effort into solving it. Of course they could still have done it wrong, but a) the chances of that are much higher with AI, and b) they’re using up your time without spending any of theirs, which breaks the social contract.


Yeah, I think just counting fully unique lines isn’t really going to capture the repetitiveness of a language. I think you’d get more accurate results just asking people, using pairwise ranking.
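To illustrate the failure mode, here’s a toy sketch (the function name is my own, not from any actual study) of what a “count fully unique lines” metric does: a snippet that repeats the same statement five times gets exactly the same score as a snippet containing it once.

```python
def unique_line_count(src: str) -> int:
    # The metric being criticised: count distinct non-blank lines.
    return len({line.strip() for line in src.splitlines() if line.strip()})

# Five copies of the same statement...
repetitive = "\n".join(["x = foo(x)"] * 5)
# ...score the same as a single copy, so the repetition is invisible:
terse = "x = foo(x)"

assert unique_line_count(repetitive) == unique_line_count(terse) == 1
```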


They wanted me to make some changes, and with the normal workflow that’s just git commit and git push. With git send-email I have no fucking idea, and it got to the point where I no longer cared enough to fight the process.
For bare metal definitely get a microcontroller and do some fun electronics project.
Easiest to get into is Arduino, but don’t stick with that because its only redeeming feature is that it’s easy to get into. The IDE sucks, the build system sucks, the APIs really suck, and the code quality is very low (probably because it’s easy to get into so you get a lot of inexperienced people doing stuff).
After Arduino I would recommend either going to the Nordic nRF5x series - you can do some cool Bluetooth stuff, or even make your own radio protocol since the radio peripheral is fully documented… Or ESP32 with Rust and Embassy, which is probably the most modern and slick way to do microcontrollers.
It does require learning Rust but Rust is really really good so you should do that anyway.
There are some extremely good videos on YouTube about that: https://youtube.com/@therustybits
I would probably still start with Arduino though since you know C. Just don’t stay there for too long.


Yeah it’s mad. Tbh I don’t think GitHub PRs are the best workflow, but I absolutely know that git send-email is the worst. I tried to use it once to contribute to OpenSBI, which inexplicably also insists on it. Suffice it to say my patch was never merged…


… if you have a super janky patch file workflow.
If you are using Git like normal people do, this can’t happen.
This is just straight up “ChatGPT write me an article about merge vs rebase”.
It’s also missing any discussion of squashing, CI, git blame, git bisect etc.
You have misunderstood. The article is ranting against Clean Code (the book), not clean code.


In my experience a lot of these old projects really go out of their way to dissuade contributions anyway: lots of naysaying (“it’s always been like that”), ancient infrastructure (e.g. insisting on git send-email patches), etc.
Usually the only way it gets resolved is when someone writes a more modern competitor and it starts gaining traction. Suddenly all those improvements that people tried to do and were told were impossible and stupid aren’t such a bad idea after all.
I don’t think that’s the case with Unity but it probably is with things like GCC, sudo, sysvinit, X11, etc.


I remember when this was how browser zoom always worked. It was super janky, everyone hated it, and the current “zoom everything” system was seen as a big improvement.
I guess opt-in makes sense. Probably nobody is going to bother though.


They don’t really let you do anything you couldn’t do in Python, they just let you write more elegant code.
Personally I find ML-style languages difficult to read. They deliberately leave out a lot of the punctuation that makes code readable, leading to code that just looks like a stream of words.
Rust is I think the best option here - it steals most of the good ideas from functional programming but has saner syntax.
Also, you seem to be conflating pure languages with functional languages. I made this mistake too, because Haskell is probably the best-known functional language and it’s also pure… but they’re different things. OCaml is functional and not pure: you can use mutable variables to your heart’s content.
TL;DR: learn Rust, not Haskell or OCaml.


Given the quality of your average Python code this sounds like a terrible idea.
Honestly this looks like it sits in the useless middle ground between “proper CI that has all the features you expect” and “just write a Python/Deno script or whatever”. I can’t see what you gain.
Also you say “no painful YAML pipelines” but it uses YAML??


TCL & CMake are fully stringly typed. Both are pretty terrible languages (though TCL can at least claim to be a clever hack that was taken far too seriously).


It is INT_MIN. Seems like a much more sensible value than 0 IMO.


Try interacting with anything that uses u64 and you’ll be a lot less happy!
Anyway JavaScript does have BigInt so technically you are choosing.
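For what it’s worth, the u64 pain is easy to demonstrate in Python, whose float is the same IEEE 754 binary64 as a JS number:

```python
# A u64 near the top of the range doesn't survive a round trip
# through an IEEE 754 double (JS's only plain number type):
u64_max = 2**64 - 1
assert float(u64_max) == 2.0**64       # rounds up to 2^64
assert int(float(u64_max)) != u64_max  # value changed by the round trip
```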
That insanity is how C and Intel handle NaN conversions.
It’s not actually quite as bad as the article says. While it’s UB in C, and the compiler can therefore return garbage, the actual x86 conversion instruction will never return garbage. Unfortunately the value it returns is 0x8000… whereas JS apparently wants 0. And it sets a floating point exception flag, so you still need extra instructions to handle it. Probably not many though.
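A quick Python illustration of the two behaviours (Python itself takes a third option and just refuses the conversion; the 0x80000000 value is what the x86 cvttsd2si instruction produces for a 32-bit destination, reinterpreted here with struct to show it equals INT_MIN):

```python
import math
import struct

# Python refuses the conversion outright:
try:
    int(math.nan)
    raise AssertionError("unreachable")
except ValueError:
    pass  # "cannot convert float NaN to integer"

# x86's cvttsd2si instead returns the "integer indefinite" value
# 0x80000000, which read as a signed 32-bit int is INT_MIN:
indefinite = struct.unpack("<i", struct.pack("<I", 0x80000000))[0]
assert indefinite == -(2**31)
```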
Also in practice on a modern JS engine it won’t actually need to do this operation very often anyway.


Yeah. I think the smallest number of number types you can reasonably have is two: f64 and arbitrary precision integers. One of the few good decisions Python made.
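That two-type setup is observable directly in Python: its float is an IEEE 754 binary64 (an f64), and its int never overflows or rounds.

```python
import sys

# Python's float is an IEEE 754 binary64, i.e. an f64:
assert sys.float_info.mant_dig == 53

# ...and its int is arbitrary precision: no overflow, no rounding.
googol = 10**100
assert googol + 1 - googol == 1
assert (1 << 200).bit_length() == 201
```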
To be fair, they are definitely improving. It feels pretty incremental at this point though. I think we need one or two fundamental breakthroughs before we’re going to see programmers actually out of jobs, e.g. if they find a way to do real online learning, or a way to stop the hallucinations.