
LaurieWired

@lauriewired.bsky.social

researcher @google; serial complexity unpacker ex @ msft & aerospace

3,629 Followers  |  1 Following  |  831 Posts  |  Joined: 20.11.2024

Latest posts by lauriewired.bsky.social on Bluesky

Programming...Like a Fighter Pilot

Full Video:
www.youtube.com/watch?v=Gv4s...

03.12.2025 19:33 β€” πŸ‘ 32    πŸ” 2    πŸ’¬ 1    πŸ“Œ 1

This...is Programming Like a Fighter Pilot.

A single unhandled exception destroyed a $500 million rocket in seconds.

The F-35 wasn't going to make the same mistake.

By carefully slicing C++, engineers created one of the strictest coding standards ever written.
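The rocket in question is Ariane 5 flight 501 (1996): a 64-bit floating-point value was converted into a 16-bit signed integer field, overflowed, and the resulting operand error went unhandled. A minimal Python sketch of the range-checked conversion that strict standards like the JSF one demand (the function name and structure are mine, not from any standard):

```python
INT16_MIN, INT16_MAX = -32768, 32767

def to_int16_checked(x: float) -> int:
    """Range-checked float -> 16-bit signed int conversion.

    The *unchecked* version of exactly this conversion (a 64-bit
    value squeezed into a 16-bit field) raised the unhandled
    operand error that destroyed Ariane 5 flight 501.
    """
    v = int(x)  # truncate toward zero
    if not INT16_MIN <= v <= INT16_MAX:
        raise OverflowError(f"{x} does not fit in a signed 16-bit field")
    return v
```

The point of the check is to turn a silent corruption (or an uncaught hardware trap) into an explicit, handleable failure at the conversion site.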

03.12.2025 19:33 β€” πŸ‘ 57    πŸ” 6    πŸ’¬ 3    πŸ“Œ 0

Unfortunately, there’s not a ton of info out there on IPv5 outside of the official RFCs, but it’s an interesting look into an alternative internet:


www.rfc-editor.org/ien/ien119.txt

19.11.2025 21:33 β€” πŸ‘ 27    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

The problem with IPv5 is that *every* router would have to hold the hard state of *every* stream.



Rather than make routers more and more powerful, it was actually cheaper to just…make the internet 1000x faster.

19.11.2025 21:33 β€” πŸ‘ 23    πŸ” 1    πŸ’¬ 1    πŸ“Œ 0

ST and ST2 (IPv5) let you punch what was called a β€œHard State” into routers.


This reserved a dedicated virtual circuit guaranteeing a specific amount of bandwidth.


The researchers even envisioned video call use! Way ahead of its time, but also a memory hog.
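A back-of-envelope sketch (the numbers are illustrative, not from the RFCs) of why this hurts: every router on a stream's path holds a reservation record per active stream, so core-router memory scales with the global stream count instead of staying constant as in stateless IP forwarding.

```python
# Hypothetical sizing: why per-stream "hard state" doesn't scale.
# A stateless IP router keeps no per-flow records; an ST-II router
# keeps one reservation record per stream passing through it.

RECORD_BYTES = 64          # assumed size of one reservation record
streams = 10_000_000       # streams transiting a busy core router

state_bytes = streams * RECORD_BYTES
print(f"{state_bytes / 2**20:.0f} MiB of state")  # ~610 MiB, per router
```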

19.11.2025 21:33 β€” πŸ‘ 17    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

The early internet sucked for voice streaming.



Packet switching was designed for resilience during nuclear war, not for smooth, continuous transmission.



The question was, how do you make a fundamentally distributed network act more like a stable phone line?

19.11.2025 21:33 β€” πŸ‘ 17    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

Everyone’s heard of IPv4 and IPv6.



I bet you don’t know about IPv5.



Designed in the late 70s, it was an experimental protocol by MIT’s Lincoln Labs for real-time streaming.



Basically, Zoom before Zoom existed...but for defense:

19.11.2025 21:33 β€” πŸ‘ 52    πŸ” 1    πŸ’¬ 1    πŸ“Œ 0
Generic and overused logos - Avoid it!

Unfortunately, the website is now dead, but there’s a great overview of ~2010-era generic logo tropes preserved on the Wayback Machine.

It’s a fun read:
web.archive.org/web/20140625...

17.11.2025 22:32 β€” πŸ‘ 19    πŸ” 2    πŸ’¬ 2    πŸ“Œ 0

What’s interesting is that the method is effective for a below-average company.



By not standing out, you’re able to sort of β€œleech” off of the shared credibility / trustworthiness of familiar symbols.



Of course, it’s a terrible idea if you’re trying to stand out from the crowd.

17.11.2025 22:32 β€” πŸ‘ 21    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

There are entire classes of logos that look professional, familiar, and entirely unoriginal.


Finance Firm? Have a growth line.


Tech? Something spherical.

Law Office? Your acronym better be in boxes.

17.11.2025 22:32 β€” πŸ‘ 38    πŸ” 2    πŸ’¬ 2    πŸ“Œ 0

27 years later, Holt finally got to release his article!



Unfortunately, the majority of the world had already accepted the Intel 4004 as the β€œfirst microprocessor”, hence the confusion.



You can read Holt’s original (now declassified) 1971 paper here:

firstmicroprocessor.com/wp-content/u...

13.11.2025 20:11 β€” πŸ‘ 25    πŸ” 1    πŸ’¬ 1    πŸ“Œ 0

Ray Holt, the lead designer of the F14 computer, wanted to publish an article about the chip in Computer Design magazine.



1971 - Denied Publication, US Navy Classified.

1985 - Tried again, denied again. Still Classified.

1997 - Examined, cleared for public release 1998.

13.11.2025 20:11 β€” πŸ‘ 13    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

The F14 used variable-sweep wings.

With the performance envelope the Navy wanted, humans wouldn’t be able to adjust the wings fast enough…much less do the math in their heads!


A custom air data computer was created, doing polynomial-style calculations on sensor input.
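"Polynomial-style calculations" here means evaluating curve-fit polynomials against raw sensor readings. The standard scheme for that is Horner's method: one multiply and one add per coefficient. The polynomial below is invented for illustration, not one of the CADC's actual calibration curves.

```python
def horner(coeffs, x):
    """Evaluate a polynomial given high-order-first coefficients,
    using one multiply-accumulate per coefficient (Horner's method)."""
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c
    return acc

# A made-up calibration curve, 2x^2 + 3x + 1, at x = 2:
result = horner([2.0, 3.0, 1.0], 2.0)  # -> 15.0
```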

13.11.2025 20:11 β€” πŸ‘ 14    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

The world’s first microprocessor is *NOT* from Intel.



But you won’t find it in many textbooks.



It was a secret, declassified only in 1998, and for good reason.



The Garrett AiResearch F14 Air Data Computer was 8x faster than the Intel 4004, and it arrived a year earlier!

13.11.2025 20:11 β€” πŸ‘ 62    πŸ” 14    πŸ’¬ 2    πŸ“Œ 1

well, at least you know a human (me) wrote it lol

12.11.2025 17:00 β€” πŸ‘ 1    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

You might think it was used in mathematics, but that’s technically a different symbol.



(you’re supposed to use β€œset minus” for math, but no one does)



Next time you hit backslash, just think, you’re using the youngest punctuation character!
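The distinction survives in math typesetting: TeX/LaTeX actually defines two separate commands for the two symbols.

```latex
% Set difference, with the proper binary-operator spacing:
$A \setminus B$

% The bare backslash delimiter; typesets tighter, and is the
% "wrong" symbol for sets (though plenty of people use it anyway):
$A \backslash B$
```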

11.11.2025 21:29 β€” πŸ‘ 21    πŸ” 0    πŸ’¬ 2    πŸ“Œ 0

Before the IBM standardization, it gets murky.



There’s a German teletype machine with the backslash symbol from 1937…but no one really knows what it was used for.

11.11.2025 21:29 β€” πŸ‘ 19    πŸ” 1    πŸ’¬ 4    πŸ“Œ 1

The backslash was popularized by the IBM standards committee in the 1960s and got rolled into ASCII.

Programmers took a liking to the symbol, quickly adopting it as the standard escape character.


The *forward* slash, by comparison, has existed since at least the 18th century.
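The "standard escape character" role is easy to see in any modern language; a quick Python illustration:

```python
# Backslash as the escape character: it gives the *next* character
# a special meaning inside a string literal.
s = "tab:\there"        # "\t" becomes a single tab character
newline = "\n"          # one character (a line feed), not two
raw = r"C:\temp\new"    # raw string: backslashes are kept literally
```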

11.11.2025 21:29 β€” πŸ‘ 19    πŸ” 3    πŸ’¬ 1    πŸ“Œ 0

You might be thinking to yourself, what about Brackets? Curly Braces? 


Nope, not even close. Brackets have been used since the 1500s.

Tilde? Still wrong, used by medieval scribes from 1086 AD.



Backslash is a *bizarre* symbol with unsolved origins.

11.11.2025 21:29 β€” πŸ‘ 8    πŸ” 0    πŸ’¬ 3    πŸ“Œ 0

Take a look at your keyboard.



See the backslash key?



It’s the *only* punctuation character (not a glyph!) created in the computer age.



Just about every typographic symbol on your keyboard is centuries old.

11.11.2025 21:29 β€” πŸ‘ 61    πŸ” 5    πŸ’¬ 2    πŸ“Œ 0
Linux in a Pixel Shader - A RISC-V Emulator for VRChat

The post and story itself is a goldmine, highly encourage you to go read _pi_'s blog on the topic. 



blog.pimaker.at/texts/rvc1/

10.11.2025 21:44 β€” πŸ‘ 25    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

Tick between frames fast enough, and you get a (somewhat useable) CPU.



Roughly 250 kilohertz on a 2080 Ti.



Not much, but enough to run Linux!

What I love most is that you can “see” the system state at any point just by viewing the texture itself.

10.11.2025 21:44 β€” πŸ‘ 19    πŸ” 0    πŸ’¬ 1    πŸ“Œ 1

By abusing the heck out of shader logic, you can do some funny things.



To run Linux in a shader, you first need a (simulated) CPU.



Of course, someone took it to the logical extreme and emulated RISC-V logic in HLSL.


~64 MiB of “RAM” stored as a texture.

10.11.2025 21:44 β€” πŸ‘ 20    πŸ” 0    πŸ’¬ 2    πŸ“Œ 0

VRChat allows users to embed custom fragment shaders within worlds.



Of course, you don’t just get to run arbitrary C code wherever you want; that would be an insane security risk.



But, you *do* have textures. Textures that can hold state.

10.11.2025 21:44 β€” πŸ‘ 13    πŸ” 0    πŸ’¬ 2    πŸ“Œ 0

Shader systems are ridiculously powerful if you’re clever enough. 



Most people use them to create visual effects. You know what’s cooler?

Running Linux.

Inside an emulated RISC-V CPU. Inside a pixel shader. Inside of VRChat...

10.11.2025 21:44 β€” πŸ‘ 121    πŸ” 25    πŸ’¬ 4    πŸ“Œ 2

It’s a really fun story, but unfortunately Teramac was a bit ahead of its time.


Here’s one of the better articles about it:
fab.cba.mit.edu/classes/862....

05.11.2025 18:28 β€” πŸ‘ 15    πŸ” 0    πŸ’¬ 0    πŸ“Œ 0

Test the path, localize the bad resource, blacklist it, compile around it.



Teramac didn’t just sit idle either!

They mapped MRI data of brain arteries, played with volume rendering (Cube-4), and ran a number of *actually useful* workloads once the utility was proven.

05.11.2025 18:28 β€” πŸ‘ 15    πŸ” 0    πŸ’¬ 2    πŸ“Œ 0

75% of the FPGAs in Teramac would normally be considered too faulty to use. Scrapped.


By intentionally overbuilding the interconnect, well beyond what was sane, defect tolerance was (theoretically) high.

The first workload thus needed to create a "defect database".
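The defect-database workflow sketches out simply: test every resource, blacklist the failures, then place logic only onto known-good cells. This is a hypothetical miniature of the idea, not Teramac's actual tooling; all names and numbers are invented.

```python
def build_defect_db(cells, works):
    """Probe every resource; record the ones that fail the self-test."""
    return {c for c in cells if not works(c)}

def place(logical_units, cells, defects):
    """Map logical units onto physical cells, skipping known-bad ones."""
    good = [c for c in cells if c not in defects]
    if len(good) < len(logical_units):
        raise RuntimeError("not enough working resources")
    return dict(zip(logical_units, good))

cells = list(range(8))
# Pretend the self-test finds cells 2 and 5 faulty:
defects = build_defect_db(cells, lambda c: c not in {2, 5})
mapping = place(["a", "b", "c"], cells, defects)
```

The compiler never has to repair anything physically; it just routes around whatever the database says is broken.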

05.11.2025 18:28 β€” πŸ‘ 13    πŸ” 0    πŸ’¬ 2    πŸ“Œ 0

The team expected that bleeding edge silicon would likely have much higher defect rates.


There was huge pressure to reduce yield risk; improving software reconfiguration could change the industry.



The real magic was in the interconnect.

05.11.2025 18:28 β€” πŸ‘ 12    πŸ” 0    πŸ’¬ 1    πŸ“Œ 0

HP Labs once built a broken supercomputer…on purpose.


Teramac had over 220,000 Hardware Defects.

The question was: can you make a reliable computer out of *known* bad parts?


It was a phenomenal software problem to route around the faults:

05.11.2025 18:28 β€” πŸ‘ 54    πŸ” 2    πŸ’¬ 1    πŸ“Œ 0
