@mattdesl.bsky.social
artist, coder
Really enjoyed this conversation with Tyler Hobbs during his recent book signing event at Unit London, where we talk about generative art & our creative process.
Full 1hr+ recording (!!) in the link below:
→ unitlondon.com/2025-07-09/i...
JT Nimoy's monoline font from Textension (1999)
Some of Douglas Hofstadter's "Letter Spirit" fonts (1987-1996)
For some super arcane deep cuts, I've ported JT Nimoy's "Textension" vector font (1999) and Douglas Hofstadter's "Letter Spirit" gridfonts (c.1987-1996) to p5.js, now available in my archive of p5-single-line-font-resources: github.com/golanlevin/p...
04.06.2025 12:08
About Colors (色について)
kyndinfo.notion.site/About-Colors...
#p5js #math #physics #code #art
What I'm building is a bit more homebrew, tailored specifically to one project. The pigment curves are optimized to match physically measured spectral data, and the mixing model is quite different from spectraljs' approach.
05.05.2025 10:58
07.04.2025 18:34
Using an evolutionary algorithm to paint the Mona Lisa in 200 rectangles
Source code in JS:
github.com/mattdesl/snes
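A rough sketch of the general approach (not the actual code from the repo above): a (1+1)-style evolution loop that mutates one rectangle at a time and keeps the change only if it lowers the pixel error against the target image. The raster size, rectangle parameters, and the gradient used as a stand-in target are all illustrative.

```js
// Minimal sketch: evolve a set of axis-aligned rectangles toward a target
// grayscale raster by accepting only mutations that reduce per-pixel error.

const W = 64, H = 64, RECT_COUNT = 200, ITERATIONS = 20000;

// Stand-in target: a radial gradient; in practice this would be image pixels.
const target = new Float32Array(W * H);
for (let y = 0; y < H; y++)
  for (let x = 0; x < W; x++)
    target[y * W + x] = Math.hypot(x - W / 2, y - H / 2) / (W / 2);

const randRect = () => ({
  x: Math.random() * W, y: Math.random() * H,
  w: Math.random() * W * 0.5, h: Math.random() * H * 0.5,
  shade: Math.random()
});

// Render rectangles back-to-front into a grayscale raster.
function render(rects) {
  const out = new Float32Array(W * H).fill(1);
  for (const r of rects) {
    const x0 = Math.max(0, r.x | 0), y0 = Math.max(0, r.y | 0);
    const x1 = Math.min(W, (r.x + r.w) | 0), y1 = Math.min(H, (r.y + r.h) | 0);
    for (let y = y0; y < y1; y++)
      for (let x = x0; x < x1; x++) out[y * W + x] = r.shade;
  }
  return out;
}

const error = (raster) => {
  let sum = 0;
  for (let i = 0; i < raster.length; i++) sum += (raster[i] - target[i]) ** 2;
  return sum;
};

// (1+1) evolution: mutate one rectangle, keep the change only if it helps.
let rects = Array.from({ length: RECT_COUNT }, randRect);
let best = error(render(rects));
for (let i = 0; i < ITERATIONS; i++) {
  const idx = (Math.random() * RECT_COUNT) | 0;
  const old = rects[idx];
  rects[idx] = randRect();
  const e = error(render(rects));
  if (e < best) best = e; else rects[idx] = old;
}
console.log('final squared error:', best.toFixed(3));
```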
Here is another selection using permutations of three unique pigments instead of two. This is equivalent to uniform sampling within a 3D simplex, i.e. a tetrahedron.
It leads to less purity of any single ink, and is a little lower in overall saturation.
(GIF doesn't work on BlueSky I guess?)
Generating vibrant palettes with Kubelka-Munk pigment mixing, using 5 primaries (blue, yellow, red, white, black). The routine selects 2 pigments and a random concentration of the two, though it can extend to higher dimensions by sampling the N-dimensional pigment simplex.
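A minimal sketch of that sampling step, assuming the mixing weights are meant to be uniform over the simplex (non-negative, summing to 1); the pigment list and the exponential-normalization trick are illustrative rather than the project's actual routine.

```js
// Minimal sketch: choose k of the primaries, then draw mixing weights
// uniformly from the simplex by normalizing i.i.d. exponential samples.

const PRIMARIES = ['blue', 'yellow', 'red', 'white', 'black'];

// Uniform point on the (k-1)-simplex: Dirichlet(1,...,1) via exponentials.
function sampleSimplex(k) {
  const e = Array.from({ length: k }, () => -Math.log(1 - Math.random()));
  const sum = e.reduce((a, b) => a + b, 0);
  return e.map((v) => v / sum);
}

// Pick k distinct pigments at random (partial Fisher-Yates shuffle).
function choosePigments(k) {
  const pool = [...PRIMARIES];
  for (let i = 0; i < k; i++) {
    const j = i + ((Math.random() * (pool.length - i)) | 0);
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, k);
}

// Example: 2-pigment mixes as in the post; bump k up for richer blends.
const k = 2;
const pigments = choosePigments(k);
const weights = sampleSimplex(k);
console.log(pigments, weights.map((w) => w.toFixed(3)));
```

Normalizing i.i.d. exponentials is equivalent to drawing from Dirichlet(1, …, 1), which is the uniform distribution on the simplex; for k = 2 it reduces to a single uniform t and 1 − t.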
19.02.2025 10:57
Adding another pigment (dimension) is quite easy with a neural network. Now it predicts concentrations for CMY + white + black, allowing for smooth grayscale ramps and a slightly wider pigment gamut.
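A sketch of what such a tiny network could look like in JS, assuming the input is an RGB color and the output is five concentrations (CMY + white + black). The two hidden layers of 16 follow the architecture mentioned further down in this feed; the softmax output and the random placeholder weights are assumptions, not the trained model.

```js
// Minimal sketch of a tiny MLP mapping RGB -> 5 pigment concentrations.
// The softmax keeps concentrations positive and summing to 1 (an assumption),
// and the random weights below stand in for trained values.

const randMatrix = (rows, cols) =>
  Array.from({ length: rows }, () =>
    Array.from({ length: cols }, () => (Math.random() * 2 - 1) * 0.5));

const layers = [
  { W: randMatrix(16, 3),  b: new Array(16).fill(0) },
  { W: randMatrix(16, 16), b: new Array(16).fill(0) },
  { W: randMatrix(5, 16),  b: new Array(5).fill(0) },
];

const relu = (x) => Math.max(0, x);

function forward(rgb) {
  let h = rgb;
  layers.forEach(({ W, b }, i) => {
    const isLast = i === layers.length - 1;
    h = W.map((row, ri) =>
      row.reduce((acc, w, ci) => acc + w * h[ci], b[ri]));
    if (!isLast) h = h.map(relu);
  });
  // Softmax so the predicted concentrations are positive and sum to 1.
  const m = Math.max(...h);
  const exps = h.map((v) => Math.exp(v - m));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((v) => v / sum);
}

console.log(forward([0.2, 0.5, 0.8])); // [cyan, magenta, yellow, white, black]
```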
04.02.2025 20:17
A couple of things aren't clear to me yet: how Mixbox achieves such vibrant saturation during interpolations, and how they handle black & achromatic ramps specifically. I may be struggling to achieve the same because of my "imaginary" spectral coefficients.
03.02.2025 22:13
I like the idea of a neural net as it's continuous, adapts to arbitrary input, and is fast to load and light on memory. However, as you can see, it has a hard time capturing the green of the LUT at the beginning (although it doesn't exhibit the artifacts with the saturated purple input).
03.02.2025 22:13
Research/experiments building an OSS implementation of practical, real-time Kubelka-Munk pigment mixing. Not yet as good as Mixbox, but getting closer. Comparing a LUT (32x32x32, stored in a PNG) vs a small neural net (2 hidden layers, 16 neurons).
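For the LUT side, a minimal sketch of sampling a 32x32x32 table with trilinear interpolation. The identity-filled Float32Array below stands in for whatever the real table stores (which would be decoded from the PNG), and the memory layout is an assumption.

```js
// Minimal sketch: trilinear interpolation into a 32x32x32 RGB lookup table.

const N = 32;
const lut = new Float32Array(N * N * N * 3);
for (let b = 0; b < N; b++)
  for (let g = 0; g < N; g++)
    for (let r = 0; r < N; r++) {
      const i = ((b * N + g) * N + r) * 3;
      lut[i] = r / (N - 1); lut[i + 1] = g / (N - 1); lut[i + 2] = b / (N - 1);
    }

const texel = (r, g, b) => {
  const i = ((b * N + g) * N + r) * 3;
  return [lut[i], lut[i + 1], lut[i + 2]];
};

// rgb components in [0, 1] -> interpolated LUT value.
function sampleLUT([r, g, b]) {
  const f = (v) => Math.min(v * (N - 1), N - 1 - 1e-6);
  const [x, y, z] = [f(r), f(g), f(b)];
  const [x0, y0, z0] = [x | 0, y | 0, z | 0];
  const [tx, ty, tz] = [x - x0, y - y0, z - z0];
  const lerp = (a, b, t) => a.map((v, i) => v + (b[i] - v) * t);
  const c00 = lerp(texel(x0, y0, z0), texel(x0 + 1, y0, z0), tx);
  const c10 = lerp(texel(x0, y0 + 1, z0), texel(x0 + 1, y0 + 1, z0), tx);
  const c01 = lerp(texel(x0, y0, z0 + 1), texel(x0 + 1, y0, z0 + 1), tx);
  const c11 = lerp(texel(x0, y0 + 1, z0 + 1), texel(x0 + 1, y0 + 1, z0 + 1), tx);
  return lerp(lerp(c00, c10, ty), lerp(c01, c11, ty), tz);
}

console.log(sampleLUT([0.3, 0.6, 0.9])); // ~[0.3, 0.6, 0.9] for the identity LUT
```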
03.02.2025 22:13 β π 48 π 3 π¬ 2 π 0I think certain dev tools could get away with itβlike Vite, or my own canvas-sketch. I think the UX of an electron app may be better for average user but also hurts experienced devs; not just install time but also lack of browser diversity which is crucial if building for the web.
25.01.2025 08:48
Looks amazing, Scott.
25.01.2025 08:40
It's closer to Mixbox's implementation: four primary pigments (each with a K and S curve), then numerical optimization to find the best pigment concentrations for a given OKLab input color. It's only running in Python at the moment, but a LUT is possible.
github.com/scrtwpns/mix...
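A rough JS transcription of that optimization idea (the version described above is Python): hill-climb over pigment concentrations to minimize the distance between the mixed color and a target. The mixPigments stand-in below is a crude multiplicative blend rather than real Kubelka-Munk mixing with K/S curves, and the loss uses plain RGB distance instead of OKLab.

```js
// Rough sketch of "find concentrations by optimization": hill-climb over 4
// pigment concentrations to match a target color. mixPigments() is a crude
// stand-in for real Kubelka-Munk mixing; the loss is RGB, not OKLab.

const PIGMENTS = [
  [0.9, 0.2, 0.3],    // red-ish
  [0.95, 0.85, 0.1],  // yellow-ish
  [0.1, 0.3, 0.8],    // blue-ish
  [0.95, 0.95, 0.95], // white-ish
];

// Clamp to non-negative and renormalize so concentrations sum to 1.
const normalize = (c) => {
  const clamped = c.map((v) => Math.max(0, v));
  const s = clamped.reduce((a, b) => a + b, 0) || 1;
  return clamped.map((v) => v / s);
};

// Stand-in mixing model: weighted geometric mean of pigment reflectances.
const mixPigments = (conc) =>
  [0, 1, 2].map((ch) =>
    Math.exp(conc.reduce((acc, c, i) => acc + c * Math.log(PIGMENTS[i][ch]), 0)));

const loss = (conc, target) => {
  const rgb = mixPigments(conc);
  return rgb.reduce((acc, v, i) => acc + (v - target[i]) ** 2, 0);
};

// Simple hill climbing over the concentration simplex.
function solveConcentrations(target, iterations = 5000) {
  let best = normalize(PIGMENTS.map(() => Math.random()));
  let bestLoss = loss(best, target);
  for (let i = 0; i < iterations; i++) {
    const cand = normalize(best.map((v) => v + (Math.random() - 0.5) * 0.1));
    const l = loss(cand, target);
    if (l < bestLoss) { best = cand; bestLoss = l; }
  }
  return best;
}

console.log(solveConcentrations([0.6, 0.4, 0.3]).map((v) => v.toFixed(3)));
```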
a late #genuary: "gradients only"
working on an open source pigment mixing library, based on Kubelka-Munk theory.
left: before KM mixing
right: after KM mixing
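For reference, a minimal sketch of the two-constant Kubelka-Munk core that a library like this builds on: per-wavelength absorption (K) and scattering (S) curves are mixed by concentration, and the mixture's K/S ratio is converted to the reflectance of an opaque layer. The 8-band curves below are made-up placeholders, not measured pigment data, and practical implementations add corrections (e.g. Saunderson) that are omitted here.

```js
// Minimal two-constant Kubelka-Munk sketch with made-up 8-band K/S curves.

const BANDS = 8;

const PIGMENTS = {
  yellow: { K: [0.8, 0.7, 0.3, 0.05, 0.02, 0.02, 0.02, 0.02], S: new Array(BANDS).fill(1) },
  blue:   { K: [0.02, 0.02, 0.05, 0.2, 0.6, 0.8, 0.9, 0.9],   S: new Array(BANDS).fill(1) },
};

// Two-constant mixing: K and S of the mixture are concentration-weighted sums.
function mixKS(parts) {
  const K = new Array(BANDS).fill(0);
  const S = new Array(BANDS).fill(0);
  for (const { pigment, c } of parts)
    for (let i = 0; i < BANDS; i++) {
      K[i] += c * PIGMENTS[pigment].K[i];
      S[i] += c * PIGMENTS[pigment].S[i];
    }
  return { K, S };
}

// KM reflectance of an opaque layer: R = 1 + K/S - sqrt((K/S)^2 + 2*K/S).
function reflectance({ K, S }) {
  return K.map((k, i) => {
    const q = k / S[i];
    return 1 + q - Math.sqrt(q * q + 2 * q);
  });
}

// A 50/50 blue + yellow mix peaks in the mid (green) bands, the classic
// subtractive behaviour that a naive RGB average misses.
const mixed = reflectance(mixKS([
  { pigment: 'yellow', c: 0.5 },
  { pigment: 'blue', c: 0.5 },
]));
console.log(mixed.map((r) => r.toFixed(2)));
```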
It just sets the background fill to none at the moment. It would be nice to have more options, like per-layer exports, though.
20.12.2024 13:11
Final few hours to mint a Bitframes before the crowdfund closes and the edition size is locked. 100% of net proceeds are being directed to a documentary on the history of generative art.
Closes today at 5PM GMT (UK time).
→ bitframes.io
Added some plotter and high-res print tools to the open source Bitframes GitHub repo:
Tools → print-bitframes.surge.sh
Code → github.com/mattdesl/bit...
Last week to mint and contribute to the Bitframes crowdfund! 100% of net proceeds are going to the production of a documentary film on the history of generative & computer art.
→ bitframes.io
COMPUTER ART IN THE MAINFRAME ERA
A ~40 min interview with professor and computer art history scholar Grant D. Taylor that I conducted during R&D for Bitframes.
Listen → bitframes.io/episodes/1
Bitframes - an NFT project / crowdfunding initiative by @mattdesl.bsky.social to produce a documentary on #generativeart
bitframes.io
Bitframes #1603
03.12.2024 08:08
Abstract image of a square composed of individual black hatched squares, with a series of clear squares running diagonally across the matrix that reveal colored squares underneath.
My contribution to @mattdesl.bsky.social's awesome Bitframes project. Token ID 1226.
28.11.2024 20:49
Happy to be part of bitframes.io by @mattdesl.bsky.social and support generativefilm.io :)
bitframes #1188 and #1189
Thank you Marcin for supporting the project & film!
28.11.2024 14:06
Woah! This is a fantastic output. Nice one.
28.11.2024 11:10
Bitframes #1153
(bitframes.io by @mattdesl.bsky.social)