> .... why do I see something completely different now?
Guessing it's fallout from there being no studies on how to get AI motivated about / interested in / trained on cool side projects? (vs. the human factor of using side projects as a way to learn/master/showcase talent, with the side effect of generating 'cool' projects[5]).
Alternatively, AI is intended as an automated assistant, so AI has shifted the base knowledge/skill set one needs to do 'cool stuff'. aka no emacs/lisp/unix for a DNA computer for the hacker/hobbyist yet.[0][1][2][3][4]
Implement program functionality in a GNU Jitter/JitterLisp VM[1]. Have the program use VM references to do functional computations. aka move program features into a VM, such that program functions/features become calls into the VM (vs. traditional OS / library calls).
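A minimal sketch of the idea in Python, not Jitter's actual API (Jitter generates C VMs; the opcode names and dispatch scheme here are made up for illustration): program 'features' become VM instructions, and the host program emits instruction sequences instead of making direct library calls.

    def vm_run(program, stack=None):
        # Execute (opcode, arg) pairs on a small stack machine.
        stack = stack if stack is not None else []
        ops = {
            "PUSH":   lambda s, a: s.append(a),
            "ADD":    lambda s, a: s.append(s.pop() + s.pop()),
            "UPPER":  lambda s, a: s.append(s.pop().upper()),  # a 'program feature'
            "REPEAT": lambda s, a: s.append(s.pop() * a),      # another 'feature'
        }
        for opcode, arg in program:
            ops[opcode](stack, arg)
        return stack

    # The program emits VM code rather than calling functions directly:
    print(vm_run([("PUSH", "ha"), ("REPEAT", 3), ("UPPER", None)]))  # ['HAHAHA']
    print(vm_run([("PUSH", 2), ("PUSH", 40), ("ADD", None)]))        # [42]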
Finding data (note: check where the data is from & the terms of use!): general search engine search on "publicly available data for " <insert topic/area of interest, e.g. computer science>.
> ... building my own data processing pipeline from scratch and honestly this is a bit of a pain.
Yup, the only things that have really changed over the years are more ways to access data, the volume of data, and faster ways to process it.
Bit/byte context is historically linked to the minimum unit for a given medium.[0] ISPs generally deal with sending/delivering 'raw' data as bits with no symbolism, vs. networking gear, which deals with subgroups of bits in 'ascii/utf-8 byte sizes' within a network data protocol. Historically: electrical waves vs. what symbols the electrical waves represent.
From the TrueNAS/hard-drive perspective, bits are the physical medium, bytes are 'data as used by the end user'.
> From the TrueNAS/hard-drive perspective, bits are the physical medium, bytes are 'data as used by the end user'.
Appreciate the response. Not sure I follow this last bit. My Q is about the inconsistency of how network data is measured.
TrueNAS measures its network throughput in bytes. Users have asked for a toggle but the devs are resistant. I was hoping to learn why they really want bytes.
Bytes are the minimum standard unit of measurement for the buffer sizes handled by the electronics.
Simplified, non-technical: adding 'bits' would just indicate how many full bytes were transferred plus the bits in the last byte. The last byte in a transfer message would still be processed as a full byte by the electronics, even if it carries fewer than 8 bits. So, no total throughput change. aka from an electronic-path throughput standpoint, 33, 34, 35, 36, 37, 38, 39, or 40 bits is the same as 5 bytes.
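A quick worked version of that rounding, sketched in Python (nothing TrueNAS-specific is assumed):

    import math

    def bytes_on_the_wire(bit_count):
        # Bits occupy whole bytes in buffers: round up to the next byte.
        return math.ceil(bit_count / 8)

    # 33..40 bits all cost the same 5 bytes of buffer/throughput:
    for bits in range(33, 41):
        print(bits, "bits ->", bytes_on_the_wire(bits), "bytes")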
The given schema is limited to cases where uuid v4 usage is relevant/appropriate.
uuid version 7 is more appropriate for keys in high-load databases and distributed systems.
There are issues if you need something other than uuid_v4, e.g. v8.
snowflake_id is a bit more compact than a separate uuid_v4 & timestamp.
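A rough sketch of why v7 keys index well: per RFC 9562, the first 48 bits are a Unix millisecond timestamp, so freshly minted v7 IDs sort by creation time (v4 is fully random). Hand-rolled below since uuid.uuid7() isn't in older Python stdlibs; assumes only os/time/uuid:

    import os, time, uuid

    def uuid7():
        # RFC 9562 layout: 48-bit unix-ms timestamp | version 7 (4 bits) |
        # 12 random bits | variant 0b10 (2 bits) | 62 random bits.
        ts_ms = time.time_ns() // 1_000_000
        rand_a = int.from_bytes(os.urandom(2), "big") & 0x0FFF
        rand_b = int.from_bytes(os.urandom(8), "big") & ((1 << 62) - 1)
        value = (ts_ms << 80) | (0x7 << 76) | (rand_a << 64) | (0x2 << 62) | rand_b
        return uuid.UUID(int=value)

    ids = []
    for _ in range(3):
        ids.append(uuid7())
        time.sleep(0.002)  # force distinct millisecond timestamps
    print(ids == sorted(ids))  # True: keys sort in creation order, index-friendly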
JSON 'blob' storage is not efficient for / optimized for search/replace operations. The JSON blob will need to be normalized every time the data is cached. File-system storage with a uuid_v7 index has less overhead.
Access/search for data within a JSON blob is non-sequential/random, kinda defeating the whole purpose of using a database. There's no efficient way to update the JSON if it grows larger than its original size, aka cache-coherency issues.
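A toy contrast of the two layouts (the file names and record shapes are made up for illustration): updating one record inside a single JSON blob re-parses and rewrites everything, while one file per record, keyed by a time-ordered uuid_v7, touches only the record that changed.

    import json, pathlib

    blob_path = pathlib.Path("records.json")           # hypothetical blob store
    blob_path.write_text(json.dumps({"a": 1, "b": 2}))

    def update_in_blob(record_id, new_value):
        records = json.loads(blob_path.read_text())    # parse the whole blob...
        records[record_id] = new_value
        blob_path.write_text(json.dumps(records))      # ...and rewrite the whole blob

    store = pathlib.Path("records")                    # one file per record instead
    store.mkdir(exist_ok=True)

    def update_in_store(record_id, new_value):
        (store / f"{record_id}.json").write_text(json.dumps(new_value))

    update_in_blob("a", 42)     # cost grows with total blob size
    update_in_store("b", 42)    # cost stays proportional to one record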
Personally, I would look more at side projects unrelated to work, to figure out areas of interest / possibilities / professional contacts before diving into moving/leaving.
Combine with Tromp's binary lambda calculus[1][5] to demo a 'free variable' as a marble loophole / 'jiggle' factor / statistical variance (vs. the original Lord Kelvin tide-predicting machine).
Suppose it would be a bit tricky to orient Pac-Man, even for a 1D roon Pac-Man[3]. A roon Pong version of [4], including a 'roon marble matrix video'? Although a robotic pick-and-place / mechanical assembler for roon circuit assembly might make things a bit easier. aka a hardware implementation of lisp.
Would be interesting to see an '80s 8-bit computer (C64/Atari) implemented in roon logic with an LLM[2], to see how the AI gets the proverbial ball rolling to arrive at a solution.
A marble wristwatch or marble slide rule might be more interesting.[6] (with an RTC implemented in roon logic[7])
[0] : PREP : https://klesse.utsa.edu/prep/resources.html