800 is a huge achievement. But I have to admit, around 2011 I had completely given up on the Simpsons. Story and content aside, they did something with the audio. The quality of the voice is so clear that it sounds unnatural. You can see the same effect on many shows from around that time: the voice is disconnected from the background music and sfx.
Anyway, they also improved the way the characters are drawn so much that it lost its crude nature.
The cel-layer constraint led to better art. The detail in the background was minimal, and the artists would compensate for it with interesting framing and lighting. Go back and watch the "One Fish, Two Fish" or "Black Widower" episodes from the early seasons - just incredible animation.
You can see the number of lines drawn go up like crazy around season 10 or so, making it feel less realistic. Coincidentally, the writing also started to get worse around this time.
The app didn't work for me. One that was shared right here on HN. I selected a 25-mile radius, same ethnicity. Naturally, I was matched with a person 700 miles away, of a different ethnicity. So we got married... and deleted the app.
We were interviewed as a success story, and our faces are plastered on the Internet now. My friends didn't find the same success; I concluded that they didn't know how to date (wear the right clothes, etiquette, conversation, navigating ghosting, etc.).
"What if the app could teach you how to do just that?" That's what I asked in our interview. That part was never published.
There’s no need to fear the construction of mass surveillance anymore. It’s already here. We built it one convenience at a time [0]. When I see all my friends with Alexa devices at home, ring cameras, and a million food apps on their phones, it feels like it’s already too late.
The blog I started ~13 years ago began as 3 .html files. Everything else followed as needed (styling, RSS, comments, etc.). If you can get past building it, the next question becomes "What should I write about?" [0]
My answer is usually that you can write whatever you want on your websites. It's yours after all. None of the limitations that exist on third-party platforms exist. You can make all the pages read upside down if you want to.
Thanks for reporting this. I'm assuming you are referring to the RSS feed?
The actual feed is https://idiallo.com/feed.rss in the meantime, until I figure out the issue.
No, they're referring to an error that pops up when you visit a page whose url ends in 'women-in-the-world.html'; you can click okay and still browse the page though :-)
Haha thank you. That went over my head. I dismissed that box without reading the error. But... I can neither confirm nor deny I understand what you are referring to ;)
I restarted blogging last year, going from a handful of blog posts to publishing consistently. All content gets published on my blog first. I've seen an ~8x increase in traffic. I was affected by zero-click searches from Google's AI Overviews, but the bulk of my traffic now comes from RSS readers.
>the bulk of my traffic now comes from RSS readers.
I don't think this is correct unless you mean strictly the number of HTTP requests to your web server.
You were the 9th most popular blogger on HN in 2025.[0] Your post says you have about 500 readers via RSS. How can that represent more readers than people who read your posts through HN? I'd guess HN brought you about 1M visitors in 2025 based on the number of your front page posts.
You are right, my statement may be a bit misleading or incomplete. The ~500 readers are not just local RSS bots; they include aggregator RSS bots. For example, I see Feedly reporting ~200 subscribers, another newsreader reporting 50 subscribers, Feedbin, etc. Each of those only has between 1 and 3 IP addresses. So behind each RSS bot, there is an arbitrary number of actual users reading. I can't track those accurately.
However, users can click on an RSS feed article and read it directly on my blog. These have a URL param that tells me they are coming from the feed. When an article isn't on HNs frontpage, the majority of traffic is coming from those feeds.
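The tagging idea is simple enough to sketch. Here is a minimal Python version of it (the `ref` parameter name and the helper functions are illustrative, not necessarily what any particular blog uses):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def feed_link(url: str, source: str = "rss") -> str:
    """Tag an article URL so visits from the feed are identifiable in logs."""
    sep = "&" if urlparse(url).query else "?"
    return url + sep + urlencode({"ref": source})

def from_feed(request_url: str) -> bool:
    """Check whether a requested URL carries the feed tag."""
    return parse_qs(urlparse(request_url).query).get("ref") == ["rss"]

link = feed_link("https://example.com/some-post.html")
print(link)             # https://example.com/some-post.html?ref=rss
print(from_feed(link))  # True
```

The feed generator emits tagged links, and the server logs then let you separate feed clicks from every other referrer without any client-side analytics.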
By the way, thank you for sharing this tool. Very insightful.
These are impressive metrics. Are you able to make a living off of your 10M views?
I'm planning to leave my job this year and focus on content. I've mostly been considering YouTube, but if blogging can work too, I might consider that as well.
Not even close to making a living! It does pay for my server though which costs $15 a month. YouTube gives you much more visibility. I'll try to compile the numbers from my single Carbon ad placement and the donations I receive from readers.
But I also don't think I have the process in place to do a blog, YouTube, and a podcast and hold a full-time job. Yes, the job is my source of income.
Yeah, I hear you. My understanding is that on YouTube you can make ~2k per 1M views with the default ads. I'm hoping that I can be funded by some combination of that and something like Patreon/membership/merch. But we will see; it's something I've wanted to do for years, and I am getting too old to put it off any longer.
I'm a firm believer that data collected without a clear action associated with it is meaningless - and I couldn't think of an action I would take if traffic goes up or down on my personal blog. But tbh, I mainly blog for myself, not really to build an audience, so our objectives might differ.
There are some actions you can take. For example, when my traffic plummeted, I saw through my logs that search engines were trying to access my search page with questionable queries. That's when I realized I became a spam vector. I gave a better rundown through the link I shared.
Same reason why people have personal projects and share them on GitHub, it's fun to see people using / starring / interacting with your project / blog.
Just an FYI, the data collected to draw those conclusions came from the server logs (Apache2 in my case). So if you run your own server or VPS, you already have this information.
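For the curious, here is a minimal sketch of pulling feed traffic out of such a log, assuming the standard Apache "combined" LogFormat (the function name and the feed path are illustrative):

```python
import re
from collections import Counter

# Apache "combined" format: IP, identd, user, [timestamp], "request",
# status, size, "referer", "user-agent"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def feed_hits(lines, feed_path="/feed.rss"):
    """Count requests to the RSS feed, grouped by user agent."""
    agents = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("path").startswith(feed_path):
            agents[m.group("agent")] += 1
    return agents

sample = [
    '1.2.3.4 - - [19/Dec/2025:16:28:09 +0000] "GET /feed.rss HTTP/1.1" 200 5120 "-" "Feedly/1.0"',
    '5.6.7.8 - - [19/Dec/2025:16:29:00 +0000] "GET /blog/post.html HTTP/1.1" 200 9000 "-" "Mozilla/5.0"',
]
print(feed_hits(sample))  # Counter({'Feedly/1.0': 1})
```

Grouping by user agent is what surfaces the aggregator subscriber counts mentioned above, since readers like Feedly embed them in the agent string.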
If you want to count every search engine bot, AI crawler, and vulnerability scanner as a user, then that works, but these days raw web server logs are basically useless for this.
If it makes you feel better, on reddit, I shared my very first blog post about deprecating mysql_* functions in php. As a result, someone said something mean about my mother. I figured the web was full of trolls.
But that wasn't enough. Someone else wrote that my article was useless and that I write at a 7th grade level. I turned off the monitor and went for a walk. I decided that blogging wasn't for me; it was time to delete my blog. I was so embarrassed.
When I came back, there was a reply to that comment. It said something like "that's a good thing, 7th grade level writing means we can all understand it easily". And that was enough to keep me going. 13 years so far.
Reddit is now just AI slop, so I don't know whether that's an improvement over this story. I'm just glad you were able to get over that BS, engage with it all again, and keep going! I gave up and never went back around 2010, but I'm going to try again in 2026.
The problem with environments designed to make interaction low-energy and gamified, like Reddit, is that they gather just the worst people. I've got ~63k karma there, but I disengaged some years ago, and I can't tell you how much ditching that, Twitter, and Facebook improved my mental health. There's some great fun to be had there, but it's often the same thing over and over again, increasingly drowned out by utter crap. They've taken multiple actions that have destroyed the sense of community and have become a poster child for ens*tification, unfortunately.
Thanks for sharing. After reading that comment, I realized we should encourage ourselves and others (who are more or less civilized human beings) to be the kind of person who wrote "that's a good thing..." - because fighting trolls is a game with unknown results, but encouraging people works much better. It doesn't always work, though, because sometimes the platform's nature prevents it. Like on Stack Overflow, where commenting on reactions will probably get you downvoted for being off-topic.
I once spoke in favor of remote work (around 2020) and someone here on HN told me to get cancer and die, before it was flagged enough times to get out of the way.
On YouTube, I also sometimes get mean comments, though at least there the automatic moderation catches them so they don't show up publicly and I can shadowban the offenders off the channel easily. None of the content is even controversial, YouTube just attracts a lot of angry people that feel entitled to speak what's on their mind.
I wouldn't publish in an environment where blocking or banning people is difficult. They're not entitled to have me engage with their hateful drivel. My blog also doesn't have comments. At the end of the day, I will say what I want to say.
A while back, I decided to make the time tags dynamic on my website. First, they have a title attribute that shows the actual date in UTC. By dynamic I mean that when something is just published, I use relative time that updates in real time: 1 second ago, 2, 3... etc. Then the minutes, then the hours, then daily.
I always get frustrated when I see "7 months ago" or "X years ago"; the math is always inconsistent when they round it. So when something is more than 3 days old, I display the actual date.
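That rule is simple enough to sketch. Here is a minimal Python version (the function name and exact cutoffs are illustrative, not the site's actual code):

```python
from datetime import datetime, timezone

THREE_DAYS = 3 * 24 * 3600  # cutoff for switching to absolute dates, in seconds

def display_time(published: datetime, now: datetime) -> str:
    """Relative label while fresh; exact date once rounding would get lossy."""
    age = (now - published).total_seconds()
    if age < 60:
        return f"{int(age)} seconds ago"
    if age < 3600:
        return f"{int(age // 60)} minutes ago"
    if age < 86400:
        return f"{int(age // 3600)} hours ago"
    if age < THREE_DAYS:
        return f"{int(age // 86400)} days ago"
    # Older than 3 days: show the actual date instead of rounded guesswork.
    return published.strftime("%B %d, %Y")

now = datetime(2025, 12, 19, 16, 28, tzinfo=timezone.utc)
print(display_time(datetime(2025, 12, 19, 16, 26, tzinfo=timezone.utc), now))  # 2 minutes ago
print(display_time(datetime(2025, 7, 1, tzinfo=timezone.utc), now))            # July 01, 2025
```

On the site itself the same branching would run client-side so the fresh labels can tick in real time, while anything past the cutoff is rendered once and never changes.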
A special place in hell is reserved for Stack Overflow’s recent redesign, which shows “Over a year ago” both for comments that are 13 months old and for those that are 13 years old.
> I always get frustrated when I see "7 months ago" or "X years ago"; the math is always inconsistent when they round it. So when something is more than 3 days old, I display the actual date.
What especially makes me angry is dev tools doing this.
No: GitHub, CircleCI, Google Console [1], and others. I need to see actual timestamps on commits, PRs, merges, logs, etc., not the bullshit "7 hrs ago" when I'm trying to find out what broke.
[1] At one point a few years back, their log viewer would show this. Someone actually implemented it, even though showing this is more work than proper timestamps.
The way that this is handled on most websites is that you show "X time ago" but you can hover over the time to get the full timestamp. For example, that's how it's handled here on Hacker News and Reddit.
Honestly, the fact that mobile browsers don't provide a way to see the contents of the title attribute is a severe UX failing on the part of the browser developers, not the website developers, who are literally using the attribute as intended.
The relative-time labeling bit HN in the ass this week during its outage (<https://news.ycombinator.com/item?id=46301921>), when hours-old comments were displayed as "n minutes ago", with n ranging from 0 to low-single-digits.
This made identifying the duration of the outage somewhat more difficult.
(HN does display the precise time in a title text for the timestamp which typically appears on hover, though you'd need to know that that's in UTC.)
I honestly don't understand why it's ever useful to show relative timestamps over absolute ones. It's not hard to look at a date or time and understand how far back it was; it's not even that I can do the math to figure out the relative time, but that the relative version isn't even worth bothering to calculate, because the absolute one is just as intuitive. If it's currently February 20XX and I see a timestamp of July 20XX-1, I know how long it's been since then, and I don't care about the number of months. If it's February and I see the timestamp "7 months ago", I don't immediately know it's July without at least doing some small amount of thinking, like "okay, a year before five months from now, so July" (which is especially silly because now I'm having to lean even more on relative times just to get back to the absolute date). Seeing the exact date also lets me know other pertinent facts like the season ("that picture was from the summer"), holidays ("it must be from the 4th of July barbecue"), etc.
Is there something I'm missing here about why people might prefer relative timestamps? I genuinely can't tell if everyone kind of universally hates them or if this is one of those things where my brain just works differently than a lot of other people.
> Is there something I'm missing here about why people might prefer relative timestamps?
I think most people are uncomfortable parsing timestamps for small-interval differences, e.g. `2025-12-19T16:28:09+00:00` for "31 seconds ago".
For larger intervals, I agree that timestamps are more useful. "1 day ago" is a particular bugbear of mine. One day meaning 13 hours, or meaning 35 hours? Sometimes that's important!
The original advice when relative timestamps became a thing was to choose based on the activity level of the content. If new content is constantly appearing and older stuff fades out of relevance quickly, then choose relative timestamps. Otherwise, use absolute timestamps.
The worst is inconsistency, and the best is sometimes both (when presented in a discoverable and convenient way -- hover text used to be that way, but this degrades on mobile).
To clarify, I don't mean to literally imply an exact timestamp format. Showing something like "December 19, 2025 4:28 PM" or "19 December 2025 16:28" seems strictly better to me than "31 seconds ago" because it doesn't either become inaccurate quickly or require having the page update in real-time.
Posts, for one thing - like yours, an hour ago as of this one. Sometimes people want to see how old content is at a glance. Like, how long ago did someone log in?
I guess I just don't find the relative timestamp to be a more intuitive way of seeing that. If I see today's date and a time this morning, I don't need to "translate" that into an exact number of hours, because "12 hours ago" isn't more meaningful to me than "this morning", and "2 minutes ago" is likely to become wrong quickly (or require a technical measure to keep accurate; given that the relative timestamp is arguably already more work to implement, that's now two extra things added to solve a problem that I don't really understand to exist in the first place).
Having thought through a bunch of different orders of magnitude of time (time in the past measured in seconds, minutes, hours, days, weeks, and years), I'm confident that I'd personally find the actual date and time to be more intuitive in every single one of them. What I'm not confident in is whether that would be the case for everyone else. I don't think there would be anything wrong with someone feeling differently than me, and if it turns out I'm in the minority, I wouldn't have any trouble accepting it, but it feels so fundamentally disconnected from the way I think about things that I have trouble conceiving of it other than as a hypothetical.
I disagree. When the tool promises to do something, you end up trusting it to do the thing.
When Tesla says their car is self-driving, people trust it to self-drive. Yes, you can blame the user for believing it, but that's exactly what they were promised.
> Why didn't the lawyer who used ChatGPT to draft legal briefs verify the case citations before presenting them to a judge? Why are developers raising issues on projects like cURL using LLMs, but not verifying the generated code before pushing a Pull Request? Why are students using AI to write their essays, yet submitting the result without a single read-through? They are all using LLMs as their time-saving strategy. [0]
It's not laziness; it's the feature we were promised. We can't keep saying everyone is holding it wrong.
Very well put. You're promised Artificial Super Intelligence and shown a super cherry-picked promo, and instead you get an agent that can't hold its drool and needs constant hand-holding... it can't be both things at the same time, so... which is it?