Feels slower than GPT-5. I understood that medium should be a lot faster than high, but for me it's almost the same, so I don't see a reason to prefer medium.
I have seen fluctuations in tokens/sec. Early yesterday it was roughly equivalent to non-Codex GPT-5 (this branding ...); late yesterday I had a severe drop-off in tokens/sec. Today it seems to have improved again, and with the reduced amount of unnecessary/rambling token output, GPT-5-Codex (Medium) seems faster overall. LLM rollouts always have this back-and-forth in tokens/sec, especially in the first few days.
"it says here that you've been using technology X for a few years. What is the most frustrating thing about X? If you had the resources needed what would you do to fix it?"
Isn't the idea of preparing for software development interviews ridiculous? Instead of improving my algorithm skills to become a better developer, I find myself memorizing a ton of problems just so I can answer similar ones during interviews. It feels like I'm preparing for the SATs again.
For a while there, in the 0.5-to-0.8 days, performance was pretty iffy and the documentation was lacking (as expected for a pre-1.0 library), but that seemed to be when a lot of bloggers picked it up, compared it to React/Angular, and then dropped it.
And back then, when I was trying it out, there wasn't any easy way to integrate Webpack (or something like it), which made things difficult, as I'd gotten very used to having a good bundler/compiler.
Yeah... I remember those days. The first projects I built with it were on 0.5. The migration to 0.8 was... painful. But given the huge performance improvements, it was worth it.
As you said, you kinda just have to expect a little pain pre-1.0. But the time investment in learning and using it has been worth it in my opinion.
I've found it's still pretty difficult to integrate with Webpack because pretty much all of the components use Bower instead of npm. I've been having some success with https://github.com/aitoroses/vulcanize-loader, although re-running vulcanize every time components change is not fast, and I can't get it to work with Hot Module Replacement. Maybe the Webpack story will get better with Polymer 2.0.
I think it's mostly due to browser support and developer awareness of Web Components. Chrome (of course) has native support for the four main technologies needed for Web Components (Custom Elements, Shadow DOM, HTML Templates, and HTML Imports), but other browsers are only starting to catch up. In the meantime you need a few polyfills to get full support, which is not ideal.
Sure, but look at it this way: with other solutions, let's say React, their virtual DOM is your "polyfill".
Even with polyfills, Polymer is MUCH smaller than React core ;-) I don't see it as a problem, as I've created elements with it that worked in IE10+.
Oh, I definitely agree with you. We've worked on projects with Angular 1/2, React/Redux, and Polymer, and Polymer is, for me personally at least, the nicest to work with. I have grown to really like TypeScript since working in React land on recent projects, but I'm excited because Polymer 2.0 will be [easily] TS-compatible. Best of both worlds, I think.
I'm not a fan of CoffeeScript, but do they really need to lug around the huge pile of dependencies Babel brings with it? Why not target ES5, sprinkle in some ES2015 depending on compatibility, and slowly work towards ES6?
I'm not sure how you slowly 'work towards' ES6. Unless you decide to only support bleeding-edge browsers, you cannot use ES6 or ES7 features in production applications.
The only way we can use them is through transpiling. This will be the case for at least another three or four years.
> I'm not sure how you slowly 'work towards' ES6. Unless you decide to only support bleeding edge browsers, you cannot use ES6 or ES7 features in production applications.
Not really.
First, it depends on your target audience. How new are their web browsers? Where will your web application run? Some projects I've worked on have web applications that run in only a single environment, so those are easy. On others, I've had highly technical websites where the vast, vast majority of visitors used newer browsers.
Second, pick and choose your ES2015 features. Honestly, you don't need ES2015 at all, but if you want to use it, some features, like let and const, are supported just about everywhere, while others, like fat arrows, are not.
> The only way we can use them is through transpiling. This will be the case for at least another three or four years.
Again, it all depends on your target audience. Maybe? Then again, I've never been the biggest fan of transpiling one language into another just to access a couple of minor feature additions. It's a huge pile of dependencies, build time, etc., all to support a few minor things.
Well, as others have pointed out, SQLite is simple and easy to use; that way we can work on the core functions of the app rather than having to worry about the database.
Also, as far as Go is concerned, we don't really have to worry about the underlying database; the code, as others have rightly pointed out, is totally reusable.
That seems like overkill. If you tell a user how to use MySQL/Postgres/etc., there will also be questions about setting up those systems, so "just use SQLite" is a valid path if you only want to cover basic usage.
The code is typically identical/reusable; it's all using the same stdlib database/sql package. The only two lines that would change are the driver import (import a MySQL or Postgres driver instead of SQLite) and the database handle: `db, err := sql.Open("sqlite3", "./newtask.db")` would become something like `db, err := sql.Open("postgres", "user=pqgotest dbname=pqgotest sslmode=verify-full")`. One of the many bright spots of Go; see the sketch below.
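To make that concrete, here's a minimal end-to-end sketch. It assumes the mattn/go-sqlite3 driver and a made-up `tasks` table (the schema and names are illustrative, not from any particular tutorial):

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	// The driver registers itself with database/sql under the name "sqlite3".
	// Swapping databases means swapping this import and the sql.Open line.
	_ "github.com/mattn/go-sqlite3"
)

func main() {
	db, err := sql.Open("sqlite3", "./newtask.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Everything from here down is plain database/sql and stays the same
	// regardless of which driver is imported above.
	if _, err := db.Exec(`CREATE TABLE IF NOT EXISTS tasks (id INTEGER PRIMARY KEY, name TEXT)`); err != nil {
		log.Fatal(err)
	}
	if _, err := db.Exec(`INSERT INTO tasks (name) VALUES (?)`, "write the tutorial"); err != nil {
		log.Fatal(err)
	}

	rows, err := db.Query(`SELECT id, name FROM tasks`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()
	for rows.Next() {
		var id int
		var name string
		if err := rows.Scan(&id, &name); err != nil {
			log.Fatal(err)
		}
		fmt.Println(id, name)
	}
	if err := rows.Err(); err != nil {
		log.Fatal(err)
	}
}
```

One caveat to the "only two lines change" claim: placeholder syntax is driver-specific, so the `?` placeholders here would become `$1`-style placeholders with lib/pq, which can mean touching query strings as well.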
Huh? Why? SQLite is closer to Postgres/MySQL if you're looking for a reference baseline, and SQLite is actually a pretty awesome database. BoltDB is much more niche.