
> Is the "write once run everywhere" paradigm a realistic goal

A part of me sees react-native adding another option here - "write once, tailor anywhere"

While it's always been possible to keep a common core of logic and then tweak your frontend for each platform as a separate project to get slightly closer to an actual native experience, react-native makes it significantly easier to do so, and I think it's that one-plus standard deviation of extra ease that makes all the difference.

I don't think we'll see a lot of successful react-native cross-platform apps that go for the one-size-fits-all approach. Rather, with the same (probably even less) effort, we'll see react-native apps that target the UX of each platform rather than just its UI elements.


Tabletop games fill the niche left by typical LAN party / split-screen games as more and more of those move to allowing only internet-connected matchmaking rather than some local network equipment / friends who won't look at your half of the TV.


> Kit modules are compiled libraries, while CCAN distributes source code only.

Could you expand on this? I assume it pulls the source, compiles it locally, and then adds the headers and libs to a project-local include or link path. Are there individual directories for each dependency, or do they get mixed together?

How are you handling macros? Do you have some preset defines you're passing in for different platforms?

I think it's an interesting idea and it definitely has more polish than ccan. I don't know if I'd ever give up fine-grained control in production, but for small personal projects this would be ideal.


You're correct. When fetched, modules are given their own directory and compiled locally (one directory per module/dependency). New projects are then scanned for dependencies, linked against pre-compiled libraries, and necessary headers are added to the search path.

In terms of presets, I have some [basic] defaults included myself, but more can be added in an optional config file, along with arbitrary compiler flags.

Kit is optimized for simple projects, where crazy build scripts aren't needed. So, yeah, for fine-grained control, I'd stick to existing tools. However, this may be more pleasant than alternatives for personal projects, as you mention.
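
To make that concrete, here's a rough sketch of what a consuming project might look like under that layout. The module name, header, function, and paths below are hypothetical placeholders, not Kit's actual output; assume the fetched module lives in its own directory with a pre-built static library and a header picked up via the added search path.

    /* Hypothetical project using a fetched "vec3" module.
     * Assume it was fetched into modules/vec3/ and pre-compiled there
     * as libvec3.a, so the effective compile line would be roughly:
     *   cc main.c -Imodules/vec3 -Lmodules/vec3 -lvec3
     * All names and paths are illustrative, not Kit's real layout. */
    #include <stdio.h>
    #include "vec3.h"   /* resolved via the per-module include path */

    int main(void) {
        vec3 a = {1.0f, 0.0f, 0.0f};
        vec3 b = {0.0f, 1.0f, 0.0f};
        vec3 c = vec3_add(a, b);    /* function from the fetched module */
        printf("%.1f %.1f %.1f\n", c.x, c.y, c.z);
        return 0;
    }

The point is that the consuming project only ever sees an include path and a link flag per dependency; the per-module directories keep everything from getting mixed together.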


I'm not even kidding: we are probably just as lazy, except it manifests as "I can't be arsed driving", so we take the simplest route and walk for short distances.


I too was confused, but then I realised this isn't meant for fun, hacky 3D mapping. It's meant to sell stuff for the home, and I think that's unfortunately going to limit the reach of this technology. The only third-party type of business that might be able to make use of this is a real estate company, for obvious reasons. I guess they have a public API that can pull the data, but even then that requires somebody with the camera to upload your house.

I highly doubt IKEA or any other furniture store is going to bring their camera out to me just so I can digitally place a few sofas, and I'm not going to pay $500 just to... what... save myself a trip to the returns department? If I bought it, what happens when I'm done decorating? Does my camera become a useless $500 paperweight?

It's cool that they allow you to have an OBJ (UV maps and textures too?) - I assume that doesn't include whatever proprietary metadata gets attached to denote walls, floors, et al. If I could get the mesh with just the camera I would buy one of these in a heartbeat, but otherwise I don't see this being used by any consumer apart from the absurdly obsessive renovator.


Right? Except it's not a $500 paperweight. It's a $4,500 paperweight with an additional $500 a year in overhead.

Depending on how effective Project Tango is, this technology might be short-lived. Maybe there's some manual labor that has to happen on their end before your files are ready? Otherwise I just don't understand the cost. Storage and bandwidth couldn't possibly cost $50/mo.

Sometimes I really do lament the fact that everyone decided that an ongoing subscription model is the way to go for everything.


A lot of engines do this to some extent (e.g. OGRE3D, Irrlicht, et al.), but there are pitfalls there as well. While jQuery does a good job of shimming in hacks for older browsers, the world of graphics programming isn't really that flexible, and even newer drivers can still be a total grab bag of what they support. Think of it like this: you have the latest version of Firefox and it supports everything except rendering text in italics. How would you shim that in? Can you? What if it tells you it supports italics but actually slants text the other way? What if it renders correctly, but doing it the normal way takes an hour, and there's a hack that works only on this operating system and this version of Gecko? On top of that there really isn't a manual, and StackOverflow is full of questions that contain all your search terms but are actually about a completely unrelated matter.

So you drop it from the common interface that your abstraction presents because it's just not consistent enough....
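
To illustrate, here's a minimal sketch of that kind of capability gating. Every name in it (renderer_caps, query_caps, draw_text) is made up for illustration, not taken from any real engine:

    /* Generic capability gating: ask what the driver claims to support,
     * then only expose the features that behave consistently everywhere.
     * Names are illustrative, not from any real engine. */
    #include <stdbool.h>
    #include <stdio.h>

    typedef struct {
        bool italic_text;       /* the flaky feature from the example above */
        bool italic_text_hack;  /* per-driver workaround path */
    } renderer_caps;

    static renderer_caps query_caps(void) {
        renderer_caps caps = {0};
        /* A real engine would inspect driver/extension strings here;
         * this stand-in just pretends the feature is unreliable. */
        caps.italic_text = false;
        caps.italic_text_hack = true;
        return caps;
    }

    static void draw_text(const renderer_caps *caps, const char *s) {
        if (caps->italic_text)
            printf("<i>%s</i>\n", s);       /* native path */
        else if (caps->italic_text_hack)
            printf("/%s/ (shimmed)\n", s);  /* OS/driver-specific workaround */
        else
            printf("%s\n", s);              /* dropped from the common interface */
    }

    int main(void) {
        renderer_caps caps = query_caps();
        draw_text(&caps, "hello");
        return 0;
    }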

There are also a TONNE of shader abstraction languages that translate to HLSL or GLSL on the fly for the same reason.

tldr; Engine developers stare into the abyss and somehow the abyss gives back a projection matrix.


> how people made games

In my experience this is the common question amongst everybody (myself included) who started programming young. Some adult somewhere (how sad that I don't remember who) took me seriously enough to give me some books, so I started writing C, BASIC, and assembly, and I was too stupid to realise that it was supposed to be too hard for a 9-year-old.

Today (as in literally this moment in time) I am still writing C for games.


CMake is interesting because CMakeLists files are just a different type of editable/configurable build script (i.e. slightly different Makefiles). There's still an interpreted step between the build instructions and the output you actually use to build, which isn't meant to be edited and therefore obfuscates part of the process... so, autoconf. This is both a strength and a weakness: there are so many external modules for CMake (e.g. the various FindX modules) that things work great most of the time, but when they break they are more difficult to fix than an atomic change to the actual compiler flags in a Makefile.

I don't think there will ever be a standard build tool, simply shunts to get the needed functionality from others. Eventually all build tools will exist in each other and the only interface will be recursion.


clang clang clang went the compiler!


Like the other projects mentioned, LLVM was open source before Apple started leading its development.


... But you can be sure that Apple's contributions have taken it far further than it would otherwise have gone.


But _clang_ wasn't; it didn't exist. Clang isn't just another name for LLVM or something; it's a compiler using LLVM.


I've been with them for quite a while, and one thing I've noticed they do differently is that they will periodically upgrade your service without changing the price. For example, I've gone from 128 MB to 512 MB of usable memory for my vhost without paying anything but my $9.99 a month. There are other anecdotes (awesome control panel, good support, etc.) that make me think it's unfair to group them with other hosting providers that never do anything but the bare minimum.


I have an account with webfaction, and my MySQL DB was using more than 256 MB of memory. I immediately got an email saying that I needed to lower my usage or upgrade to a higher plan.

I was thinking about my next step when, the next day, I got an email saying my account had been upgraded to the 512 MB memory plan at no cost.

Definitely made the right move by moving from hostgator to webfaction.


Oh, I agree that Webfaction is very very good at shared hosting. It's just that I've learned my lesson and don't plan to be too attached to them.

