
> There’s always a lingering question about Google’s long-term commitment to gRPC and protobuf. Will they continue to invest in these open-source projects, or could they pull the plug if priorities shift?

Google could not function as a company without protobuf. It is ingrained deeply into every inch of their stack.
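To give a sense of how low-level that entanglement goes: protobuf's wire format is simple enough to sketch in a few lines. Here's a minimal stdlib-only Python sketch of varint field encoding — illustrative only, not the real protobuf library:

```python
# Minimal sketch of protobuf's varint wire encoding (illustrative, not the
# actual library). Each byte carries 7 bits of payload; the high bit says
# whether more bytes follow.
def encode_varint(n: int) -> bytes:
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # continuation bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_field(field_number: int, value: int) -> bytes:
    # Tag byte = (field_number << 3) | wire_type; wire type 0 = varint.
    return encode_varint((field_number << 3) | 0) + encode_varint(value)

# Field 1 = 150 encodes to 08 96 01, the classic example from the protobuf docs.
assert encode_field(1, 150) == bytes([0x08, 0x96, 0x01])
```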

Likewise, gRPC is the main public-facing interface for GCP. It's not going anywhere.



Their commitment to open source, however, might go.

Quite recently Google quietly unshipped an effort to make their protobuf build rules more useful to OSS users of Bazel (see the rules_proto repository). This wasted a huge amount of planning and work that'd gone into the migration.
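For context, this is roughly the shape of OSS usage that rules_proto was meant to serve — a hypothetical BUILD file (target names are made up):

```starlark
# Hypothetical BUILD file sketch: declaring a proto schema with rules_proto.
load("@rules_proto//proto:defs.bzl", "proto_library")

proto_library(
    name = "user_proto",
    srcs = ["user.proto"],
    deps = ["@com_google_protobuf//:timestamp_proto"],
)
```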

And the fact that these tools are designed first and foremost for Google use shows up everywhere. Stuff that Google fundamentally doesn't care about but is widely used (eg Ruby) is stagnant.

In this state, it's totally reasonable to reconsider whether these tools are worth building on top of. I personally still believe! But I don't blame people who are skeptical.


> Their commitment to open source, however, might go

Google's OSS contributions are largely correlated with the fact that they can _afford_ to do OSS. When you have the best business model in the world, you can dedicate X% of your engineering hours to OSS. Of course, it's not purely altruistic; they also get a lot back in return.

However, if due to AI or other shifts Google's business model takes a hit, I wouldn't be surprised if their OSS contributions were the first to go, like we saw with Google Code Jam being discontinued in the 2022 layoffs.

Though if your business outlives Google, gRPC going away might be the least of your problems.


There was an influential internal white paper about not being a "tech island" that drove open-sourcing. The point was that by having its own tech stack, Google would eventually be left behind and have a harder time recruiting.

Not sure if the message is still believed as strongly.


The message is pretty well understood - the only difference is that the monorepo (think of it as a service in and of itself) and its associated tooling do get seen as "Google-specific."

Bazel in general has really awful support.


Google continuing to use gRPC and protobuf internally and Google continuing to invest in the open-source projects are not the same thing. It being so central to Google isn't necessarily even a good thing for people outside Google; it means there's a lot of potential for changes which are very good for Google but very painful for everyone else.


protobuf will likely never disappear, as it is so central to Google. gRPC, however, is hardly used internally compared to Stubby, which is the version that is actually essential.


Depends whether you consider Google Cloud internal to Google.


Even Google Cloud is not very dependent on gRPC.

As far as I remember, most of the API is built on REST.


"Not very dependent" is subjective. The objectively relevant take is that it is a required dependency of parts of the officially supported APIs of major GCP services that have large paying customers with SLAs. It can't go away anytime soon.

Google may have Stubby as the primary internal RPC, but several other large companies rely primarily on gRPC, have contributed to it, and have incentives to keep it alive, e.g. Dropbox, Netflix, even parts of Apple and Microsoft. In fact, Microsoft's support of gRPC in C#/.NET is arguably more first-class and polished than in any other language.


Fair enough

Although this might have some external implications, most of GCP does not rely on gRPC, and even external customers are not usually dependent on gRPC when using Google services.

Correct me if I'm wrong, but gcloud uses REST, and so do the libraries Google publishes for using GCP APIs in different languages.

The question is whether Google could stop supporting gRPC, protobuf, or Stubby tomorrow, and I still think gRPC is relatively at risk.


> so do the libraries google publishes for using GCP apis in different languages

Not true. Many Google Cloud client libraries do support gRPC.

> still think gRPC is relatively at risk

I would agree with you that relative to protobuf and stubby, gRPC is less critical to Google infra. Yet, in absolute terms I would put the probability of that happening for any of those in the next couple decades at zero.


You are right; I checked just now and these are based on gRPC.

For some reason I remembered that they used REST.


That info is a bit outdated. All but the oldest APIs (most notably GCE) support gRPC out of the box.

For newer services, there is an automatic 1:1 mapping between the REST and gRPC APIs, except for features like streaming RPCs which have no REST equivalent.
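That mapping is expressed with `google.api.http` annotations on the RPC definitions. A hedged sketch, with made-up service and message names:

```protobuf
// Illustrative sketch of REST/gRPC transcoding via google.api.http
// annotations; BookService and its messages are hypothetical names.
syntax = "proto3";

import "google/api/annotations.proto";

service BookService {
  // Reachable both as the gRPC method BookService.GetBook
  // and as REST GET /v1/books/{id}.
  rpc GetBook(GetBookRequest) returns (Book) {
    option (google.api.http) = {
      get: "/v1/{name=books/*}"
    };
  }
}

message GetBookRequest {
  string name = 1;
}

message Book {
  string name = 1;
  string title = 2;
}
```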


It supports gRPC, but that's not the commonly used flow, even by the actual Google Cloud UI.

The REST APIs are converted to Stubby internally, not gRPC, which is what makes it relatively disposable.


The argument that gRPC is disposable because everything is stubby internally applies equally to REST. And I don't think anyone is arguing that REST is disposable.

I'm not sure what part of GCP you work in, but in my experience, the big customers are using gRPC, not REST.


It isn't.


(disclosure: ex-grpc-team here)

Indeed. I'm quite confident there's never been an RPC library with so many person-years invested in it. Last month was gRPConf, and the project appeared to be as staffed as ever, if not more, and Rust is being adopted as a new first-class language too.


Any videos from the event? I saw a bunch of slides (pdf, ppt), but that's about it...


Pretty sure videos are recorded for YouTube. I know my own talk was. I expect them to be posted this week or next.


Thanks! Can't wait to watch them!



