Hacker News | ratbr's comments

Not a book, but the novella "The Machine Stops" by E. M. Forster. I read it in the late nineties and it still pops into my head frequently, especially in our age of the internet.


I found E. M. Forster's novella "The Machine Stops" in the same league as the above: it is about the human condition, and technology is just a setup to talk about that.


An aside: the title is a great example of context sensitivity. At first glance, I read it as describing some serious bug in garbage collection algorithms/systems (as in, someone opened up some library that was abandonware).


It remains to be seen how this impacts US universities. Part of the incentive for international students to choose the US over other countries for higher studies is that there are some seriously great professional opportunities after graduation if you can immigrate: our industry is centered in the US, the US is truly a superpower in research, and the US mindset (startups, relatively merit-driven systems) stands out compared to other developed western countries.

If there is no clear path to immigration, and if high-quality education becomes more easily available elsewhere (see: western universities setting up campuses in the east), I think that could dry up the desire to choose a US-based university.


This is fantastic. The exercises are non-trivial enough to be useful. One minor suggestion: it would be great if you could set the favicon to your gopher. Mine is a potentially narrow use case, but after logging in I pinned this to a tab in Safari, and the generic G throws me off a little.


Pinned tabs actually have their own icon: https://developer.apple.com/library/content/documentation/Ap...


Thanks. I'll have to go create an icon for this. So many custom icon formats :\

Edit: I believe I have an icon set here now. I might need to tweak it but it should work in the meantime.


I recently got turned onto RealFaviconGenerator.net [0] after researching the various icon formats I might need to support and finding a nice in-depth Stack Overflow answer [1] from the author of that website. I would be surprised if it did NOT support pinned images.

[0] https://realfavicongenerator.net/

[1] https://stackoverflow.com/questions/4014823/does-a-favicon-h...


This is indeed the site I used to fix the issue :)


Thanks! Can you possibly email me a screenshot so I can fix it? jon [at] calhoun.io


Sent. First world problem admittedly.


Serious question: what is an AI coprocessor, technically? Some machine-learned models burned onto a chip? Or some kind of neural net with updatable weights?


One example is: https://arxiv.org/abs/1704.04760

There are many potential designs for these things, but the first-gen TPU is one that works, is in production, and has been described in a paper. But you have to differentiate whether you mean an inference engine or something that can also do training. For HoloLens, it's probably going to be an inference unit, which means it'll possibly look something like a TPU, perhaps with more specific hardware support optimized for convolutions (which are very important for visual-processing DNNs these days), as the NVIDIA tensor units are.


It is not well documented by anyone. However, the expectation is that it is a matrix or convolution coprocessor, as these are common operations in deep neural networks (for both inference and training). For instance, NVIDIA says its tensor units perform 4x4 matrix multiply-accumulate operations.
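As a rough illustration (hypothetical, pure Python; function name and values are my own), here is the core operation such a matrix coprocessor would accelerate: a matrix multiply on 8-bit integer inputs with a wide accumulator, roughly as the first-gen TPU paper describes.

```python
def int8_matmul(a, b):
    """Multiply an m x k and a k x n matrix of int8 values.

    Each product of two int8s fits in 16 bits; the running sum is kept
    in a wide (32-bit in real hardware) accumulator to avoid overflow.
    """
    m, k, n = len(a), len(b), len(b[0])
    out = [[0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            acc = 0  # wide accumulator
            for p in range(k):
                acc += a[i][p] * b[p][j]
            out[i][j] = acc
    return out

# Tiny example with values in the int8 range [-128, 127]:
print(int8_matmul([[1, -2], [3, 4]], [[5, 6], [7, 8]]))  # [[-9, -10], [43, 50]]
```

The hardware wins come from doing thousands of these multiply-accumulates in parallel with narrow datapaths, not from anything clever in the algorithm.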


The AI coprocessor is probably the first processor designed directly by the marketing department...


Oh man, that ship has sailed!


I was in the audience at CVPR when it was presented. They were doing semantic segmentation using ResNet-18, so I'm guessing it speeds up convolutions and some linear algebra during inference. I'm guessing it won't be used for training.


A whole butt ton of GPU-style FMA and low-precision float multiply ALUs would be my guess.


How low precision can you get and still have it be useful?


Google's TPU shows that 8 bits is still good enough.


Note that the TPU used 8-bit integer math, not even floating point.
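For a sense of what "8-bit integer math" means in practice, here is a hypothetical sketch of simple symmetric quantization (real schemes are more elaborate): float weights are mapped to int8 with one shared scale factor, and mapped back after integer-only arithmetic.

```python
def quantize_int8(weights):
    """Map a list of nonzero floats to int8 values in [-127, 127] plus a scale."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Approximate the original floats from the quantized values."""
    return [x * scale for x in q]

q, s = quantize_int8([0.5, -1.27, 0.0, 1.27])
approx = dequantize(q, s)  # close to the originals, within one scale step
```

The point is that inference then only needs integer multiply-adds; the floats are recovered (approximately) at the end by one multiplication with the scale.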


Even 1 bit: binary neural networks work.
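A hypothetical sketch of why 1-bit is so cheap in hardware: with weights and activations constrained to {-1, +1} and packed into machine words (bit set = +1), a dot product collapses to an XOR plus a popcount.

```python
def binary_dot(a_bits, w_bits, n):
    """Dot product of two n-element {-1, +1} vectors packed as integers.

    Positions where the bits differ contribute -1, matching positions +1,
    so dot = (matches) - (mismatches) = n - 2 * popcount(a XOR w).
    """
    mismatches = bin(a_bits ^ w_bits).count("1")
    return n - 2 * mismatches

print(binary_dot(0b1111, 0b1111, 4))  # 4  (identical vectors)
print(binary_dot(0b0000, 0b1111, 4))  # -4 (opposite vectors)
print(binary_dot(0b1011, 0b1101, 4))  # 0
```

One XOR and one popcount replace n multiplies and n adds, which is why binarized networks map so well onto tiny, low-power silicon.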


According to the linked article, this coprocessor seems particularly focused on deep neural networks (DNNs), so it does sound like an updatable-weight neural network evaluator.


Probably also enough high-speed memory to store the weights without needing to go to RAM.


In my experience, the notion of "changing the world" sadly has no age bar. "Change the world" is also often thrown about in arguably shallow contexts. But then look at how Facebook got started, and it is fair to say it has changed the world. My point? I agree with you, but I am not sure whether that is just cynicism, because arguably "shallow" companies have changed the world for real.


Facebook might have changed the world, but I don't think it's for the better.


Fragmented messaging/phone calls and the inability to have "seamless" conversations. For messages: WhatsApp, iMessage, Google Voice, SMS, etc. For phone calls: FaceTime Audio, carrier-routed calls, Google Voice, Vonage extensions, etc.

How I wish there were a central dashboard that intelligently sifted through the call/message history for each contact, based on context across apps, and just routed each message/call through the right primitive.


I know Google Hangouts tried doing some SMS integration, though I've read it was pulled on the Nexus 6.

I've tried looking for a better solution and couldn't find one. Out of curiosity - what is your day job?


Sorry I never visited back. Yes, Google tried it with Hangouts. Internationally, you can use Hangouts for wifi calling through Google Voice, but that did not work in the US for some reason, last I checked.

I design/write software; mostly distributed middleware in the internet industry.


I am clueless about the right etiquette here, but can anyone share some high-quality publications in a similar vein? I am mainly interested in "head content" that is not a book, but a set of relevant, contemporary articles about our trade.


I find a lot of stuff along these lines on http://news.ycombinator.com and http://lobste.rs; there's also /r/programming and /r/coding. I post (links to) a lot of stuff at https://twitter.com/kragen.

I used to read LWN regularly, but now I only read it when someone links to it.

Beyond link aggregator sites like these, there are dozens of individual people whose writings I read frequently, if not every time I come across anything new they've written, because it's reliably high quality: Landon Dyer, James Hague, Yosef Kreinin, Raganwald, Tim Bray (ongoing), Dave Long, Bill Gosper, Darius Bacon, Fabian Giesen (ryg), viznut, John Carmack, Bret Victor, of course Paul Graham, Bunnie Huang, Seth Schoen, Avery Pennarun (apenwarr), Herb Sutter, Stepanov, Alexandrescu, Oona Raisanen (windytan), Jon Blow, Linus Åkesson, Ian Lance Taylor, Oleg Kiselyov (although this is very difficult!), Ian Piumarta, Mark-Jason Dominus, Aaron Swartz (RIP), and Randall Munroe.

On the hardware side, Anandtech is pretty good.

Above and beyond that, for overviews of objective things with links to further reading, I've found Wikipedia to be pretty consistently good.


Some quasi-journals have been moving in a "magazine-style" direction with broader-interest articles than a traditional journal, in order to fill some of the trade-magazine gap. Some more successfully than others.

A few that I sometimes read:

ACM Queue: http://queue.acm.org/. Online-only, free. Focus is largely on software engineering and some related areas (sysadmin, devops, architecture, reliability engineering, etc.). Probably the closest to a Dr. Dobb's-style magazine of the ones I list here, with mostly industry authors.

Communications of the ACM: http://cacm.acm.org/. Most recent issue is free online; print subscription w/ digital back-issue access is $99/yr. Mix of academic and industry articles, leaning more academic, but written in a more accessible and concise style than a typical CS journal. The print version also includes some articles from online-only ACM publications (like Queue).

AI Magazine: http://www.aaai.org/Magazine/magazine.php. Issues >1 yr old are open-access online; new issues are subscription-only. Print+digital subscription is $145/yr. A bit like CACM, in being journal-ish but with an aim at more readable general-interest articles. Some insider-ish news/column type stuff as well (reports on conferences and workshops, etc.).

You may or may not be happy with any of those. The move towards a more magazine-style format is pretty new, and I think still being experimented with. I personally like glancing through and reading parts of CACM and AI Magazine, and less often Queue. I also know some people who like the general-audience IEEE magazines (IEEE Computer, IEEE Software, IEEE Micro, etc.), but I haven't read them enough to have an opinion.


I think FB does not have to be a PayPal, and there are many good reasons, already outlined in these comments, why that would be a bad idea.

However, it can become some kind of marketplace. It already knows user locality, it already has people flocking to it, and it already has big-brand stores sharing discounts and coupons with consumers.

All it needs to do is finish the last mile and allow those brands to actually have a "Facebook-only Sale", for example.

Also, it can become a great local marketplace and help local businesses sell to local customers.

TL;DR: not just ads, and not necessarily a payment gateway, but a niche marketplace could be it.

