CS and DS people are getting more applied and gaining domain expertise, and can do a lot of economics work now. Academic economists, especially those who primarily do data science / big data, seem to basically be doing Masters-level data science projects for their Ph.D.s. The hard part of those Ph.D.s used to be collecting the data, a very manual job that relied on connections, but more of them are now scraping or imputing their data from public sources, so it's not that impressive anymore.
Speaking as someone who has attended 3 economics Ph.D. defenses in the past two years.
Data science wasn't even a degree you could get 20 years ago. Twenty years ago if you were interested in what is now called data science, you were getting a degree with some kind of exposure to applied statistics. Economics is one of those disciplines (through econometrics).
No, I did stats as part of economics around then, and it's nothing like modern DS. It overlaps a fair bit, but in practice the classical stats student is bringing a knife to a gunfight.
The practice of working with huge datasets manipulated by computers is valuable enough that you need separate training in it.
I don't know what's in a modern stats degree though, I would assume they try to turn it into DS.
Data science is basically a marketing title given to what would have been a joint CS/statistics degree in the past. Maybe a double major, or maybe a major in one and an extensive minor in another. And it's mostly taught by people with a background in CS or statistics.
Like with most other academic fields, there is no clear separation between data science and neighboring fields. Its existence as a field tells more about the organization of undergraduate education in the average university than about the field itself.
The Finnish term for CS translates as "data processing science" or "information processing science". When I was an undergrad ~25 years ago, people in the statistics department were arguing that it would have been a more appropriate name for statistics, but CS took it first. The data science perspective was already mainstream back then, as far as the people in statistics were concerned. But statistics education was still mostly about introductory classes in classical statistics offered to people in other fields.
No. Data science is different from statistics, because it is done on computers. It also uses machine learning algorithms instead of statistical algorithms. These advances, and the shedding of generations of restrictive cruft - free data scientists to craft answers that their bosses want to hear - proving the superiority of data science over statistics.
yeah, we called that data mining, decision systems, and whatnot... MapReduce was as fresh and hot as Paul Graham's book of essays... folks were using Java over Python, due to some open source library from around the globe...
essentially, provided you were in the right place at the right time, you could get a BSc in it
His work and department were very quant heavy. I'd say the majority of his students spent most of their time in Python/R, cleaning datasets and running models.
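For readers outside the field, that clean-then-model loop usually looks something like the sketch below. This is a minimal, hypothetical illustration (the data, column names, and wage/education regression are invented for the example, not taken from any actual dissertation); in real projects the cleaning half dwarfs the modeling half.

```python
import numpy as np
import pandas as pd

# Hypothetical "raw" survey extract: duplicated respondents, wages stored as
# strings, and a missing value -- typical of data scraped from public sources.
raw = pd.DataFrame({
    "id":   [1, 2, 3, 2, 4],
    "wage": ["52000", "61000", None, "61000", "48000"],
    "educ": [12, 16, 14, 16, 12],
})

# Cleaning: drop duplicate respondents, coerce wages to numbers,
# and discard rows that remain unusable.
df = (raw.drop_duplicates(subset="id")
         .assign(wage=lambda d: pd.to_numeric(d["wage"], errors="coerce"))
         .dropna(subset=["wage"]))

# Modeling: a plain least-squares fit of wage on years of education.
slope, intercept = np.polyfit(df["educ"], df["wage"], 1)
print(f"estimated return per year of education: {slope:.0f}")
```

In practice the "running models" step would use an econometrics library rather than a one-line polynomial fit, but the shape of the workflow is the same.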
I'm not disagreeing with you, and I also know political economists from that time who complain that their discipline is changing. It just has very little to do with what this article is discussing.
Or cheating in other ways, like collusion rings [1], reviewers using LLMs to review, or authors putting text in their papers so that reviewers using LLMs will give them a favorable review [2].
3% is actually not bad for what it gets you: fraud detection, not having to lug cash to the bank every day, reduction of theft and employee mistakes, better recordkeeping for taxes, a way to handle refunds, and cleanliness. Small merchants often use Venmo where I am, and they save 3% in exchange for that awkward dance of sending payments.
But why do we need a third-party middleman for what used to be a private transaction between two individuals?
If the government is going to replace cash with technology, why can't it do it itself and not involve some transnational behemoth like Visa or Mastercard?
I'm generally wary of suggestions to overhaul a working long-lasting sociotechnical system. Everything that has been around for decades is patches on patches.
The world wide web, IPv4, IRC, the US constitution, the telephone system, and email were all developed in older times, and as times change, they develop flaws that need (ugly) patches. That's a sign that they're successful: no one company has come along, replaced them, and then killed them when things got inconvenient.
Think about the opposite situations: Google Groups replaced Usenet, and now it's completely dead; Google Reader took a lot of market share from RSS and almost killed it when they shut it down.
So let's not hastily destroy the only online communication system we have left, that's not controlled by a mega tech company.
I agree normally, but in this case email is just not fit for its original purpose anymore. It's losing relevance in today's world: we don't trust it anymore, and its key aspects, like decentralisation, are being undermined.
Personally I think Google Groups was not a replacement as such, more an extension of Usenet. Usenet also had some very serious problems that made it hard to maintain, and more specialised discussion fora took over. I don't think Google Groups killed it; that was just one of the many ways to interface with Usenet.
I'm absolutely not advocating replacing it with something a mega tech company has their hands in though! Any replacement should be decentralised. I would love to see an email replacement tech along the lines of what Matrix is doing for the chat world.
It's an obligatory warning; otherwise people (even here on HN) will decry your methods as mainly due to privilege, and irreproducible for the average person.
Agreed, Apple is far worse on being monopolistic and draconian, but people give them a pass because they have cool design. Everything they're doing with iMessage, the iOS and Mac App Stores, AirPods, Safari, etc. is embrace and extend, without the trepidation that Microsoft has had.
I think it is because using Apple is not mandatory. Apple is an aspirational brand; you use their products because you want to. Microsoft's products are not: they pushed their products hard everywhere and made them difficult to avoid if you wanted something else. That alone caused a lot of friction, and when they then start to steer you somewhere you don't want to go, their products are not exactly going to be your favorites.
Because each person still uses about 10 full wardrobes in their lifetime, 100,000 prepared meals, a few dozen smartphones and other electronics, and a mountain of disposables like diapers and napkins and takeout containers and masks. People need a lot of resources for even a modest standard of living.
Even more awkward is when you're having a conversation with someone who is trying even harder to get you to talk about yourself, while you're trying to do some of the same. Then it's a competition to see who can say the least before asking "and what about you?"
That seems like a dangerous line of thought. Imagine a software engineer (maybe one who writes code for airplanes or medical equipment) writes some sloppy code late one night and puts it on their open source github, and the commit log is "this code rocks."
You could make the same argument: as a software engineer, you have to be able to identify and write high-quality software to be effective at your job. So should your employer fire you for writing this code outside of your regular job duties?
What about if a friend shows you their code, and you say "that looks good" but it turns out that this code is vulnerable to a key security issue? Should you get fired from your security job?
I think we'd agree not, and more broadly, that making a statement or doing a task related to your job but outside of your job's responsibilities should not be sufficient to fire you from your job. Especially when it's clear you are not representing your employer, by using a pseudonym.
Well, if you're in a position of having to say "Yes, I used my work email to create a Disqus account, and most of those comments are mine, but not the one about Muslims hating America and being deported..."
A professor naturally speaks with the authority of their institution, and reputation is far more easily maimed than repaired. The decision for Temple University to fire her is pretty obvious.
I think a better example would be the case where university professors/students committed known-vulnerable code to the Linux kernel and the kernel maintainers banned the whole university. If you're doing something ethically questionable and potentially harming others (spreading hate and conspiracy theories can do real harm), it could be grounds for not believing you are capable of doing your job.
I don't think it's a freedom of speech issue at all; it's an "I want freedom of speech without social consequences" issue. Sorry. If you're spreading around hate and lies - as a journalism professor no less - there should be consequences. You don't deserve the responsibility of teaching others.
I don't see how that's a better example, and you missed the point. Your example is about someone doing something unethical as part of their job. My example was about someone doing something unrelated to their job duties. That's the entire point I'm making, that things done outside of job duties do not reflect the competency of the job.
If you read the grandparent post, it's about making a judgment about whether one is effective at their job.