HeyZuess's comments

> A company giving money to the parents of the owner is fraud.

None of those points are actually fraud on their own, except potentially one. The company's use of customers' funds would likely fit the definition of fraud, as that is unlawful gain.

A company can legally give out loans. Companies give out signing bonuses, salaries, gifts, etc.; the problem comes down to whether any of it fits the definition of fraud.


Yes and no. I would like my children to grow up to be a better version of me: one where my mistakes are a foundation for their education, where the good I do, they do better, and so on.

I made up my mind a long time ago that my children can be anything they want, as long as it is good, as long as what they want does not harm others.

I want to introduce my kids to programming, but I also want to see them experience art, sport, music, and everything else the world has to offer. I don't care if they become programmers; I care that they get a bit of experience with it.

I would rather they grow up to be better than me.


I work in a couple of small teams for a few companies. I have a full-time gig and a few part-time ones. All the teams are remote, many are spread across different timezones, and sometimes we even differ in language. In some cases the entire company works remote, from HR to sales to dev.

I don't get the communication issues, nor do I get the project planning issues. Some of the companies are established while others are new, some are one product, others are many.

We have regular catchups, some of the teams I work with even meet in person occasionally.

I think, though, that the two bits of the post which stand out to me are the VC reason, and having a small team and not being able to align it.

I am going to make the judgement that small means fewer than 10. It is hard to manage teams, but it's not that difficult. Teams are not hive minds, and if you cannot align a small team of people who probably spend the majority of their time working individually, then there is something else missing.


The money collected from exclusion fees is quite large, hundreds of millions in some cases in a year. The very fact that the collections are so big suggests that exclusion fees are not a deterrent and do not stop or reduce gambling by locals.


The problem is that it is not just Russia. Other countries are starting to learn that they could be sanctioned or cut off. Not only that, the assets of private citizens can be seized easily. Wars happen, conflicts happen; after all, something similar could have been said about the Middle East.


Buffett and Munger are value investors; there are multiple strategies in both investing and trading. Jim Simons might be a more relevant example here in regard to trading, if you are comparing the two.


They are value investors… who definitely know the technicalities of the market, and have at times taken seriously sized option positions. They say to just buy index funds because otherwise it's a full-time job with all the research and accounting.


I don't know how I feel about this article, especially around the definition of understanding.

I have two children under 6, and if you watch their learning patterns they follow very much the same path, and mimicry is part of that. They learn the subject matter at hand through repetition, and that then builds comprehension, which is the basis for understanding.

I think ChatGPT has learnt and started to understand language based on the data given to it; it has built some understanding of how language is structured, as it is able to regurgitate language.

It has contextual understanding at some level, and contextual awareness. One of the next logical questions would be: does it have intelligence? That I don't think is the case. Its understanding is grounded only in its understanding of the language.

I think the bike example is a great one: my son is learning to ride a bike. He has no understanding of balance, as the article states, but he has awareness of it. He knows he needs to pedal to move forward but has no understanding of inertia; he has to steer but has no understanding of physics. Yet he understands that all of these components are required, and he knows that via data, "the experience of learning". I think it would be difficult, without prompting, to get him to explain his "understanding".

Now, understanding is great: you can take learned information and apply it to another subject. So we can use what we have learned from riding a bike and apply it to a motorbike. I am unsure if this cross-domain transfer of understanding is something which ChatGPT can do, but it does a great job of extrapolating data and applying it to learnt rules.


I think this is relevant to many subjects. I know nothing about cars, but I can probably pick up a book or watch a few videos and gain a reasonable basic knowledge; there is no way, though, that I am going to go out and tune up a car. Learning many of the core principles is quite easy, but gaining breadth and depth of knowledge is the difficult part.

A lot of this only comes from the application of skills. I have been programming for a very long time, and the amount of information I know I don't know is huge, let alone the information I don't know that I don't know.

But then again in my day job I probably do not need to know a lot more than I know, and within that context programming is easy.


> I’ve been using copilot for half a year now and it’s helpful, but often wrong.

I wonder if that is because of the training set; we humans are often wrong, or at least do things differently. If you gave a room of programmers the task of implementing a Fibonacci algorithm, would they all get it right, and would they all do it via iteration, recursion, or dynamic programming? Copilot might not replace you, but it just needs to replace some of those programmers. Then add tools like automated AI reviews or integration tests, for example, and now you have removed another population of tech workers.
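
As a rough illustration of that point, here is a minimal Python sketch of the three approaches named above (the function names are hypothetical, purely for illustration); all three are correct, they just reflect different habits:

    # Three equally valid ways to compute the nth Fibonacci number.

    def fib_recursive(n):
        # Straightforward but exponential-time recursion.
        return n if n < 2 else fib_recursive(n - 1) + fib_recursive(n - 2)

    def fib_iterative(n):
        # Linear-time loop keeping only the last two values.
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    def fib_dynamic(n):
        # Bottom-up table (dynamic programming), also linear time.
        table = [0, 1] + [0] * max(0, n - 1)
        for i in range(2, n + 1):
            table[i] = table[i - 1] + table[i - 2]
        return table[n]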

I am not sure if that is cause for alarm, or whether such improvements could be rather beneficial. Some tools will replace people, some will be assistive, and as they improve and other layers are added they will reduce the need for people in some areas, improving efficiency and productivity. Robots in manufacturing, for example, improve productivity and reduce the need for human labor.

And this leads on to the next point: these tools have gone from narrow to general, and have done so in a very short period of time. The cost factors have also massively reduced. If you could pay OpenAI $10 a month to make 1000 mistakes and still deliver code, or a human $120k a year to do the same, which one would you target? AI might not be coming for your job soon, but it will be taking away your options to get a job. This is not unique to AI; it is the basis of any technological improvement versus labor. Yes, new ideas and opportunities may come out of this, but I don't think they will be equal in volume.


I would say, however, that if you are managing to maintain a certain level of lifestyle, you are probably gaining from new wealth, even though it is not extra wealth which you are accumulating in a bank. You are generating new money, and that is new wealth. Every year that 80K is new wealth; I am pretty sure that is how it is defined in this case.

A person in Sudan is averaging around $460 US a year, while in the US it is more like $74K, and thus Americans are accumulating wealth at a far higher rate.

Also, the comparative lifestyle in regard to goods and services is very different. While it might be more expensive to live in the US, the lifestyle is dramatically different. Wealth doesn't always mean accumulation.

