"I'm hoping that games can get out of this whole mobile gaming dopamine pump thing [...] and create worlds."
George Hotz (born October 2, 1989), alias geohot, is an American security hacker, entrepreneur, and software engineer. Since September 2015, he has been working on his vehicle automation machine learning company comma.ai. Since November 2022, Hotz has been working on tinygrad, a deep learning framework.
Utilitarianism is an abhorrent ideology. [...] I think charity is bad. What is charity but an investment that you don't expect to have a return on? [...] Probably almost always [making the world better] involves starting a company. [...] I like the flip side of effective altruism: effective accelerationism. I think accelerationism is the only thing that's ever lifted people out of poverty. Food is cheap because of that, not because we're giving food away as kindhearted people. [...] [Universal basic income], what a scary idea. [...] Your only source of power is granted to you by the goodwill of the government. What a scary idea. I'd rather die than need UBI to survive, and I mean it. [...] You can make survival guaranteed without UBI. What you have to do is make housing and food dirt cheap.
[About AI:] You could make an argument that nobody should have these things, and I would defend that argument, [...] and I would respect someone philosophically with that position, just like I would respect someone philosophically with the position that nobody should have guns. But I will not respect philosophically "Only the trusted authorities should have access to this." Who are the trusted authorities? You know what? I'm not worried about alignment between an AI company and their machines; I'm worried about alignment between me and the AI company.