
You're making it seem like AI will replace coders. Do you think Dall-E will replace artists, or just make an artist's job different?

IMO it's more like a coder using AI to make their job easier. It's still up to a human to come up with the individual problem the function solves, architect a solution from multiple functions/objects/etc, come up with a data model, and so on and so on. The AI just generates the code itself. And at least as of now, the code needs to be double-checked.



> You're making it seem like AI will replace coders

I find it best to accept that it will most likely replace all of us, from doctors to coders to even psychotherapists. It won't happen next year, but 10-20 years is a very long time, and this thing keeps getting better. Eventually we won't be able to tell if it's a machine or a brilliant superhuman. The bummer in all of this, in my view, isn't the loss of jobs; we'll find what to do. It's the transition period - the accounting wizard or the brilliant doctor losing their jobs and status and becoming kindergarten teachers or caretakers or unemployed. Nothing in their upbringing or life experience prepared them for such a thing ... so that's probably gonna be rough for many people. But once most people went through the transition it won't be bad. Society I believe will be better off. We will stop being obsessed with money and status and spend much more time with family and friends. Entertainment will be insanely good and so will healthcare. Possibly medicine to make our moods better. It could be utopia.


> We will stop being obsessed with money and status and spend much more time with family and friends. Entertainment will be insanely good and so will healthcare.

As nice as this would be, I think there's roughly 0% chance of it happening. Over the past few millennia humans have doubled productivity per capita a ridiculous number of times. None of those leaps led to an end of status seeking or a transition towards mostly leisure time for the masses.

Instead I expect more of the opposite from these developments. Power will get increasingly concentrated with people who have very little interest in the needs and wants of the plebs.


> Power will get increasingly concentrated with people who have very little interest in the needs and wants of the plebs.

At least in OpenAI's case I think they take this thing very seriously (Sam Altman doesn't strike me as evil one bit, quite the opposite in fact: https://www.youtube.com/watch?v=DEvbDq6BOVM). In fact most of the tech elites don't seem evil to me; if they only cared about money, Zuckerberg and Gates wouldn't have pledged away all their wealth. Some of them are as you describe, but I think most of them are actually somewhere on the progressive axis.


Your argument is that the philanthropy of the ruling class will fulfill the required redistribution of wealth?

So far it hasn't worked out too well...


Not quite. The argument is that our tech elites aren't so evil as to try to violently take over, and that we masses can simply vote in a more (much more) socialist system.

It's up to us. Indeed it's easier said than done in the current dysfunctional and polarized politics of ours, but we can still do it.


We have pretty different views on tech elites. I don't think they are evil per se, but most are greedy and generally lack empathy. It's almost impossible to go from mildly wealthy to multi-billionaire status without some aggressive wealth accumulation.

For the philanthropy, I'll just note that these pledges don't involve literally transferring 99% of wealth out of their control. The vast majority of the pledged money goes to a trust the person controls, organizations they have some relation to, or just stays completely in their control for years with only vague non-binding commitments to eventually donate it. In return for this largesse they get significant reputational benefits and tax advantages.


Currently, the economics don't stack up. How does society handle it when the best jobs are automated? When all jobs are automated?

The Star Trek post-scarcity utopia scenario feels very unlikely; Mad Max-style scrapping for leftovers while Musk, Bezos, et al. live behind walls feels infinitely more probable.

How do you implement UBI when a huge proportion of the political class is vehemently against it? Maybe we need AI politicians, so they start to figure it out?


I don't have anything figured out, but I think we can do it as a society. I do think we have a very negatively biased view of the ultra rich. Zuckerberg, Gates, and Buffett all give, or will give, all of their wealth to society. So not all of them are the same. I think Musk will probably also give most of it back eventually, though I'm not sure he has said anything yet.

And remember, we are still a democracy. We get to vote. We control the army and the police and all institutions. If we decide that this capitalism isn't hot sh* anymore, we can change it. What will the evil billionaires do? (This sounds like a good straight-to-DVD movie actually... hey GPT, write me a script about this.)


> But once most people went through the transition it won't be bad. Society I believe will be better off. We will stop being obsessed with money and status and spend much more time with family and friends. Entertainment will be insanely good and so will healthcare. Possibly medicine to make our moods better. It could be utopia.

First, I don't see "utopia" as likely; furthermore, I have a suspicion it may be impossible, given human nature.

Second, even the argument that society will be "better" demands much more reflection. The implied argument above is only a sketch. I don't find it convincing, much less plausible. I'll call attention to four points (implied from above):

1. AI will replace humans in most or all professions

2. AI quality will be much higher than the previous human levels

3. A broad swath of people (using some notion of equity and fairness) will have enough money to live happily

4. "We'll spend much more time with family and friends"

Each of the four points is quite uncertain. Furthermore, even if point `k` is true, point `k+1` does not follow.

Who would like to flesh out some ways the sequence (1, 2, 3, 4) might happen?


Yes, of course you are right; we don't know anything yet, I agree. I am speculating a lot here. But I'm a believer in "intelligence as a commodity," as Sam Altman put it after seeing GPT-3.5, so I think points 1 and 2 will become reality. 3 follows quite naturally to me, but only in Western societies... Putin will have different ideas.

Anyway, speculating is fun, but you're right, it's just speculating. My main point is we should always keep in mind this could turn out to be great.


I'm not sure 3 follows. A walk downtown in any major city in America shows what society does for anyone whose work can't be commoditized and who isn't sitting on an existing pile of wealth such that they don't need to work.

They get nothing but tents and shame.


That's because people like you and me (I'm assuming you're somewhere in the comfortable middle class) vote for this system. Because so far we've enjoyed a reasonable quality of life. If that's no longer the case, we can vote for other leaders and other systems.


Not really, unfortunately; the US hasn't been a democracy reflecting the collective will of its voters for some time now.

https://www.bbc.com/news/blogs-echochambers-27074746


But people under stress might opt for some sort of religious acceptance of hardship, or a cult, or fascism first.

Where are the examples of the middle voting against the extremes when times get really hard?


Fair enough :)

_And_ since we care about our AI-interdependent future, more of us (as in the people here on HN) need to wade into the gory details, including ethics and the current power structures. The "technology" (as in algorithms, data structures, hardware, etc.) is arguably the "easy" part. There are plenty of existing incentives and structures to keep those _moving_. But moving in what direction? Even the notion of an "ethical compass" seems antiquated in light of current technology. We may have to reframe everything. This is a big challenge.


> It's the transition period - the accounting wizard or the brilliant doctor losing their jobs and status and becoming kindergarten teachers or caretakers or unemployed.

Yes, this is a big problem.

However... a lot of people would enjoy being teachers, albeit with significant improvements to the educational systems.


We can enjoy it if there are more of us. Each class can have 5 teachers instead of one teacher for 30 kids. Government can create those jobs, take the hundreds of trillions created and redistribute it. Marx was right I think capitalism eventually kills itself ...we won't need it anymore. Arguably we already don't need the aggressive version we have now, but soon enough it will be clear we don't need any version of it. The means of production (AI, robots and land) will be transferred to the people who will all receive basic income, free services and (if they want) jobs created by the government for the greater good. I don't think it's a dystopia.


> The means of production (AI, robots and land) will be transferred to the people who will all receive basic income, free services and (if they want) jobs created by the government for the greater good.

This is a prediction?

Given human nature and the diversity of people (w.r.t. rationality, religiosity, morality, capability, and so on), it is very much an open question about (a) how AI capabilities will develop; (b) how they will be paid for... (c) and by whom; (d) to whom will benefits accrue; (e) how will society change.

These are broad, sweeping questions. Plenty of fodder for imagination, hope, transformation, cynicism, backsliding, or even despair.

If I were to make a bet, on our current trajectory, I see some key factors in tension:

1. educational quality, in absolute terms, increasing _and_ being more equitable

2. educational quality, in relative terms, continuing to be very unequal and probably getting more so. As one example, who has the resources to direct computationally intensive AI experiments? There are (and probably will be for a long time) gatekeepers for these resources. People that mix in these circles have a huge advantage. This makes me wonder if "exclusivity leads to inequality" is a saying from some philosopher.


> Marx was right I think capitalism eventually kills itself ...

I doubt very much that this is a testable theory. I think it is primarily a normative one.


As a resident of New Orleans, I assure you the healthcare is not insanely good.


>Do you think Dall-E will replace artists, or just make an artist's job different?

If an ad agency or a magazine publisher can get a custom illustration that works for their purposes from Dall-E, then they ain't paying no artist. Not theoretical: many already do use those generated images.

That's not just "making the artist's job easier". It's taking jobs from artists (well, illustrators and graphic designers at least), especially at the cheaper end of the business (e.g. not Nike, but your local pet store chain, restaurant, or news outlet, sure).


People will tire of DALL-E eventually. We are great pattern matchers, and we'll start to see the patterns that don't measure up. Then it'll be all about the next AI engine, and it'll need its own corpus of work. Who's going to create it?

So yes, not OP, but I think artists will still have jobs, although fewer of them, and the job will be different.


There is no indication that we don't already have enough images available; training time is the main bottleneck. Also, OpenAI clearly showed that DALL-E could generate an avocado chair even though there is nothing like that in the dataset.


There's still a matter of taste that Dall-E can't provide. It will only spit out what you tell it to, and if you lack good taste, then you'll still produce something inferior.


It must be relatively easier, though, to be a discerning consumer than to have the ability to produce something yourself (without using AI?).


No, but it is very likely that it reduces the demand for coders and artists.

Sure, there will always be demand for a Linus Torvalds or a Damien Hirst. But will there be demand for Coder #365968 at Infosys or a graphic designer pumping out $50 ad banners?

We’re looking at the possibility of some white collar jobs having the same income disparity as creative jobs. Just as there are some musicians who make hundreds of millions while the vast majority barely make ends meet, we may have a future where the star programmers make millions while the average players are automated out of the competition.


> No, but it is very likely that it reduces the demand for coders and artists

Eventually, I agree, yes. But there could be a boom of huge new investments into AI products, more devs needed, and teams getting way more requirements since they are more productive. Imagine the stuff we will be able to build in things like search, personal assistants, biomed; in fact, what industry won't this affect? It's unbelievable to me that people are now saying Google search might become obsolete; that's absolutely crazy. Not many people saw that one coming. But at least initially I don't think GPT models will be able to do everything themselves. So it's very hard to determine that, say, in the coming 5 years devs will find it more difficult to get a job. 10-20 years from now, sure, I don't see how anyone gets a cognitive job anymore, let alone devs. In fact our entire school/university system is probably obsolete; kids are probably learning skills they won't be able to apply in any job market. We need to start thinking about stuff like teaching kids emotional intelligence, spirituality and meditation... not cramming for a math test.


DALL-E, Midjourney, and Stable Diffusion are already taking jobs. Illustrations, album covers, blog post images. People are making beautiful books and playing cards.

Midjourney V4 is amazing. It spits out absolutely beautiful images.


To be precise, AI is "taking jobs" that could have already been commoditized long ago if people in developing countries understood how Fiverr worked and had set up "art sweatshops" to serve demand.

There's enough art talent already around in the world to entirely commoditize the supply of it for the little one-off no-style-guide-to-follow commissioned works you're talking about. It's just not currently a liquid market — supply and demand find it hard to discover one-another — and so a true market-clearing price can't be set.

Meanwhile, AI is not currently taking anyone's advertising-campaign graphic design job, or anything else where the "efficient-market price" (in a world where human "art sweatshops" existed) would be more than $5.


How many of those people would have actually paid for an artist otherwise, though? I myself am thinking of playing around with game dev for fun, with the thought of using image AI to generate the art. Were it not for that, I'd just use free textures, or more likely, just spend my time doing something else entirely.


It will replace coders with thinkers. The number of people that have ideas, good or bad, is large compared to the number that can implement those ideas. As these bots/AIs get better, they will produce a lot of code. At the start it will probably increase the amount of code produced and the number of coders, but with time, as the AI gets better, the need for coders will decrease. We will need people that can think or imagine ideas rather than coders; these people might still be considered software developers, but they will not be coders in the strictest sense of the word.


Eventually. 20 years, 50, maybe 100. But eventually.


... people are still freaking out that machines will replace someone, since what, industrialization 200+ years ago? People still work, just maybe at different jobs or more intellectual jobs.


Except this time it's different https://youtu.be/WSKi8HfcxEk



