Reading Response: AI, Identity, and the Value of Human Creation
Arena
Arena was really interesting! I hadn't heard of it before, but I think it's a really cool idea. It was an easy way to see tangible differences between two models: there was often a large gap in response quality, and most of the time it was the open-source models that came out ahead, which surprised me a bit.
Some Thoughts on the Articles
"My purpose here isn't to scold anybody per se. Well, almost. To those who have chosen to use fear and intimidation to help sell the agenda of the big tech CEOs who, in turn, have somehow managed to use coal-fired GPUs to capture society's output and sell it back to us, while converting a significant portion of the economy into an expanding envelope of hot gas: I not only scold you, I shun you. I have turned my back to you. That goes double if I once admired and respected you."
MMMMMM- Now THAT is a quote. It draws an interesting line between technology enthusiasts and people who are enthusiastic about making technology, and there is quite a difference. It's almost like a signal in bro-code, a virtue signal for performative red-pill culture or something. I would hazard a guess that Dave Gauer is a little pessimistic about his people because he is grieving a change he doesn't want. And I have been there; I understand the point of view. It's weird letting that kind of change go through you. I had a mental breakdown with less than five years of experience, so I can't imagine what someone with decades of it must feel.
Bob Nystrom has some really awesome formatting on his blog that I want to steal, if I am honest. His article was equally awesome. As I may have mentioned, one big AI-related mental health break ago, I was also struggling with this concept of the utility and meaning of my work. While I think it is a bit philosophically dangerous to equate a thing to a level of "value" or "utility" (utilitarianism gets really dangerous when you are trying to make ethically sound decisions), I do like that Bob assigns a meaning/emotional value to things as well. However, I would still not ascribe meaning like that, because I think it is a little pointless? My therapist often tells me to stop overthinking (what good computer scientist doesn't like to get into the weeds?). She asks me what I will do if I get to the bottom of a rabbit hole. Will finding an answer (if one even exists) solve my anxiety about the situation? The answer is most often no. Which sucks to hear. BUT! That "no" is also the good news: although letting go of the need for control and knowledge about how everything works and interacts is incredibly hard, there is value in protecting your mental sanity.

I think past me from four years ago would be extremely happy with where I am. I was always a bit unconvinced that I wanted to be a software engineer or CS student, but I have really found a passion here. I have remained open to the possibilities of the world and never really set a grand plan, and opportunities have come to me, to be honest. So in that way I have no clue what future Will's life will look like. In terms of AI, I don't really think it has changed much for me. I think it is really over-talked-about. Something something productivity, yada yada. But nothing that I have done is really different? Especially because I like to be creative and plan before making. I don't really like to just throw shit at the wall and iterate; I would rather things work in one or two tries.
And to that point, I really like that AI can remove the barrier of syntax for me. Language syntax is maybe the weakest of my skills, but creativity and understanding (which I AM good at) are not things generative AI can grasp super easily.