
justaperson

43 karma · Joined

Bio

• Los Angeles-based writer, marketer, and content creator.

• Alumnus of the 2014 Warner Bros. Television Writers' Workshop.

• Former business owner, inventor, and creative director.

• Broad longtermist, but not certain about anything...

Comments (23)

I missed this somehow...sorry for the late reply.

My story may be somewhat irrelevant now, but I did a series of intro courses in different languages (HTML, CSS, JavaScript, and Python) to test my fit and better understand what kind of things I might build. During these few months, broad tech layoffs were increasing, while LLM/coding assistants were gaining traction. Recent grads from coding bootcamps (the route I was likely to take) were also having more difficulty finding work and the bootcamps themselves seemed ill-prepared for the LLM transition.

My read on the changing landscape was that even if the job market improved for junior coders a year or so out, I might not be much more valuable to a company than an LLM (and if I was, it was likely to be temporary). Without a lot of confidence that this would be a secure career path, I opted not to pursue programming.

How about you?

Was this recorded in any way? Would love to see/hear the talk if possible.

Thanks for your explanations!

Re: Questions

Apologies…I meant the questions your team decides upon during your research and interview processes (not the initial prompt/project question). As a generalist, do you ever work with domain experts to help frame the questions (not just get answers)?

Re: Audit tools

I realize that "tools" might have sounded like software or something, but I'm thinking more of frameworks that can help to weed out potential biases in data sets (e.g., algorithmic bias, clustering illusion), studies (e.g., publication bias, parachute science), and individuals (e.g., cognitive biases, appeal to authority). I'm not suggesting you encounter these specific biases in your research, but I imagine there are known (and unknown) biases you have to check for and assess.

Re: Possible approach for less bias

Again, I'm not a professional researcher, so I don't want to assume I have anything novel to add here. That said, when I read about research and/or macro analysis, I see a lot of emphasis on things like selection and study design — but not as much on the curation or review teams, i.e., who decides?

My intuition tells me that — along with study designs — curation and review are particularly important for weeding out bias. (The merry-go-round water pump story in Doing Good Better comes to mind.) You mentioned sometimes interviewing differing or opposing views, but I imagine those interviews happen inside the research itself and are usually with other academics or recognized domain experts (please correct me if I'm wrong).

So, in the case of, say, a project by an org from the Global North that would lead to action/policy/capital allocation in/for the Global South, it would seem that local experts should also have a "seat at the table" — not just in providing data, but in curating/reviewing/concluding as well.

With this post almost a year old now, I'm curious whether any of the commenters who were interested in switching to EA-related work have pursued this route. If so:

  • Have you been hired?
  • What was the job seeking process like for you?
  • Any recommendations to other mid-career professionals looking to pursue this path?

Thanks for sharing. I'm not a professional researcher, but I spend a fair bit of time researching personal projects, areas of interest, etc., and enjoy learning about different exploration frameworks and processes. As a generalist myself, it can sometimes be difficult to know if you're adding signal or noise to a picture you've yet to fully envisage -- particularly where a high level of outside domain or technical knowledge is necessary.

In my experience, beneficial answers are often the result of pinging the right sources with the right queries. This alone can be a difficult chain to establish, but there's a deeper layer that strikes me as paradoxical: in most cases, the person/team/org seeking knowledge is also the arbiter of information. So...

  • How do you determine if you're asking the right questions?
  • What is your process for judging information quality?
  • Do you employ any audits or tools to identify/correct biases (e.g. what studies you select, whom you decide to interview, etc.)? 

Thanks for the referral. Interesting post -- even if much of the technical-speak is lost on me. What I gathered is that nobody really knows if/when software engineering will become an unskilled job (no surprise), but a) many are confident that it won't be anytime soon (at least for the discipline as a whole), and b) junior developers are the ones that LLMs are likely to replace (est. 1-3 yrs.).

While much of the thread's early sentiment echoes the replies here, there's a divergence concerning newer engineers as the conversation continues. It's these bearish predictions that worry me. I don't need to make six figures, but I can't invest time (6-12 mo.) and money (courses, bootcamp, etc.) in a career path where newbie "escape velocity" is unlikely. More to think about...

No, I've already made the decision to leave copywriting (unless an opportunity to have an incredible impact comes my way).

Software engineering and data science were the two paths I was considering, but engineering won out because 1) it's an end-to-end (idea to product) creation tool, and 2) it doesn't require me to first become proficient in probability/statistics. The latter is something I eventually hope to do, but financially, I can't afford to ramp up in math, then data science, then find a job. And while data science roles are estimated to grow at a faster rate than jobs in software engineering, there are far fewer overall spots available in data science. Being at the midpoint of my career, my ability to make a meaningful contribution somewhere as a software developer seems more likely than as a data scientist. Lastly, I'd assume data science would be the type of skill that AI will replace before software engineering (but that's a huge guess).

Thanks for that perspective. Given that I don't have experience in the programming space, I couldn't project a timeline between fully automated software production and AGI -- but your estimate puts something on the map for me. It is disconcerting, though, as there are many different assumptions and perspectives about AGI, and a lot of uncertainty. But I also understand that certainty isn't something I should expect on any topic -- let alone this one. Moreover, career inaction isn't an option I can afford, so I'll likely be barreling down the software dev path very soon.

I'd say marketing is business-critical, and the difference between phone-it-in, good, great, and stellar content is important to bottom lines (depending on industry/product/service). That said, if the general point is that grammar issues on a site will have a lesser negative effect than buggy code that crashes that site, I agree. I'd also agree that unless you're a marketing or content agency, marketing and content may be part of your business but they're not the core of it. In contrast, almost every business in every industry runs on software today...

Still, I don't know how long things like scale, complexity, and strategy will be meaningful hurdles for LLMs and other AI technology (nobody does), but it feels like we're accelerating toward an end point. Regardless, software engineering seems like a good aptitude to add to the toolbox, and it's good to hear that I may not be too late to the game.

When it comes to refining AI-generated code, do you imagine this being done within organizations by the same number of programmers, or could LLMs be managed by fewer senior (or even more junior) engineers? This question is inspired by my observations in marketing, where the stock of full-time writers appears to be going down. I totally get that LLMs can't create their own prompts, debug every line of code, or approve products, but do you think they'll start allowing orgs to complete product development cycles with fewer engineers?

Great point that coding isn't an end in itself. In addition to seeming fun/interesting, I'm looking to learn this skill for greater domain range, technical building ability, and professional autonomy. Knowing how to code could eventually help me launch a startup or support an EA-related org. And yeah, earning to give while I ramp makes this path even more attractive. Many great points, and thanks for the encouragement!
