jayd16 3 days ago

You mean like it turned on Hibernate, or it wrote some custom-rolled in-app cache layer?

I usually find these kinds of caching solutions to be extremely complicated (well, the cache invalidation part) and I'm a bit curious what approach it took.

You mention it only updated a single file, so I guess it's not making any changes to the session handling; either sticky sessions aren't assumed or something else is going on. So how do you invalidate the app-level cache for a user across all machine instances? I have a lot of trauma from the old web days of people figuring this out, so I'm really curious to hear how this AI one-shot it in a single file.

eatsyourtacos 3 days ago

This is C#, so it basically just automatically detected that I had 4 object types I was working with that were being written to the database, and that I wanted to keep in a concurrent dictionary type of cache. So it created the dictionaries for each object type with the appropriate keys, and created functions for each type so that if I touch an object, that one gets marked as updated, etc.

It created the function to load in the data, then the finalize step, where it writes whatever was touched to the DB and clears the cache.
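Just to make the shape concrete, here is a minimal sketch of that pattern for a single object type (the original had four). Player, IDataStore, and every name below are invented for illustration; this is the general pattern being described, not the actual generated code.

    // Hypothetical sketch only -- Player, IDataStore, and all names are
    // invented; this shows the pattern described above, not the real code.
    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Linq;

    public record Player(int Id, string Name, int Score);

    // Assumed persistence interface: one bulk load, one batch write.
    public interface IDataStore
    {
        IEnumerable<Player> LoadAllPlayers();
        void SavePlayers(IEnumerable<Player> players);
    }

    public class PlayerCache
    {
        // One ConcurrentDictionary per object type, keyed appropriately.
        private readonly ConcurrentDictionary<int, Player> _players = new();
        // Keys of objects that were "touched" and need writing back.
        private readonly ConcurrentDictionary<int, byte> _dirty = new();
        private readonly IDataStore _store;

        public PlayerCache(IDataStore store) => _store = store;

        // Load everything up front, even though only changes get written back.
        public void LoadAll()
        {
            foreach (var p in _store.LoadAllPlayers())
                _players[p.Id] = p;
        }

        public Player? Get(int id) =>
            _players.TryGetValue(id, out var p) ? p : null;

        // Touching an object replaces the cached copy and marks it dirty.
        public void Touch(Player updated)
        {
            _players[updated.Id] = updated;
            _dirty[updated.Id] = 1;
        }

        // Finalize: write only the touched objects to the DB, then clear the cache.
        public void FinalizeToDb()
        {
            var changed = _dirty.Keys
                .Where(_players.ContainsKey)
                .Select(id => _players[id])
                .ToList();
            _store.SavePlayers(changed);
            _players.Clear();
            _dirty.Clear();
        }
    }

Note that the dirty-key set only does the invalidation bookkeeping within a single process; as jayd16 points out, nothing in this shape addresses invalidation across multiple machine instances.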

Again, I'm not saying this is anything particularly fancy, but it did the general concept of what I wanted. Also, this is all iterative: when it creates something, I talk to it like a person to say "hey, I actually want to load in all the data, even though we will only be writing what changed," and all that kind of stuff.

Also, the bigger help wasn't really the creation of the cache; it was helping to make the changes and detect what needed to be modified.

At the end of the day, even if I want to go a slightly different route with how it did the caching, it creates all the framework, so I can simplify if needed.

A lot of times, for me, using this LLM approach is about getting all the boilerplate out of the way. Sometimes just starting the process of something by yourself is daunting; I find this to be a great way to begin.