Back a few agencies ago, I was tasked with running the data update process for a multi-billion-dollar company. It went something like this:
(1) they gave us an Excel file
(2) we saved the Excel as a CSV
(3) we uploaded both the Excel and CSV to a web server
(4) we used a PHP script on their server to process the CSV
(5) we exported the results of the script from their server
(6) we took the resulting CSV from the script and used an Excel macro to format it
(7) we uploaded the final formatted Excel sheet, which provided the data for the web app
(8) we manually updated rows in the database via phpMyAdmin to point the web app to the new files
Essentially, the whole thing existed to correctly format the CSV and XLSX files that salespeople used to search products in the company's inventory.
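For flavor, the server-side processing in step (4) was conceptually something like the sketch below -- except in PHP, and with far more columns and edge cases. Everything here (column names, cleanup rules) is invented for illustration; I no longer have the actual script.

```python
import csv

# Hypothetical sketch only: column names and cleanup rules are invented,
# and it's Python rather than the original PHP.
EXPECTED_COLUMNS = ["sku", "description", "category", "price", "stock"]

def clean_inventory_csv(src_path, dst_path):
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=EXPECTED_COLUMNS)
        writer.writeheader()
        for row in reader:
            # Keep only the expected columns and strip stray whitespace so
            # the downstream Excel macro sees a predictable layout.
            writer.writerow({col: (row.get(col) or "").strip()
                             for col in EXPECTED_COLUMNS})

if __name__ == "__main__":
    clean_inventory_csv("inventory_export.csv", "inventory_clean.csv")
```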
That list is simplifying things... slightly. Reviewing my personal notes from the time, the process was actually twenty-seven (27) steps from start to finish. Although, to be fair -- step 27 was "Take a nap. You earned it." Which was quite accurate.
I don't have a copy of any of the data anymore (probably a good thing), so I'm not sure exactly how many rows the Excel sheet typically had, but I imagine it wasn't a small number. During the Excel processing in step (6), it was typical for Excel to completely freeze while the macro ran. It was also a heavily manual process, with human intervention at many steps along the way -- I believe it took around three hours, start to finish.
I believe the original intent was for the whole process to run in one place, i.e., on the server. But at some point, "something broke" or the requirements changed (perhaps the added need for an Excel sheet in addition to the CSV).
Surprisingly, I don't think it broke all that often? I only really recall one major incident. At some point, the client edited the Excel file given to us in step (1) to remove two columns. That caused quite a scramble, as the scripts and macros were all set up to expect a certain number of columns and certain types of data. The fix was just... creating two blank columns in the sheet. That was deemed much easier than trying to update the script or the macro. If I remember right, too -- the script wasn't in version control and just sort of lived out on the live server where it processed the files, which is only mildly terrifying.
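In hindsight, even a trivial guard like the one sketched below (hypothetical column count, and again Python rather than the original PHP) would have made that failure loud and obvious instead of letting the macro chew on a shifted layout.

```python
import csv

EXPECTED_COLUMN_COUNT = 12  # invented number; the real script assumed a fixed layout

def check_header(path):
    # Fail fast if the client's export doesn't have the expected number of
    # columns, rather than letting later steps silently misread the data.
    with open(path, newline="", encoding="utf-8") as f:
        header = next(csv.reader(f))
    if len(header) != EXPECTED_COLUMN_COUNT:
        raise ValueError(
            f"Expected {EXPECTED_COLUMN_COUNT} columns, got {len(header)}: {header}"
        )
```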
By the time I inherited this project, the original developer had passed on (as in, no longer amongst the living -- not just in a new position). I'm not sure how many hands it passed through, but at some point, the process became mine to own.
This was around 2016, and the update system was designed well before that (early 2010s?). So, this was long before all of the fancy data processing "things" were in existence.
Nothing super interesting, just very weird that a process like this was supporting a huge multinational company -- and also cool to think that, these days, this process would look entirely different and (most likely) be entirely automated. Sort of a look-back.