• Sir Arthur V Quackington

    For Japanese companies, this isn’t a wish, it’s a fundamental truth of the universe. Like gravity. It holds no matter their scale or importance. I promise you your car exists because of an Excel 2003 file on some underpaid engineer’s laptop that they periodically sync with an inventory system.

  • @ZC3rr0r@lemmy.ca

    Microsoft spent years and years trying to get people not to use Excel as a database, until they eventually gave up hope that anyone who doesn’t know the difference would voluntarily use Access. So they started adding database-like functionality to Excel to meet their customers’ demands and make the experience at least a little less painful.

    This is a real-life case of “meet the user where they are” despite the designer’s wishes, because even within Microsoft, there is strong agreement on not using Excel as a DB.

      • @ZC3rr0r@lemmy.ca

        Well, to be fair to Access, it’s not like Excel is such a great multi-user database either, now is it? ;-)

        • @supercriticalcheese@feddit.it

          Well, Excel nowadays doesn’t have issues with concurrent users if you have Office 365, like many companies do.

          At the time it was Access with the files located on a company shared drive; the issue was concurrent writes, I believe.

          • @ZC3rr0r@lemmy.ca

            Yes, but at the time Excel didn’t support concurrency either ;-)

            Anyway, you are correct about the issue with concurrent writes, but that’s only because Access was intended as a single-user DB. If you wanted a multi-user DB, you were supposed to get MS SQL Server.

            Not saying this product strategy worked (it clearly didn’t, otherwise people would not be using Excel), but that’s how they envisioned it working.

    • @_number8_@lemmy.world

      i…isn’t that the entire point of excel? what is it for if not to store data?

      similarly i remember a reddit comment that broke my brain, saying no one should be using excel, they should be using a ‘cell matrix organizer’ or similar. we all can name 5 off the top of our heads

      • @frezik@midwest.social

        Excel has a purpose, but storing data long term isn’t it. It’s for calculating data. It shouldn’t be the single source of truth.

        One of the things Microsoft did to make it work was extending the row limit from 65k to 1M. Apparently, Economics professors were very excited about that one, which explains a lot.

      • @ZC3rr0r@lemmy.ca

        Storing data is only one part of the formula for what makes a database. Proper databases require structured storage of the data and some way to query it constructively. Excel did not have those features until Microsoft gave up trying to convince people not to use it as a DB and added them.
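
        For contrast, this is roughly what “structured storage plus a way to query it” looks like in its smallest form, using SQLite from Python’s standard library. The table and column names here are invented for the example.

        ```python
        import sqlite3

        # A tiny "proper database": typed columns, a primary key, and a query language.
        con = sqlite3.connect("parts.db")
        con.execute("CREATE TABLE IF NOT EXISTS parts (sku TEXT PRIMARY KEY, qty INTEGER)")
        con.execute("INSERT OR REPLACE INTO parts VALUES (?, ?)", ("A-1001", 42))
        con.commit()

        # Constructive querying, instead of scrolling and filtering a sheet by eye.
        low_stock = con.execute("SELECT sku, qty FROM parts WHERE qty < 50").fetchall()
        print(low_stock)
        con.close()
        ```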

  • TacoButtPlug

    This is basically what I run for a living and it’s definitely not glamorous.

    • @eslaf@lemmy.world

      Employers get what they demand, and what they deserve. Anyway, Excel works as a database until around 1 million entries…

      • @xpinchx@lemmy.world

        Once you get to a million, just start a new one and create a “master” spreadsheet that uses Power Query to append them all (roughly the idea sketched below). Problem solved ;)

        Don’t tell anyone but I actually do this.
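
        For anyone curious what that looks like outside of Excel, here’s a hedged Python/pandas sketch of the same “append a folder of part files” idea (inside Excel itself it’s Power Query’s From Folder + Append). The file names and paths are made up, and reading .xlsx with pandas assumes openpyxl is installed.

        ```python
        from pathlib import Path

        import pandas as pd

        # Hypothetical split files: data_part_001.xlsx, data_part_002.xlsx, ...
        parts = sorted(Path("exports").glob("data_part_*.xlsx"))

        # Append them all into one frame, like the "master" sheet does with Power Query.
        combined = pd.concat((pd.read_excel(p) for p in parts), ignore_index=True)

        # Write the result somewhere that doesn't have a 1M-row ceiling.
        combined.to_csv("combined.csv", index=False)
        ```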

    • @Confused_Emus@lemmy.world

      I work as a network tech for a globally spanning ISP specializing in fiber services, handling major maintenances that are service-affecting for business and government customers (SLAs are in effect). These maintenances are planned and tracked through various Excel sheets, housed either on a shared network drive (so yeah, we may run into issues where multiple people are trying to edit the same doc at once) or as Excel tables in SharePoint.

      Prior to the merger of companies I recently went through, we had actual database systems to track this stuff that worked just fine. And now we’re relying on the same shit a grad student would use to track their doctorate progress. It’ll work until it doesn’t. Looking forward to the shit-show if it gets me overtime.

  • Max_Power

    My 5th rule would be “no ‘fix my IT problem without me telling you what the error message says’”. Because fuck that

    • @SpaceCowboy@lemmy.ca

      I mean it’s a simple file format so it’ll perform better because it doesn’t have to decode any complex formats or protocols.

      Big O? Never heard of it!

  • @AnUnusualRelic@lemmy.world

    “I have to make a brochure for the printing shop and I’d like to compose it in Excel”

    “There are actually five rules…”

    “In Powerpoint?”

    “Make that six.”

  • asudox

    Kind of related question: Is it okay for me to use JSON as a small DB? I just store basic blog page data there.

    • @nierot@lemm.ee

      I mean, it will work, but for a blog I’d store the pages in markdown files to make them easier to edit. For context, look into how Hugo works.
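
      A minimal sketch of that approach (not how Hugo itself does it): posts live as .md files in a folder, with an optional front matter block at the top. The directory layout and front matter keys are just assumptions for the example.

      ```python
      from pathlib import Path

      def load_posts(content_dir: str = "content/posts") -> list[dict]:
          """Read every markdown file in the folder; the files are the 'database'."""
          posts = []
          for path in sorted(Path(content_dir).glob("*.md")):
              text = path.read_text(encoding="utf-8")
              meta, body = {}, text
              if text.startswith("---"):
                  # Naive front matter parsing: "key: value" lines between --- fences.
                  _, header, body = text.split("---", 2)
                  for line in header.strip().splitlines():
                      key, _, value = line.partition(":")
                      meta[key.strip()] = value.strip()
              posts.append({"slug": path.stem, "meta": meta, "body": body.strip()})
          return posts
      ```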

      • asudox

        I thought of that as well. I might switch to that. It will make the organization better anyways.

    • @slacktoid@lemmy.ml

      TinyDB literally does this. In general it’s more a question of “does this work for my use case, and am I aware of its limitations?”
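
      For example, a minimal TinyDB sketch: the whole “database” is one JSON file on disk. The field names below are invented for the example.

      ```python
      from tinydb import TinyDB, Query

      # The entire database is the file blog.json.
      db = TinyDB("blog.json")
      db.insert({"slug": "hello-world", "title": "Hello, world", "published": True})

      # Query it instead of re-parsing the JSON by hand.
      Post = Query()
      published = db.search(Post.published == True)  # TinyDB's query syntax
      print(published)
      ```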

    • @kono_throwaway_da@sh.itjust.works

      A few circumstances to consider…

      If it’s just your own little tool and you don’t intend to share it with others: do whatever you want. SQL or NoSQL or JSON, it doesn’t matter. Use your own judgement.

      In my experience though, most homegrown JSON-based “databases” tend to load all data into memory, simply because they are very simplistic (serialize everything into JSON and write to disk, deserialize everything back into a struct). If your dataset is too big for that, just go straight for a full-fledged database.
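
      Roughly the pattern being described, as a sketch; the file name and record shape are made up. Note that every load and save moves the whole dataset through memory, which is exactly the limitation above.

      ```python
      import json
      from pathlib import Path

      DB_PATH = Path("blog.json")  # hypothetical location of the whole "database"

      def load_db() -> dict:
          # The entire dataset is read into memory on every load...
          if DB_PATH.exists():
              return json.loads(DB_PATH.read_text(encoding="utf-8"))
          return {"posts": []}

      def save_db(db: dict) -> None:
          # ...and written back out wholesale on every save.
          DB_PATH.write_text(json.dumps(db, indent=2), encoding="utf-8")

      db = load_db()
      db["posts"].append({"slug": "hello", "title": "Hello"})
      save_db(db)
      ```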

    • Drew

      Yep, though I/O might bottleneck you at some point, and then you can happily switch to MongoDB.

    • @iwasgodonce@lemmy.world

      One of the partners we have to integrate with at work sends us reports in MS Access format. It’s not fun, especially when everything is running in Lambda and there don’t seem to be any good libraries for reading MS Access files that will easily run in Lambda.

    • @Anamnesis@lemmy.world

      In my experience it doesn’t work well when you have more than a couple of people editing the file. My company had a group of ten modifying the same file in real time; it led to huge desync problems.

      • @xpinchx@lemmy.world

        I live in Excel hell and even that made me shudder. Just work on separate files and have a master spreadsheet append everything with Power Query.

        I made a similar reply higher up, and I fucking hate that that’s a solution, but it legitimately would work in this use case. I frequently deal with 1M+ row data sets, and our API can only export like 20k rows at a time, so I have a script make the pulls into a folder and I just use PQ to append the whole fucking folder into one data set. You don’t even have to load the table at that point; you can pull as-is from the data model into BI or make a pivot or whatever else you’re trying to do with that much data.
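
        A hedged sketch of that “script pulls the API into a folder, then append the whole folder” workflow. The endpoint, the 20k page size, and the field layout are placeholders, and in my actual setup the append happens in Power Query rather than pandas.

        ```python
        from pathlib import Path

        import pandas as pd
        import requests

        OUT = Path("pulls")
        OUT.mkdir(exist_ok=True)

        PAGE_SIZE = 20_000  # the API's per-export ceiling
        page = 0
        while True:
            resp = requests.get(
                "https://example.invalid/api/export",  # hypothetical endpoint
                params={"offset": page * PAGE_SIZE, "limit": PAGE_SIZE},
                timeout=60,
            )
            resp.raise_for_status()
            rows = resp.json()
            if not rows:
                break
            pd.DataFrame(rows).to_csv(OUT / f"pull_{page:04d}.csv", index=False)
            page += 1

        # The "append the whole folder" step (Power Query's From Folder, roughly):
        combined = pd.concat(
            (pd.read_csv(p) for p in sorted(OUT.glob("pull_*.csv"))),
            ignore_index=True,
        )
        ```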

        • Parent company doesn’t want ANYONE to have direct read access to the database - only the scant few heavily formatted reports the user-facing software will allow. Data analysis still needs to get done though, so…

          Yeah. PQ -> Data Model saves my ass and my co-workers think I’m a wizard.

          That, and learning how to quietly exploit minor vulnerabilities in the software to get raw tables I “shouldn’t” have and telling not one soul has been a winning combo!

  • @betamark@lemmy.world

    I imagine an alternate 4th panel wherein the genie says “OK, you can bring back dead people.” What do y’all think? Also, I bet we could come up with a themed genie or setting that would punch up the joke too. ♡♡ Love it BTW, OP.