• 33 Posts
  • 195 Comments
Joined 2 years ago
Cake day: July 1st, 2023

  • Some of these employers think they're hot shit and gaslight applicants into thinking this is an appropriate process everyone should go through if they want to make 'the big bucks'. "Oh look at me, I'm an industry-giant megacorporation/private gov-contracted defense sector/research academia/Silicon Valley tech! I can have my pick of the litter, just gotta make HR come up with a big enough maze for the rats to run through to weed out the weakest ones!"

    No you idiots, the job market is in the hands of the worker selling their labor. The interviewee is the one interviewing the companies; the interviewee should be the one making companies sweat with fear of passing on them, or else your skills aren't all that unique as bargaining chips. If I have to spend more than five minutes applying through your dogshit website, or if you even think about wasting my time with long interview stages plus handwritten testing like I'm back in school, then you can suck on my nuts and find someone dumb enough to bite on the bait of a big paycheck. You think your little monkey-money bonus, government contracts requiring clearances, and a recognizable company name are enough for me to degrade myself any more than usual with no guarantee of actually being hired? Nah.



  • No worries, I have my achthually… moments as well. Though here's a counter-perspective: the bytes have to be pulled out of abstraction space and actually mapped onto a physical medium capable of storing huge numbers of informational states, like a hard drive. It takes genius STEM-engineer-level human cognition and lots of compute power to create a dataset like WA. This makes the physical devices housing the database unique, almost one-of-a-kind objects with immense potential value for business and consumer use. How much would a wealthy competing business owner pay for a drive containing such lucrative trade secrets, assuming it's not leaked? Probably more than a comparably weighted brick of gold, but that's just fun speculation.


  • SmokeyDope@lemmy.world to cats@lemmy.world · Useful guide

    You shouldn't be sorry, you didn't do anything wrong content-wise. If anything, you helped the community by sparking an important conversation leading to better-defined guidelines, which I imagine will be updated if this becomes a common enough issue.



  • SmokeyDope@lemmy.world to memes@lemmy.world · If it works, it works.

    Thank you for the explanation! I never really watched the Olympics enough to see them firing guns. I would think all that high-tech equipment counts as performance-enhancing stuff, which goes against the spirit of peak human skill, but maybe the sports people who actually watch and run the Olympics think differently about external augmentations in some cases.

    It's really funny with the context of some dude just chilling and vibing while casually firing off world-record-level shots.






  • Models running on GGUF should all work with your GPU, assuming it's set up correctly and the model is properly loaded into VRAM. It shouldn't matter if it's Qwen or Mistral or Gemma or Llama or LLaVA or Stable Diffusion. Maybe the engine you are using isn't properly configured to use your Arc card, so it's all just running on your regular RAM, which limits things? Idk.

    An Intel Arc GPU might work with kobold and Vulkan without any extra technical setup. It's not as deep in the rabbit hole as you may think; a lot of work was put into making one-click executables with nice GUIs that the average person can work with…

    Models

    Find a bartowski-made quantized GGUF of the model you want to use. Q4_K_M is the recommended average quant to try first. Try to make sure it can all fit within your card size-wise, for speed; shouldn't be a big problem for you with 20 GB of VRAM to play with. Hugging Face gives the size in GB next to each quant.
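    If you'd rather script the download than click through the website, something like this works (a minimal sketch; the repo and file names below are made-up placeholders, so substitute whatever bartowski repo and Q4_K_M file you actually find on Hugging Face):

    ```python
    from huggingface_hub import hf_hub_download

    # Pull a single quantized GGUF file from a (hypothetical) bartowski repo.
    # Swap in the real repo_id and filename from the model page; the listing
    # there also shows each quant's size in GB so you can check it fits in VRAM.
    model_path = hf_hub_download(
        repo_id="bartowski/SomeModel-8B-GGUF",   # placeholder repo name
        filename="SomeModel-8B-Q4_K_M.gguf",     # placeholder Q4_K_M quant file
    )
    print("Saved to:", model_path)
    ```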

    Start small with, like, a high quant of Qwen 3 8B. Then a Gemma 12B, then work your way up to a medium quant of DeepHermes 24B.

    Thinking models are better at math and logical problem solving, but you need to know how to communicate and work with LLMs to get good results no matter what. Ask one to break down a problem you already solved and test it for comprehension.

    kobold engine

    Download kobold.cpp, execute it like a regular program, and adjust settings in the graphical interface that pops up. Or make a startup script with flags (see the sketch below).

    For the input processing library, see if Vulkan processing works with Intel Arc. Make sure flash attention is enabled too. Offload all layers of the model; I make note of exactly how many layers each model has during startup and specify it, but it should figure it out smartly even if you don't.
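    Roughly what a startup script could look like, sketched in Python (a minimal sketch, not my exact script; the binary path, model file, and layer count are placeholders, and flag names can shift between koboldcpp versions, so check --help on yours):

    ```python
    import subprocess

    # Launch koboldcpp with the Vulkan backend, flash attention, and every
    # layer offloaded to the GPU. Paths and the layer count are placeholders;
    # verify the exact flag names against `koboldcpp --help` for your version.
    cmd = [
        "./koboldcpp",                          # path to the koboldcpp executable
        "--model", "SomeModel-8B-Q4_K_M.gguf",  # the quantized GGUF you downloaded
        "--usevulkan",                          # Vulkan backend, what an Arc card would use
        "--flashattention",                     # enable flash attention
        "--gpulayers", "99",                    # oversized value = offload all layers
        "--contextsize", "8192",                # context window to allocate
        "--port", "5001",                       # local port for the web UI / API
    ]
    subprocess.run(cmd, check=True)
    ```

    Once it launches you should get the usual kobold web UI (and its local API) on that port, same as when you start it through the GUI.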


  • As an off-grid person with an actual electrical engineering degree who built my system from the ground up, visiting the DIY solar forums is a trip.

    "I think off-gridders belong in the same crazy camp as healing-crystal chicks and antivaxxers."

    It's funny, I feel the same way about suburbanites and generally neurotypicals who speedrun into college debt right out of high school for a career path that became oversaturated with competition a decade before they applied. Then they legally bind themselves to the first fuck buddy to provide emotional support/external validation, pop out two kids, further indebt themselves with unending rent/mortgage payments, and use the financial + parental responsibility as an excuse to work a 9-5 for the rest of their lives. I can't imagine having a life enslaved to work with so little to look forward to besides vacations twice a year, watching TV, mowing the grass, bitching about the HOA, and buying another car/empty status symbol. All before the age of 25.

    It takes a special kind of crazy or stupid to blindly follow the societal status quo of wanting the slop of comfort, convenience, and status. So easily convinced into racking up lifelong debt equating to indentured servitude while giving in to your hormonal monkey instincts for creating social-bonding family structures in this political/economic climate.



  • Any device someone asks my help with figuring out. It's rarely the appliance that pisses me off and more the blatant learned helplessness and fundamental inability of fellow adults to rub two braincells together to figure out a new thing or troubleshoot a simple problem. A lifetime of being the techie fixer bitch slave, constantly delegated the responsibility of figuring out everyone's crap for them, has left me jaded about the average person's mental capacity and basic logical-application abilities.


  • For all the verbal fellatio Office Space receives, I was expecting it to be a god-like, ultimate-peak-of-human-culture type deal, but in reality it was a mid movie humor- and plot-wise. It's not bad, but it's very catered to a specific audience I wasn't part of. I can see it being one of the first and few relatable films for white-collar cubicle boglins at the turn of the century, which feels like pretty much the sole reason I have to see it occasionally referenced 25 years later.


  • SmokeyDope@lemmy.world to LocalLLaMA@sh.itjust.works · Specialize LLM

    I would recommend you read over the work of the person who finetuned a Mistral model on many US Army field guides, to understand what finetuning on a lot of books to bake in knowledge looks like.

    If you are a newbie just learning how this technology works, I would suggest trying to get RAG working with a small model and one or two books converted to a big text file, just to see how it works, because it's cheap/free to just do some tool calling and fill up a model's context.

    Once you have a little more experience, and if you are financially well off to the point that 1-2 thousand dollars to train a model is who-cares play money to you, then go for finetuning.


  • It is indeed possible! The nerd speak for what you want to do is 'finetune training with a dataset', the dataset being your books. It's a non-trivial task that takes setup and money to pay a training provider to use their compute. There are no guarantees it will come out the way you want on the first bake, either.

    A soft version of this that's the big talk right now is RAG, which is essentially a way for your LLM to call and reference an external dataset to recall information into its active context. It's a useful tool worth looking into, much easier and cheaper than model training, but while your model can recall information with RAG, it won't really be able to build an internal understanding of that information within its abstraction space. Think being able to recall a piece of information vs internally understanding the concepts it's trying to convey. RAG is for rote memorization; training is for deeper abstraction-space mapping.
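    To make the difference concrete, here's roughly what the RAG loop looks like (a toy sketch, not anyone's production setup: embed() and llm() are crude stand-ins for a real embedding model and a call to your local model's API):

    ```python
    import numpy as np

    def embed(text: str) -> np.ndarray:
        # Stand-in embedding: a character-frequency vector. Swap in a real
        # embedding model for retrieval that actually captures meaning.
        v = np.zeros(256)
        for b in text.encode("utf-8", errors="ignore"):
            v[b] += 1.0
        return v

    def llm(prompt: str) -> str:
        # Stand-in for a call to your local model; here it just echoes.
        return "(model response to)\n" + prompt[:200]

    def chunk(book: str, size: int = 1000) -> list[str]:
        # Naive fixed-size chunking of the book text.
        return [book[i:i + size] for i in range(0, len(book), size)]

    def rag_answer(book: str, question: str, k: int = 3) -> str:
        chunks = chunk(book)
        vecs = np.stack([embed(c) for c in chunks])
        q = embed(question)
        # Cosine similarity between the question and every chunk.
        sims = vecs @ q / (np.linalg.norm(vecs, axis=1) * np.linalg.norm(q) + 1e-8)
        top = [chunks[i] for i in np.argsort(sims)[-k:][::-1]]
        # The retrieved text only gets pasted into the prompt; the model's
        # weights never change, which is the recall-vs-understanding gap above.
        prompt = "Context:\n" + "\n---\n".join(top) + f"\n\nQuestion: {question}\nAnswer:"
        return llm(prompt)
    ```

    Finetuning, by contrast, actually updates the weights on your books, which is where both the cost and the deeper 'understanding' come from.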



  • There are some pretty close physical analogs that are fun to think about. You can't move a black hole by exerting physical force on it in the normal way, so a practically infinite gravity well is like an immovable "object". Though if you're sufficiently nerdy, you can cook up some fun ways to harness its rotation into a kind of engine, or throw another black hole at it to create a big explosion and some gravitational waves, which are like a kind of unstoppable force moving at the speed of light.