I’m not sure if this is an Oscar reference or a Zizek reference and either way I’m here for it.
Is this Ten Forward? I see the last post was 5 months ago…
I understand that, but none of the replacements came close to matching that level of content and quality. Risa used to always be popping up in my recent feed, and I’ve not found any of the replacements (I subscribed to them all) to be anywhere near that. Not saying the action wasn’t warranted (it probably was), but it’s still sad.
I don’t really know what drama caused it, but when he stopped posting on c/risa@startrek.website it was a sad, sad day.
https://arewereorganizedyet.com/ lol already updated
Major Kovalyov: “Am I a joke to you?”
Is that a Mexican poncho or is that a Sears poncho?
GPT-4 via ChatGPT
And now I’m wondering if the -ihkal suffix is known outside the Shulgin fanbase…
Toilet paper I have known and loved?
It’s the part of Ruby that replaced Perl. For whatever eldritch horror Perl was, it was very, very good at text manipulation, and IME the only language to really match that experience was Ruby.
On Firefox mobile, when I click into a post now, the comment dialog has like 10 characters of width max.
If I scroll down at all in a nested chain it quickly goes to 1 character lol
If I refresh, it somewhat goes back to normal.
I don’t get the downvotes. I’ve hired probably 30+ engineers over the last 5 or so years, and have been writing code professionally for over 20, and I fully agree with your sentiment.
Can somebody please swap in a Civ V image? I really need this, ty.
https://github.com/Mozilla-Ocho/Memory-Cache is the actual project if you want to use it.
Basically, it’s a Firefox extension that saves a page as a PDF into a directory symlinked to your local PrivateGPT install, which then ingests the docs. It doesn’t seem to provide any in-browser querying of PrivateGPT, but I haven’t tried setting it up to confirm that.
I think that is overly simplistic. Embeddings used for LLMs definitely do encode what things mean and how things relate to other things.
E.g., compare the embeddings of Paris, Athens, and London to other cities and they will have a small cosine distance between them. Compare France, Greece, and England and same. Then, very interestingly, look at Paris - France, Athens - Greece, London - England and you’ll find the resulting vectors all align (fundamentally, the vector difference seems to capture the relationship “is the capital of”). Then go a step further and compare those vectors to Paris - US, Athens - US, London - Canada. You’ll see the previous set are not aligned with these nearly as much, but these are aligned with each other (the relationship being something like “is a smaller city in this country, named after a famous city in some other country”).
The way attention works, there is a whole bunch of semantic meaning baked into the embeddings, and by comparing embeddings you can get at pragmatic meaning as well.
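If you want to poke at this yourself, the classic way to see it is with static word vectors (word2vec/GloVe) rather than a full LLM, but it’s the same geometric idea. Something like this, assuming gensim is installed and can fetch the pretrained glove-wiki-gigaword-100 vectors (the model and word choices here are just for illustration):

```python
# Minimal sketch: the capital/country offset demo with pretrained GloVe
# vectors via gensim. Assumes gensim is installed; the vectors are
# downloaded on first use.
import numpy as np
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")  # KeyedVectors, lowercase vocab

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Capitals sit close to each other, and so do their countries.
print(vectors.similarity("paris", "athens"))    # relatively high
print(vectors.similarity("france", "greece"))   # relatively high

# The offsets capital - country all point roughly the same way, i.e. the
# difference vector encodes something like "is the capital of".
paris_france   = vectors["paris"]  - vectors["france"]
athens_greece  = vectors["athens"] - vectors["greece"]
london_england = vectors["london"] - vectors["england"]
print(cosine(paris_france, athens_greece))
print(cosine(paris_france, london_england))

# The classic analogy query: paris - france + greece should land near athens.
print(vectors.most_similar(positive=["paris", "greece"],
                           negative=["france"], topn=3))
```

The exact numbers depend on the model, but you should see the capital - country offsets pointing in roughly the same direction as each other, much more so than toward unrelated difference vectors.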
Where is this? California has strict regulations about actual beach access. So, e.g., Pebble Beach is in one of the most beautiful locations in all of Northern California, ridiculously expensive and nearly impossible to play as a mortal, but you can still drive around 17-Mile Drive through the course and walk along the coastal trails for free.
Colon, Michigan: “Am I a joke to you?”