• 1 Post
  • 499 Comments
Joined 1 year ago
Cake day: June 22nd, 2023


  • Just for fun: this would have worked so much better if they had dropped the PS5's price and introduced the PS5 Pro at the old price.

    People are anchored to a certain perceived value for the PS5. Done that way, both the PS5 Pro and the PS5 would instantly look like bargains: much of the PS5-owning public would have bought another system because it's "such a good deal," and PS5 fence-sitters would jump at the core system. I'm not trained to say for sure, but I think that even with a lower profit margin they'd be making much more money.

  • It’s another example, seen in many other groups as well, of a base authoritarian or in-group/out-group mindset superseding all other principles and imperatives.

    For evangelicals, the desire for this authoritarian leader supersedes any imperative to act in a moral or biblically-sanctioned way. For conservatives, the desire supersedes their ideological imperatives of supporting law enforcement and being tough on crime. And for this police organization, that desire supersedes both their professional identities and their loyalty to their own officers, who were directly attacked by Trump’s people.

    It’s morbidly fascinating. Yes, they have “right-wing” in common, but there is a unique betrayal of core principles happening for each of them. There has to be some common psychological need that Trump supplies to all of these different groups.


  • But the researchers then dive head-first into wild claims:

    “GameNGen answers one of the important questions on the road towards a new paradigm for game engines, one where games are automatically generated, similarly to how images and videos are generated by neural models in recent years.”

    To which the obvious reply is: no it doesn’t, where did you get any of this? You’ve generated three seconds of fake gameplay video where your player shoots something and it shoots back. None of the mechanics of the game work. Nothing other than what’s on-screen can be known to the engine.

    Yeah, this was apparent immediately.

    Diffusion models are just matrices of positional and temporal probabilities. That makes them fundamentally incompatible with even the simplest demands of a game, since any player will reject a game that lacks reliable, input-deterministic outcomes. The only way to get that reliability is to create a huge amount of training data and spend exorbitant resources training on it, to the point of harshly overfitting the model to the data, all of which requires that the team first make the very game they're emulating before training starts. It's nonsense.
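    A minimal sketch of that determinism point (toy functions with made-up rules, nothing to do with the actual GameNGen model): a deterministic engine update always maps the same state and input to the same next state, while a sampled update draws the next state from a distribution, which is effectively what a diffusion sampler does.

    ```python
    import random

    def engine_update(x: int, jump_pressed: bool) -> int:
        # Deterministic rule: identical input always yields the same next state.
        return x + (2 if jump_pressed else 1)

    def sampled_update(x: int, rng: random.Random) -> int:
        # Stochastic rule: the next state is drawn from a distribution
        # over outcomes rather than computed from the input.
        return x + rng.choice([1, 2])

    # The deterministic engine always agrees with itself on identical inputs...
    assert engine_update(0, True) == engine_update(0, True)

    # ...while repeated sampled updates of the exact same state diverge.
    rng = random.Random(0)
    outcomes = {sampled_update(0, rng) for _ in range(1000)}
    print(sorted(outcomes))
    ```

    A player pressing jump in the second system gets one of several next states, which is exactly the unreliability described above.
    
    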

    If someone is going to use AI to make a game, they'd get exponentially higher ROI using AI to generate code that encodes the relationships in the data once, versus inferring the raw value of every individual pixel.

    The demo was always effectively a clickbait novelty for likes.

  • I’m all for people trying, so agree with this.

    But my experience: I haven't had a good night's sleep in ten years; I wake up constantly and spend most of the night half-asleep, half-awake. I went in for a sleep study, did the sleep-in-hospital-with-electrodes-everywhere thing, met with three doctors in series afterward, and their conclusion was that I should sleep more. I wish I were joking.

  • It’s hard to have a nuanced discussion because the article is so vague. It’s not clear what he’s specifically been charged with (apparently “obscenity,” rather than a specific child-abuse statute?), which matters because simulated-CSAM laws have, to my knowledge, all been struck down when challenged.

    I completely get the “lock them all up and throw away the key” visceral reaction - I feel that too, for sure - but this is a much more difficult question. There are porn actors over 18 who look younger; do these laws bar them from work that would be legal for actors who merely look older? And if an AI was trained exclusively on those over-18 people, would its outputs then not be CSAM even if the generated images look under 18?

    I’m at least all for a “fruit of the poisoned tree” theory: if an AI model’s training data includes actual CSAM, the model can and should be made illegal. Intentionally deepfaking real people under 18 is also not black and white (looking again to the harm factor), but I think it can be justifiably prohibited. And distribution of completely fake CSAM can arguably be outlawed (the situation here), since it will soon be impossible to tell AI imagery from real imagery, and allowing it would undermine enforcement of vital laws against the real thing.

    The real hard case is producing and retaining fully fake imagery, with no real CSAM in the training data, solely locally (a possession crime). That’s really tough, because not only does its creation not directly hurt anyone, there’s a possible benefit: it could diminish the market for real CSAM (potentially saving unrelated children from the abuse that demand drives), and could also divert the producer’s impulse away from preying on children around them.

    Could, because I don’t think there are studies that answer whether either of those is true.